
BIM/PLM/Asset Information Hub, digital support for industrial asset management


In industrial environments, data management is a major challenge: assets must be managed effectively and in an optimised way throughout their life cycle, from the start of the plant specification phase through to dismantling. The complexity of the systems operated within some infrastructures means that the volume of data accumulated over that life cycle is particularly large. At the end of a nuclear power plant’s design phase, the data sets can reach 25,000 documents, 35,000 requirements, 100,000 basic components and 1,500,000 basic technical data records. During the construction and commissioning phases, these figures are multiplied by ten.

However, one of the prerequisites is knowing how to create and organise all this data. This is particularly crucial given the variety of tools and methods used throughout the asset life cycle and the need for them to be mutually compatible and able to communicate with one another. The solutions involve system engineering and digital tools.

Document Centric and Data Centric: what is covered by these two principles?

Even today, documents in the form of hard copies, memos and photographs still account for a large part of the lifecycle management of complex plants and their related systems. This is known as a Document-Centric approach. It uses EDM (Electronic Document Management) tools to manage documents in configuration according to their type, in order to monitor and manage the changes made during each phase of a facility’s lifecycle: preliminary and detailed design, execution design, construction, testing, operation and, finally, decommissioning.

Data is created as soon as the specifications are defined, and the resulting data is all inter-related: the documents contain requirements, which are then broken down into basic components, which in turn carry technical specifications (pressure, flow, weight, etc.) and may be part of a critical chain. “This is where the Data-Centric principle becomes relevant. Its objective is to manage basic technical data in configuration. However, in view of the quantities involved, a degree of pragmatism is needed, and we will still be managing both documents and data for a long time.”

The objective is to create a data model that clearly defines which data will be generated, transferred and used, by which users and in which context.
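To make this more concrete, the following is a minimal sketch, in Python, of what such a data model can look like, following the chain described above: documents carry requirements, requirements are broken down onto basic components, and components carry basic technical data. All class names, identifiers and values are assumptions made for the example and are not taken from any particular tool.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch only: names and identifiers are invented, not taken from any specific tool.

@dataclass
class TechnicalData:
    name: str            # e.g. "design flow rate"
    value: float
    unit: str            # e.g. "m3/h"

@dataclass
class Component:
    tag: str                                               # basic component identifier
    technical_data: list[TechnicalData] = field(default_factory=list)
    critical_chain: Optional[str] = None                   # critical chain membership, if any

@dataclass
class Requirement:
    ref: str                                               # requirement identifier
    text: str
    source_document: str                                   # document the requirement was extracted from
    allocated_to: list[Component] = field(default_factory=list)

# A requirement extracted from a document and broken down onto a basic component
pump = Component("PMP-001", [TechnicalData("design flow rate", 120.0, "m3/h")])
req = Requirement("REQ-0042", "The pump shall deliver at least 120 m3/h.", "DOC-PROC-017", [pump])
print(req.allocated_to[0].technical_data[0])
```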

System engineering – an essential theoretical base…

Faced with the need for data consistency throughout the lifecycle of a plant, some industries have developed system engineering processes. Developed in the 1950s, this scientific approach, which aims to build the theoretical elements needed to understand and manage complex systems, was first implemented by the aerospace and defence industries before spreading to other sectors such as the naval and automotive industries. All of these sectors share a strict regulatory framework and combine high levels of technical understanding across multiple disciplines. System engineering focuses on several major fields, with the objective of providing answers to the problems identified on concrete projects (requirements management to ensure that the products designed are compliant, interface management to ensure successful integration of systems and sub-systems, etc.). The central idea is to break a product down into systems and sub-systems, monitoring the constraints, requirements and interfaces at each phase, so that complexity is kept at a level that can be managed safely.

…with very practical objectives and data models

However, the main objective of system engineering is to ensure a project’s safety and optimisation, i.e. to respond to the concrete use cases and problems that occur when managing an asset, whether in the design, construction, testing, operation, maintenance or dismantling phase. Within this context, this set of theoretical tools must be adapted, adjusted and modulated according to the specific outcomes expected for a real use case. For example, we could mention:

  • The management of a complex product configuration during its development, as is sometimes the case in the nuclear sector with its potentially long development times,
  • Optimisation of the testing phase, through an improved monitoring of requirements throughout the design and construction phases,
  • Quick and efficient access to information, through appropriate indexing and updated breakdown of the product documentation,
  • Better stock management and optimised maintenance, by ensuring that product configuration monitoring is aligned with an optimised organisation of the product-related data.

Depending on the expected use case, the main purpose will therefore be to align the appropriate theoretical bricks with the relevant data. These data models vary in type, origin and level of detail. They are developed for specific professional objectives and use various languages and modelling tools, depending on the professional sector.

Bridges between BIM and PLM are becoming crucial for the flawless management of complex industrial assets.

BIM and PLM, two digital methods that have long remained dissociated…

The development of 3D digital mock-ups to support asset design is one of the first steps that has already been achieved. It had become necessary, particularly in the infrastructure field, where a 3D view of assets before construction was essential. The mock-ups are broken down into objects, with which data (or attributes) can be associated. For example, a ‘pump’ object can carry technical performance features (pressure, flow, etc.) and physical features (dimensions, weight, etc.). “In the PBW sector, this is known as BIM (Building Information Modelling), and it is the same concept.” BIM therefore means adding a data model to the 3D mock-up of the related asset, which increases the amount of information carried and gives the model its ‘multi-sector’ character.
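As an illustration of this ‘object plus attributes’ idea, the sketch below shows a mock-up object enriched with performance and physical attribute groups. It assumes no specific BIM tool or exchange format; the identifiers, file name and values are invented for the example.

```python
# Purely illustrative: a mock-up object enriched with attribute groups, independent of any BIM tool or format.
mockup_objects = {
    "PMP-001": {
        "geometry_ref": "pump_body.step",   # link to the 3D representation (assumed file name)
        "performance": {"pressure": (16.0, "bar"), "flow": (120.0, "m3/h")},
        "physical": {"mass": (450.0, "kg"), "height": (1.2, "m")},
    },
}

def attribute(obj_id: str, group: str, name: str):
    """Return the (value, unit) pair of one attribute attached to a mock-up object."""
    return mockup_objects[obj_id][group][name]

print(attribute("PMP-001", "performance", "flow"))   # (120.0, 'm3/h')
```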

In parallel, the PLM (Product Lifecycle Management) approach enables the management of all the information related to a product throughout its entire life cycle, bringing together concepts, methods and collaborative software tools. It became necessary in the systems design field (aircraft, automotive, naval, defence), where the volume and diversity of the information to be attached to a component or a system made it impossible to manage without digital assistance. For example, this could mean attaching to a pump the applicable design requirements, the configuration of the product in which this pump is used, or all the other systems that have an interface with it (piping, instrumentation and control, etc.). This approach also allows precise management of data validity, through validation circuits that can be configured and made more or less complex (known as workflows), and indexing through cross-referenced breakdown structures, sometimes in large numbers (known as PBS, Product Breakdown Structure; FBS, Functional Breakdown Structure; GBS, Geographical Breakdown Structure; etc.).
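The following sketch illustrates, in very simplified form, the cross-indexing described above: a single plant item referenced simultaneously in a PBS, an FBS and a GBS, together with its applicable requirements and interfaces. The codes and structures are invented for the example and do not reflect any particular PLM system.

```python
# Invented codes: the PBS/FBS/GBS concept comes from the text, the structures below are made up.
breakdowns = {
    "PBS": {"PMP-001": "PBS/primary-circuit/pumps"},       # Product Breakdown Structure
    "FBS": {"PMP-001": "FBS/core-cooling-function"},       # Functional Breakdown Structure
    "GBS": {"PMP-001": "GBS/building-A/room-102"},         # Geographical Breakdown Structure
}
requirements = {"PMP-001": ["REQ-0042", "REQ-0108"]}       # applicable design requirements
interfaces = {"PMP-001": ["PIPE-017", "IC-CTRL-3"]}        # interfacing systems (piping, I&C, ...)

def item_card(tag: str) -> dict:
    """Gather everything indexed against one item across the breakdown structures."""
    return {
        "tag": tag,
        "breakdowns": {name: index.get(tag) for name, index in breakdowns.items()},
        "requirements": requirements.get(tag, []),
        "interfaces": interfaces.get(tag, []),
    }

print(item_card("PMP-001"))
```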

…and that it now appears essential to bring together

Convergence between these two types of tools would therefore appear to be:

  • Possible, due to increasing IT and computing capabilities, which enable the management of ever larger volumes of data with ever more complex representations,
  • Consistent, due to the compatibility between the use cases and the data managed by these tools, which also creates synergies and new uses based on data transfers,
  • Logical, particularly in the nuclear industry, where the infrastructure and system fields are closely correlated,
  • Necessary, due to the increasing complexity of the assets to be managed, the volume of data to be processed and the large number of professional sectors to be covered, which makes data distribution and sharing increasingly essential for the success of a project.

The development of bridges, interconnections and common use cases, making it possible to efficiently build a hub that handles the functions traditionally attached to the BIM world on the one hand and the PLM world on the other, therefore appears unavoidable, particularly in the context of the nuclear industry.

The Asset Information Hub

However, this BIM/PLM convergence approach should not stop there. It must include several other projects, tools and actions that share, with the convergence approach, their applicability to real situations, the support they provide for optimum asset management actions, and their reliance on IT and digital tools. This means building an asset management hub around the BIM/PLM convergence: an Asset Information Hub, which includes the bricks needed for the identified use cases and whose interactions are encoded in a purpose-designed data model. More precisely, the Asset Information Hub involves an approach comprising three phases:

  • Precise understanding of the requirements and the expected result in functional terms,
  • Translation of the requirements into data flows and consumption,
  • Construction of a digital architecture based on existing bricks, with the objective of materialising the data flows.

Within this context, the Asset Information Hub is naturally based on a set of existing or specifiable software and digital bricks, including BIM and PLM, hence the need for convergence expressed above. More specific engineering software, such as fluid or lighting modelling tools, project management tools, etc., can also be added to this inventory.

The Asset Information Hub is based on specifiable software and digital bricks including BIM and PLM.

In terms of implementation, there are many use cases: BIM/PLM interfacing, which enables the impact of a requirement or of a modification to be identified rapidly, or the bridge between BIM, PLM and CAMM (Computer Assisted Maintenance Management), so that configuration management and stock management can be integrated in graph form, as sketched below.
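As a rough illustration of this graph-based integration, the sketch below links a PLM requirement, a BIM object and CAMM records (a work order and a spare part) in a single directed graph, so that the impact of a modification can be traced across the three worlds. It assumes the networkx library is available; all identifiers are hypothetical.

```python
import networkx as nx  # assumed available; any graph library or graph database would do

# Nodes from the three worlds, linked in one graph (identifiers are illustrative).
G = nx.DiGraph()
G.add_node("REQ-0042", kind="requirement")      # PLM: design requirement
G.add_node("PMP-001", kind="component")         # BIM: mock-up object / plant item
G.add_node("WO-2318", kind="work_order")        # CAMM: maintenance work order
G.add_node("SPARE-77", kind="spare_part")       # CAMM: stock item

G.add_edge("REQ-0042", "PMP-001", relation="applies_to")
G.add_edge("PMP-001", "WO-2318", relation="maintained_by")
G.add_edge("PMP-001", "SPARE-77", relation="uses_spare")

def impact_of_change(node: str) -> set:
    """Everything downstream of a modified element, e.g. to assess the impact of a requirement change."""
    return nx.descendants(G, node)

print(impact_of_change("REQ-0042"))   # {'PMP-001', 'WO-2318', 'SPARE-77'}
```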

Artificial intelligence as a complement

This model- and data-driven system engineering approach provides a collaborative environment that brings together all the elements needed to secure the various project phases. It then becomes possible to exploit Artificial Intelligence (AI) technologies to improve the effectiveness of the requirements engineering phases, to develop digital twins at the system level, and to build optimisation applications for plant operation. “For example, data mining tools can be used for the requirements definition and breakdown phases, along with rules-engine-based tools to incorporate feedback from experience. Decision support tools can also be used for optimum management of the various phases and to ensure that schedules are safely maintained. Assystem has thus established a partnership with the Lyon-based company Cosmo Tech to deploy a new complex systems modelling and simulation approach.”

In fact, it is unthinkable that everything could, must or should be modelled and, in any case, who would accept or undertake such a task? The challenge here is to know how to build the data structures (ontological and semantic approaches), how to find and deliver data held in existing and remote databases, and how to analyse and align reference frameworks with sector needs, in order to optimise and secure the various phases, from design to operation.

Towards the Digital Twin

When the right tools and concepts have been defined, the Asset Information Hub can then act as a real digital twin of the related asset, based on the real asset data collected throughout its life cycle, incorporating its 3D representation and the IT bricks needed to simulate and/or predict its most relevant behaviour characteristics.

However, linking and building a hub that incorporates all or some of the dimensions listed above (depending on the project’s constraints) raises a major problem, inherently related to the data handled at each phase of the process. The objective throughout is to access the data contained in the documents, to link it both together and to the physical elements of the plant, and to avoid inconsistencies when, for example, the same data is stored in several locations.

The Asset Information Hub can act as a real digital twin of the related asset.

MBSE data models

The MBSE (Model Based System Engineering) approach provides a partial response to the data consistency problem, with the transition from document-based engineering to model-based engineering.

These models are intended to represent and formalise:

  • The plant or system of interest,
  • The operational environment in which the system of interest will be operating throughout its lifecycle,
  • The project system or ‘system to do’, which brings together models such as scheduling, resource profiles, management data and indicators, cost calculations, etc.

“These models are essential to achieve the most complete and credible verification and validation of the system of interest, even before it is developed and delivered to the customer. They also determine the ‘data models’ which will ensure the consistency and management of the data throughout the entire life cycle”.

It should be noted that the ‘digital mock-up’ is an essential element within the context of this MBSE approach. It facilitates and feeds the engineering processes with representations that can be confidently shared between all the people involved. PLM completes this approach by guaranteeing data uniqueness and configuration management.

“At Assystem, we use modelling languages, the Arcadia method and the Capella environment to apply this MBSE approach, which also provides the link with PLM hubs (such as Dassault Systèmes’ 3DExperience). To complement this, we are launching an industrial chair with IMT Mines Alès to study and try to resolve the conceptual (what? why?), methodological (how?) and technical (with what?) blockages encountered in effectively deploying the MBSE approach.”

The human at the centre of the use of these tools

These technologies must give engineers the time to do what they were trained to do: design, model and simulate systems. It is therefore very important to give meaning to the introduction of these technologies, by choosing practical cases that are quick to set up and deliver rapidly usable results.

Particular attention must also be paid to the HMI (Human-Machine Interface), as these tools are intended for all generations: those who saw the arrival of NICT (New Information and Communication Technologies) as well as digital natives.

The expert concludes: “The human must always remain at the centre of the use of these tools and of the implementation of the related processes. Change management is the key, and is certainly the most delicate factor to get right”.


About the expert

Eric DEVINGT

Vice President Nuclear Engineering & Assystem Connect

An engineer by training, Eric has extensive experience in complex systems, acquired as Technical Director or Project Director in the airport, defence and nuclear industries.


About the expert

Victor Richet

Technical Coordinator

An engineer in nuclear reactor physics by training, Victor has a strong background in the development and deployment of system engineering and digital tools.


