A nuclear reactor physics engineer by training, Victor has solid experience in the design and deployment of digital engineering tools.
In industrial environments, data management is a major challenge: assets must be managed effectively and in an optimised way throughout their life cycle, from the start of the plant specification phase through to dismantling. The complexity of the systems operated within some infrastructures means that the volume of data accumulated over their life cycle is particularly large.
In the case of a nuclear power plant, by the end of the design phase there may be as many as 25,000 documents, 35,000 requirements, 100,000 elementary components and 1,500,000 elementary technical data items. Some of these figures can be multiplied by 10 during the construction and commissioning phases.
However, one of the prerequisites is knowing how to create and organise all this data. This is particularly crucial given the variety of tools and methods used throughout the asset life cycle, and the need for each of them to be mutually compatible and able to communicate with one another. The solutions involve system engineering and digital tools.
Document-Centric and Data-Centric: what do these two principles cover?
Even today, documents in the form of hard copies, memos and photographs still play a large part in the lifecycle management of complex plants and their related systems. This is known as the Document-Centric approach. It uses EDM (Electronic Document Management) tools to manage documents in configuration according to their type, in order to monitor and manage the changes made during each phase of a facility’s lifecycle: preliminary and detailed design, execution design work, construction, testing, operation and, finally, decommissioning.
Data is created as soon as specifications are defined, and the resulting data is all inter-related: documents contain requirements, which are then broken down into elementary components, which in turn have technical characteristics (pressure, flow, weight, etc.) and may be part of a critical chain. “This is where the Data-Centric principle becomes relevant. Its objective is to manage elementary technical data in configuration. However, in view of the quantities involved, a degree of pragmatism is needed, and we will still manage both documents and data for a long time.”
The objective is to create a data model as early as possible in the project, clearly defining which data will be generated, transferred and used, by which users and in which context. Naturally, this data model will evolve to reflect changes in the life of the project and of the asset.
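To make the Data-Centric idea concrete, the chain described above (documents carrying requirements, requirements allocated to components, components carrying elementary technical data) can be sketched as a minimal data model. This is an illustrative simplification; all class names, tags and values are hypothetical, not taken from any real tool.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified data-centric model: documents carry
# requirements, requirements are allocated to components, and
# components carry elementary technical data items.

@dataclass
class TechnicalData:
    name: str       # e.g. "pressure"
    value: float
    unit: str       # e.g. "bar"

@dataclass
class Component:
    tag: str
    technical_data: list = field(default_factory=list)

@dataclass
class Requirement:
    req_id: str
    text: str
    allocated_to: list = field(default_factory=list)  # component tags

@dataclass
class Document:
    doc_id: str
    requirements: list = field(default_factory=list)

# Illustrative instances (all identifiers invented for the sketch).
pump = Component("PMP-001", [TechnicalData("pressure", 15.5, "bar"),
                             TechnicalData("flow", 120.0, "m3/h")])
req = Requirement("REQ-0042",
                  "Pump shall deliver 120 m3/h at 15.5 bar",
                  allocated_to=["PMP-001"])
doc = Document("DOC-SPEC-01", [req])

# Data-centric query: which components realise this requirement?
print(req.allocated_to)  # ['PMP-001']
```

The point of such a model is that each relation (document to requirement, requirement to component, component to attribute) is explicit data, not text buried in a document, so it can be queried and kept in configuration.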
System engineering – an essential theoretical base…
Faced with the need for data consistency throughout the lifecycle of a plant, some industries have developed a process of system engineering. Developed in the 1950s, this scientific approach, which aims to build the theoretical elements needed to understand and manage complex systems, was first implemented by the aerospace and defence industries, before spreading to other sectors such as the naval and automotive industries. All of these sectors share a strict regulatory framework combined with high levels of technical understanding across multiple disciplines. System engineering focuses on several major fields, with the objective of providing answers to the problems identified on concrete projects (management of requirements to ensure that the products designed are compliant, interface management to ensure successful integration of systems and sub-systems, etc.). The central idea is to break a product down into systems and sub-systems, monitoring the constraints, requirements and interfaces at each phase, until the complexity at each level can be safely managed.
…with very practical objectives and data models
However, the main objective of system engineering is to ensure a project’s safety and optimisation, i.e. to respond to the concrete use cases and problems that occur when managing an asset, whether in the design, construction, testing, operation, maintenance or dismantling phase. Within this context, this set of theoretical tools must be adapted, adjusted and modulated according to the specific outcomes expected for a real use case. Examples include:
- The management of a complex product configuration during its development, as is sometimes the case in the nuclear sector, with its potentially long development times,
- Optimisation of the testing phase, through an improved monitoring of requirements throughout the design and construction phases,
- Quick and efficient access to information, through appropriate indexing and updated breakdown of the product documentation,
- Better stock management and optimised maintenance, by ensuring that the product configuration monitoring is aligned with the optimised organisation of the product related data.
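The second use case above, optimising the testing phase through improved monitoring of requirements, can be sketched as a simple traceability table tracking each requirement's verification status per phase. The requirement IDs, phases and statuses below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch: track the verification status of each requirement
# through the lifecycle phases, so the testing phase can focus only on
# requirements that are not yet verified.

PHASES = ["design", "construction", "testing"]

traceability = {
    "REQ-0042": {"design": "verified", "construction": "verified",
                 "testing": "pending"},
    "REQ-0043": {"design": "verified", "construction": "pending",
                 "testing": "pending"},
}

def open_points(phase):
    """Requirements not yet verified at the given phase."""
    return sorted(req_id for req_id, status in traceability.items()
                  if status[phase] != "verified")

# Only REQ-0043 still has an open point at the construction phase.
print(open_points("construction"))  # ['REQ-0043']
```

In a real PLM environment this table would be generated from the requirement records themselves; the sketch only shows why continuous monitoring shrinks the scope of the testing phase.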
Depending on the expected use case, the main task will therefore be to align the appropriate theoretical bricks with the relevant data. These data models vary in type, origin and level of detail. They are developed for specific professional objectives and use various languages and modelling tools, depending on the sector. Moreover, given the volume of data involved, these data models must rely on a variety of digital tools that allow a holistic approach to the asset in question.
Bridges between BIM and PLM become crucial for the flawless management of complex industrial assets, particularly in view of the volume of data to be managed.
BIM and PLM, two digital methods that have long remained dissociated…
The development of 3D digital mock-ups to support asset design is one of the first steps already achieved. It had become necessary, particularly in the infrastructure field, where a 3D view of assets before construction is essential. The mock-ups are broken down into objects, to which data (or attributes) can be attached. For example, a ‘pump’ object can carry technical performance features (pressure, flow, etc.) and physical features (dimensions, weight, etc.). “In the PBW sector, this is known as BIM (Building Information Modelling) – and it is the same concept.” BIM thus means adding a data model to a 3D mock-up of the related asset, which enriches the amount of information and the ‘multi-sector’ character of the model.
In parallel, the PLM (Product Lifecycle Management) approach enables the management of all information related to a product throughout its entire life cycle, including concepts, methods and collaborative software tools. This became necessary in the systems design field (aircraft industry, automotive, naval, defence), where the volume and diversity of information to be attached to a component or to a system made it impossible to manage without digital assistance. For example, this could mean attaching applicable design requirements to a pump, the configuration of the product for which this pump is used, or all the other systems that have an interface with this pump (pipe systems, control-monitoring, etc.). This approach also allows precise data validity management, through configurable verification circuits involving various stakeholders (these are known as workflows) and indexation by a number of crossed file structures that can sometimes be significant (this is known as PBS, Product Breakdown Structure, FBS, Functional Breakdown Structure, GBS, Geographical Breakdown Structure, etc.).
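The cross-indexing by multiple breakdown structures (PBS, FBS, GBS) described above can be illustrated with a minimal sketch: the same component tag appears in several structures, and a query returns its position in each. The structures, tags and paths are invented for illustration, not drawn from any specific PLM tool.

```python
# Hypothetical illustration of indexing one component in several
# breakdown structures, as PLM tools do: Product (PBS), Functional
# (FBS) and Geographical (GBS) breakdown structures.

breakdown = {
    "PBS": {"PMP-001": "Plant/PrimaryLoop/Pump"},
    "FBS": {"PMP-001": "Functions/Cooling/Circulate"},
    "GBS": {"PMP-001": "Site/BuildingA/Room12"},
}

def locate(tag):
    """Return the path of a component in every breakdown structure
    that references it."""
    return {bs: tree[tag] for bs, tree in breakdown.items() if tag in tree}

# The same physical pump is reachable through three different views.
print(locate("PMP-001"))
```

Each structure answers a different professional question (what is it part of, what function does it serve, where is it installed), which is why PLM tools maintain several of them over the same data.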
…and that it now appears essential to assemble
Convergence between these two types of tools would therefore appear to be:
- Possible, due to increasing IT and computing capabilities, which enable the management of ever greater volumes of data with increasingly complex representations,
- Consistent, due to the compatibility between the data managed and the uses of these tools, which also creates synergies and uses based on data transfer,
- Logical, particularly in the nuclear industry, where the infrastructure and system fields are closely correlated,
- Necessary, due to the increasing complexity of the assets to manage, the volume of data to be processed and the large number of professional sectors to cover, which makes data distribution and sharing increasingly essential for the success of a project.
The development of bridges, interconnections and common use cases, making it possible to efficiently build a hub that handles the functions traditionally attached to the BIM world on the one hand and the PLM world on the other, thus appears unavoidable, particularly in the context of the nuclear industry.
The Asset Information Hub
However, this BIM/PLM convergence approach should not be limited to that. It must include several other projects, tools and actions that share the same aims. Indeed, other approaches apply to the same engineering and asset design work, aim to support an optimised asset management approach, and use IT and digital tools (for example, project or programme management software). This means building an asset management hub around a BIM/PLM convergence. The hub should include the bricks needed for the identified use cases, whose interactions are encoded by a purpose-designed data model: an Asset Information Hub. More precisely, the Asset Information Hub involves an approach comprising three phases:
- Precise understanding of the requirements and the expected result in functional terms,
- Translation of the requirements into data flows and consumption,
- Construction of a digital architecture based on existing bricks, with the objective of materialising the data flows.
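The three phases above can be sketched in miniature: a use case is translated into required data flows, and the architecture is then built by keeping only the flows whose producing and consuming bricks actually exist in the inventory. Everything here (the flow tuples, brick names, use case) is a hypothetical illustration of the method, not a real configuration.

```python
# Hypothetical sketch of the three-phase Asset Information Hub approach:
# 1) a use case is understood in functional terms,
# 2) translated into data flows (data item, producing brick, consuming brick),
# 3) realised on the software bricks actually available.

use_case = "impact analysis of a modified requirement"

data_flows = [
    ("requirement",  "PLM", "BIM"),  # push requirement changes to the mock-up
    ("3D object ID", "BIM", "PLM"),  # link mock-up objects back to PLM records
    ("work order",   "PLM", "CAMM"), # maintenance brick not yet in the hub
]

available_bricks = {"PLM", "BIM"}

def architecture(flows, bricks):
    """Keep only the flows whose producer and consumer are both present."""
    return [f for f in flows if f[1] in bricks and f[2] in bricks]

realised = architecture(data_flows, available_bricks)
print(len(realised))  # 2: the CAMM flow waits for its brick
```

The value of the exercise is that missing bricks surface as unrealisable flows, which tells the project exactly which tool or interface to specify next.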
Within this context, the Asset Information Hub is naturally based on a set of existing or specifiable software and digital bricks, including BIM and PLM, which is the reason for the convergence need expressed above. More specific engineering software (fluid or lighting modelling tools, project management software, etc.) can also be added to this inventory.
In terms of implementation, there are many use cases. BIM/PLM interfacing enables the rapid identification of the impact of a requirement or of a modification; a bridge between BIM, PLM and CAMM (Computer Assisted Maintenance Management) allows configuration management and stock management to be integrated in graph format.
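The "graph format" integration mentioned above can be sketched as a directed graph linking PLM requirements, BIM objects and CAMM records, with impact analysis reduced to a graph traversal. The nodes and edges are entirely hypothetical, invented to show the mechanism.

```python
from collections import deque

# Hypothetical sketch: PLM, BIM and CAMM records linked in a graph so
# that the impact of a modification can be found by traversal.

edges = {
    "REQ-0042":   ["PMP-001"],              # PLM requirement -> component
    "PMP-001":    ["3D:PMP-001", "WO-77"],  # component -> BIM object, work order
    "3D:PMP-001": [],                       # BIM mock-up object
    "WO-77":      ["STOCK:seal-kit"],       # CAMM work order -> spare part stock
    "STOCK:seal-kit": [],
}

def impacted(node):
    """All records reachable from a modified node (breadth-first search)."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for nxt in edges.get(current, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Modifying the requirement touches the mock-up, maintenance and stock.
print(sorted(impacted("REQ-0042")))
```

A change to one requirement thus surfaces every downstream record, across tools, in a single query; this is precisely what the document-centric approach cannot do without manual cross-checking.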
Artificial intelligence as a complement
This approach to system engineering, which is tool-based or even model-based, provides a collaborative environment that brings together all the data and technological building blocks needed to secure the various phases of a project. It is then possible, for example, to exploit artificial intelligence (AI) technologies to improve the efficiency of the requirements engineering phases. This can also serve as a basis for developing digital twins of complex systems, or for optimising the operation of facilities. “For example, data mining tools can be used for the requirements capture and breakdown phases, or tools implementing rule engines to capitalise on feedback. Data-driven decision support can also be provided to better manage the various stages and secure schedules.”
In fact, it is unthinkable that everything could, must or should be modelled; and, in any case, who would accept or undertake such a task? The challenge here is knowing how to build (ontological and semantic approaches), how to find and provide data from existing and remote databases, and how to analyse and align reference frameworks with sector needs, in order to optimise and secure the various phases, from design to operation.
Towards the Digital Twin
When the right tools and concepts have been defined, the Asset Information Hub can then act as a real digital twin of the related asset, based on the real asset data collected throughout its life cycle, incorporating its 3D representation and the IT bricks needed to simulate and/or predict its most relevant behaviour characteristics.
However, the construction of a platform integrating all or part (depending on project requirements) of the dimensions listed above poses a major problem, intrinsically linked to the data manipulated at all stages of the process. It is not only a question of allowing the various tools, as well as their specifiers, integrators and developers, to access data that is frequently contained in one or more documents, but also of linking these data to each other and to the physical elements of the installation. Finally, it is necessary to carry out these activities while avoiding inconsistencies between data when they are stored in several places.
MBSE data models
The MBSE (Model Based System Engineering) approach provides a partial response to the data consistency problem, with the transition from document-based engineering to model-based engineering.
These models are intended to represent and formalise the installation or system concerned (known as the system of interest, or "system to be made"), its operational environment (in which this system of interest will have to evolve during its life cycle) and finally the project system, or "system for making" (which provides models for planning, resource profiles, dashboards and indicators, cost calculation, etc.).
“These models are essential, for example, to achieve the most complete, credible and plausible checking and validation of the system of interest, even before it is developed and delivered to the customer. In the context of a data organisation scheme - or data model - they will host, for example, the interfaces, or the list of materials of the systems under study.”
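The two artefacts the expert mentions, the interfaces and the list of materials, can be sketched in a minimal MBSE-style model: sub-systems of the system of interest, typed interfaces between them, and a bill of materials derived from the model rather than maintained by hand. All names and parts below are hypothetical illustrations.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal MBSE-style model of a system of interest:
# sub-systems, the interfaces between them, and a derived
# bill of materials.

@dataclass
class SubSystem:
    name: str
    parts: list = field(default_factory=list)

@dataclass
class Interface:
    source: str
    target: str
    flow: str   # what is exchanged across the interface

cooling = SubSystem("cooling", ["PMP-001", "PIPE-010"])
control = SubSystem("control", ["PLC-003"])
interfaces = [Interface("control", "cooling", "pump start/stop command")]

def bill_of_materials(subsystems):
    """Derive the flat parts list from the model itself."""
    return sorted(part for s in subsystems for part in s.parts)

print(bill_of_materials([cooling, control]))
```

Because the interfaces and parts live in the model, they can be checked for consistency (every interface endpoint must be a modelled sub-system) before anything is built, which is the verification-before-delivery benefit described in the quote.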
This MBSE approach is, along with BIM and PLM, an essential element in the design of complex systems combining infrastructure and product. All these elements facilitate and feed the engineering processes with representations that can be confidently shared between all the people involved. At the same time, each component provides specific functionalities; PLM in particular completes this approach by guaranteeing data uniqueness and configuration management.
“At Assystem, we use modelling languages, the Arcadia method and the Capella environment to apply this MBSE approach, also providing the link with PLM hubs (such as the Dassault Systèmes 3DExperience). To complement this, we are launching an industrial chair with IMT Mines Alès, to study and try to resolve the conceptual blockages (what? why?), method blockages (how?) and technical blockages (with what?) encountered in effectively deploying the MBSE approach.”
The human at the centre of tool utilisation
These technologies must give engineers the time to do what they were trained to do: design, model and simulate systems. It is therefore very important to give meaning to the introduction of these technologies by choosing practical cases that are quick to set up, in order to obtain rapidly exploitable results.
It is also essential to bear in mind that these tools are intended to be used by as many people in engineering as possible, across all generations: those who saw the arrival of NICTs (New Information and Communication Technologies) and digital natives alike. Consequently, both the interface and the support and change management needed for their adoption are components that should not be neglected.
The expert concludes: “The human must always remain at the centre of the utilisation of these tools and of the implementation of the related processes. Change management is the key, and is certainly the most delicate factor to successfully achieve”.