
Imagine you are a supplier working for several customers, such as big OEMs or smaller companies. In Dec 2020, I wrote about PLM and the Supply Chain because it was an underexposed topic in many companies. Suppliers need their own PLM environment and IP protection while working as efficiently as possible with their customers, often the OEMs.

Most PLM implementations start by creating the ideal internal collaboration between functions in the enterprise – historically starting with R&D and Engineering, then expanding to Manufacturing, Services and Marketing, most of the time in this logical order.

In these implementations, people are not paying much attention to the total value chain: customers and suppliers. That was one of the interesting findings at the time, supported by surveys from Gartner and McKinsey:

  • Gartner: Companies reported improvements in the accuracy of product data and product development as the main benefit of their PLM implementation. They did not see so much of a reduced time to market or reduced product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here the lead times did not change, nor did the number of changes.
  • McKinsey: In their article, The Case for Digital Reinvention, digital supply chains were mentioned as the area with the potentially highest ROI; however, as the image below shows, it was the area with the lowest investment at that time.

In 2020 we were in the middle of broken supply chains and wishful thinking related to digital transformation, all due to COVID-19.

Meanwhile, further digitization in PLM (systems of engagement) and a new topic, sustainability of the supply chain, became visible.

Therefore, it is time to take stock again, also driven by discussions in the past few weeks.

 

The old “connected” approach (lose-lose).

A preferred way for OEMs in the past was to have the Supplier or partner work directly in their PLM environment. The OEM could keep control of the product development process and the incremental maturity of the BOM, while the Supplier could connect their part data and designs to the OEM environment.

The advantage for the OEM is clear – direct visibility of the supplier data when available. The benefit for the Supplier could also be immediate visibility of the broader context of the part they are responsible for.

However, the disadvantages for a supplier are more significant. Working in the OEM environment exposes all your IP and hinders knowledge capitalization by the Supplier. Perhaps not a big issue for a tier 3 supplier; however, the more advanced the Supplier's products are, the higher the need for its own PLM environment.

Therefore, the old connected approach is a lose-lose relationship, in particular for the Supplier, and even for the OEM (which ends up with less knowledgeable suppliers).

 

The modern “connected” approach (wins t.b.d.)

In this situation, the target infrastructure is a digital one, where datasets are connected in real-time, providing the various stakeholders access to a filtered set of data relevant to their roles.

In my terminology, I refer to them as Systems of Engagement, as the target is that all stakeholders work in this environment.

The counterpart of Systems of Engagement is the System of Record, which provides a product baseline, manufacturing baseline and configuration baseline of information consumed by other disciplines.

These baselines are often called Bills of Information, and the traditional PLM system has been designed as a System of Record. The major Bills of Information are the eBOM, the mBOM and sometimes the sBOM (service BOM).
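To make these Bills of Information tangible, here is a minimal sketch in Python of how the same released part can appear in different BOM views. All names and attributes are invented for illustration – this is not any vendor's data model:

```python
# Illustrative only: invented names/attributes, not any vendor's data model.
from dataclasses import dataclass, field

@dataclass
class Part:
    number: str        # enterprise part number
    revision: str      # released revision in the System of Record

@dataclass
class BomLine:
    part: Part
    quantity: float

@dataclass
class Bom:
    view: str          # "eBOM", "mBOM" or "sBOM" - same parts, different intent
    parent: Part
    lines: list[BomLine] = field(default_factory=list)

motor = Part("P-1001", "B")
ebom = Bom("eBOM", Part("A-2000", "A"), [BomLine(motor, 1)])
# The mBOM view adds items engineering never modeled, e.g. a consumable:
mbom = Bom("mBOM", Part("A-2000", "A"),
           [BomLine(motor, 1), BomLine(Part("P-GREASE", "A"), 0.05)])
```

The point of the sketch: the baselines share the same released parts but serve a different intent per view.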

Typical examples of Systems of Engagement I have seen in alphabetical order are:

  • Arena Solutions has long-term experience in BOM collaboration between engineering teams, suppliers and contract manufacturers.
  • CATENA-X might be a strange player in this list, as CATENA-X is more a German automotive consortium targeting digital collaboration between stakeholders, ensuring security and IP protection.
  • CoLab is a provider of cloud-based collaboration software allowing design teams and suppliers to work together in real time.
  • Onshape – a cloud-based collaborative product design environment for dispersed engineering teams and partners.
  • OpenBOM – a SaaS solution focusing on BOM collaboration, connected to various CAD systems, along with design teams and their connected suppliers.

These are some of the Systems of Engagement I am aware of. They focus on specific value streams that can improve time to market and the efficiency of product introduction. In companies with no extensive additional PLM infrastructure, they can become crucial systems of engagement.

The main challenge for these systems of engagement is how they will connect to traditional Systems of Record – the classical PLM systems that we know in the market (Aras, Dassault, PTC, Siemens).
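To illustrate the pattern – not any vendor's actual API – the sketch below shows the typical direction of such a connection: the System of Engagement works on live data and pushes only a released, immutable baseline into the System of Record. Endpoint, payload and token are hypothetical:

```python
# Illustrative pattern only: URL, payload and token are hypothetical.
# Real systems (Aras, Windchill, Teamcenter, 3DEXPERIENCE) each have their own APIs.
import requests

baseline = {
    "product": "A-2000",
    "revision": "A",
    "lines": [{"part": "P-1001", "revision": "B", "quantity": 1}],
}

# The System of Engagement works on live data; only a released,
# immutable snapshot crosses over into the System of Record.
response = requests.post(
    "https://plm.example.com/api/baselines",   # hypothetical endpoint
    json=baseline,
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
response.raise_for_status()
```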

Image on the left from a presentation done by Eric Herzog from SAAB at last year’s CIMdata/PDT conference.

You can read more about this here.

When a mix of Systems of Engagement and Systems of Record is established and digitally connected in your organization, we will see overall benefits. My earlier thoughts, in general, are here: Time to split PLM?

The almost Connected approach

As I mentioned, in most companies, it is already challenging to manage their internal System of Record, which is needed for current operations and the traceability of information. In addition, most of the data stored in these systems is document-driven, not designed for real-time collaboration. So how would these companies collaborate with their suppliers?

The Model-Based Enterprise

In the bigger image below, I refer to an image published by Jennifer Herron in her book Re-use Your CAD, where she describes the various stages of interaction between engineering, manufacturing and the extended enterprise.

Her mission is to promote and educate organizations in moving to a Model-Based Definition and, in the long term, to a Model-Based Enterprise.

The ultimate target of information exchange in this diagram is that the OEM and the Supplier remain separate entities. However, they can exchange Digital Product Definition Packages and Technical Data Packages (TDPs) electronically over the web. In this exchange, we have a mix of systems of engagement and systems of record on both the OEM and Supplier sides.

Depending on the type of industry, many suppliers in my ecosystem of companies are still at level 2, dreaming of or pushed to become level 3, illustrating there is a difficult job to do – learning new practices. And why would you move to the next level?

Every step can have significant benefits, as reported by companies that did this.

So what’s stopping your company from moving ahead? People, Processes, Skills, Work Pressure? It is one of the most common excuses: “We are too busy, no time to improve”.

A supply chain collaboration hub

On March 21, I discussed with Magnus Färneland from Eurostep their cloud-based PLM collaboration hub, ShareAspace. You can read the interview here: PLM and Supply Chain Collaboration.

I believe this concept can be compelling for a connected enterprise. The OEM and the Supplier share (or connect) only the data they want to share, preferably based on the PLCS data schema (ISO 10303-239).

In a primitive approach, this can be BOM structures with related files; however, in the advanced mode, it could become a real model-based connection hub.
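To make the principle tangible, the snippet below is an invented illustration of the “share only what both parties agreed on” idea behind such a hub – it is not the PLCS (ISO 10303-239) schema itself:

```python
# Not the PLCS schema - an invented sketch of sharing only agreed attributes.
internal_item = {
    "part": "P-1001",
    "revision": "B",
    "description": "Drive unit",
    "interface_spec": "IF-88",   # needed by the supplier
    "cost": 412.50,              # internal IP - never shared
    "margin_notes": "...",       # internal IP - never shared
}

SHARED_KEYS = {"part", "revision", "description", "interface_spec"}

def to_hub(item: dict) -> dict:
    """Keep only the attributes both parties agreed to exchange via the hub."""
    return {k: v for k, v in item.items() if k in SHARED_KEYS}

print(to_hub(internal_item))
# {'part': 'P-1001', 'revision': 'B', 'description': 'Drive unit', 'interface_spec': 'IF-88'}
```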

Now you might ask yourself why this solution is not booming.

In my opinion, there are several points to consider:

  • Who designs, operates and maintains the collaboration hub?
    It is likely not the suppliers, and when the OEM takes ownership, they might believe there is no need for the extra hub; just use the existing PLM infrastructure.
  • Could a third party find a niche market for this? Eurostep has already been working on this for many years, but adoption of the concept seems higher in the BIM and Asset Management domains. Here, the owner/operator sees the importance of a collaboration hub.

A final remark: we are still far from a connected enterprise; concepts like Catena-X and others need to mature to serve as a foundation. There is a lot of technology out there – now we need the skilled people and tested practices to use the right technology and tune solution concepts.

Sustainability demands a connected enterprise.

I focused on the Supplier dilemma this time because it is one of the crucial aspects of a circular economy and sustainable product development.

Only by using virtual models of the To-Be products/systems can we seriously optimize them. Virtual models and Digital Twins do not run on documents; they require accurate, connected data from anywhere.

You can read more details in my post earlier this year: MBSE and Sustainability or look at the PLM and Sustainability recording on our PLM Global Green Alliance YouTube channel.

Conclusion

Due to various discussions I recently had in the field, it became clear that the topic of supplier integration in a best-connected manner is one of the most important topics to address in the near future. We can no longer treat our company as an isolated entity – value streams implemented in a connected manner become a must.

And now I am going to enjoy Liveworx in Boston, learning, discussing and understanding more about what PTC is doing and planning in the context of digital transformation and sustainability. More about that in my next post: The week(end) after Liveworx 2023 (to come)

I am writing this post because one of my PLM peers recently asked me this question: “Is the BOM losing its position?” He was in discussion with another colleague who told him:

“If you own the BOM, you own the Product Lifecycle”.

This statement made me think of a recent post from Jan Bosch: Product Development fallacy #8: the bill of materials has the highest priority.

Software is increasingly becoming an essential part of the final product, and combined with Jan's expertise in software development, this led him to write the article. I recommend reading the full post (4 min read) and then browsing through the comments.

If you cannot afford these 10 minutes, here is my favorite quote from the article:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

Where did the BOM focus come from? A historical overview related to the rise (and fall) of the BOM.

 

In the beginning, there was the drawing.

Before the era of computers, there was “THE drawing”, describing assemblies, subassemblies or parts. And on the drawing, you could find the parts list, if relevant. This parts list was the first Bill of Material, describing the parts/materials shown on the drawing.

 

Next came MRP/ERP

With the introduction of the MRP system (Material Requirements Planning), it was the first time that, by using computers, people could collect and process the material requirements for a product as data. Entering new materials/parts described on drawings was still a manual process, as was referring to existing parts on the drawing. Reuse of parts was a manual activity based on individual knowledge.

In the nineties, MRP evolved into ERP (Enterprise Resource Planning), which included the MRP part and added resource and manufacturing planning and financial reporting.

The ERP system became the most significant IT system, the execution system of the company. As it was the first enterprise system implemented, it was also the first time we learned about implementation challenges – people, change management and budget overruns. However, as the ERP system brought visibility to the company's execution, it became a “must-have” system for management.

The introduction of mainstream 2D CAD did not affect the company’s culture so much. Drawings became electronic drawings, and the methodology of the parts list on the drawing remained.

Sometimes the interaction with the MRP/ERP system was enhanced by an interface – sending the drawing BOM to ERP. The advantage of the interface: no manual transfer of data, reducing typos and BOM errors. The disadvantages at that time: relatively expensive (connectivity between systems was a challenge) and mostly one-directional.

 

And then there was PDM.

In parallel with the introduction of ERP systems, mainstream 3D CAD systems became affordable, particularly SolidWorks, Solid Edge and Inventor. These 3D CAD systems allowed sharing of parts and assemblies in different products, and the PDM database was the first aid to support part reuse, versioning and standardization.

By extracting the parts from the assemblies and subassemblies, it was possible to generate a BOM structure in the PDM system to be transferred or typed into the ERP system. We did not talk about EBOM or MBOM then, as there was only one BOM in the ERP system, and the PDM system was a tool to feed the ERP system.
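A minimal sketch of what this early PDM automation did: walk the CAD assembly tree and roll it up into a flat parts list with quantities for ERP. The structures and part numbers are invented:

```python
# Invented structures: parent -> list of children; a repeated child means qty > 1.
from collections import Counter

assembly = {
    "A-2000": ["A-2100", "A-2100", "P-1001"],
    "A-2100": ["P-3001", "P-3002"],
}

def roll_up(node: str, bom: Counter, qty: int = 1) -> None:
    """Recursively accumulate the total quantity of every item under one top assembly."""
    for child in assembly.get(node, []):
        bom[child] += qty
        roll_up(child, bom, qty)

bom = Counter()
roll_up("A-2000", bom)
print(dict(bom))
# {'A-2100': 2, 'P-3001': 2, 'P-3002': 2, 'P-1001': 1}
```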

Many companies still base their processes on this approach. ERP (read: SAP nowadays) is the central execution system, and PDM is an external system. You might remember the story and image from my previous post about people, processes and tools. The bad practice example: asking the ERP system to provide a part number when starting to design a part.

 

And then products started to change.

In the early 2000s, I worked with SmarTeam to define the E&E (Electronics and Electrical) template. One of the new concepts was to synchronize all design data coming from different disciplines to a single BOM structure.

It was the time we started to talk about the EBOM: a type of BOM that serves as the structure to consolidate all the design data, based on parts.

The EBOM, most of the time, reflects the design intent in logical groups, and sending the relevant parts in the correct order to the ERP system was a favorite (and expensive) customization for service providers. How do you transfer an engineering BOM view to an ERP system that only understands the manufacturing view?
Note: not all ERP systems have the data model to differentiate between engineering parts and manufacturing parts

The image below illustrates the challenge and the customer’s perception.

The automated link between the design side (EBOM) and manufacturing side (MBOM) was a mission impossible – too many exceptions for the (spaghetti) code.

 

And then came the MBOM.

The identified issues connecting PDM and ERP led to the concept of implementing the MBOM in the PLM system. The MBOM in PLM is one of the characteristics of a PLM implementation compared to a PDM implementation. In a traditional PLM system, there is an interaction and connection between the EBOM and MBOM: EBOM parts should end up as MBOM parts. This interaction can be supported by automation; however, as both BOMs live in the same system, manual changes remain possible.
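A simplified sketch of such EBOM-to-MBOM automation: design-only (phantom) levels are collapsed, and manufacturing-only items are added. The rules and part numbers are invented; real implementations carry many more exceptions – which is precisely why manual changes remain possible:

```python
# Invented example: collapse "phantom" (design-only) levels and add
# manufacturing-only items. Real rules have many more exceptions.
ebom = [
    {"part": "A-2100", "qty": 2, "phantom": True,
     "children": [{"part": "P-3001", "qty": 1, "phantom": False, "children": []}]},
    {"part": "P-1001", "qty": 1, "phantom": False, "children": []},
]

def derive_mbom(lines: list, parent_qty: int = 1) -> list:
    """Flatten phantom levels so the MBOM reflects how the product is actually built."""
    mbom = []
    for line in lines:
        qty = line["qty"] * parent_qty
        if line["phantom"]:
            mbom += derive_mbom(line["children"], qty)  # skip the phantom level
        else:
            mbom.append({"part": line["part"], "qty": qty})
    return mbom

mbom = derive_mbom(ebom)
mbom.append({"part": "P-PACK", "qty": 1})  # packaging: exists only in the mBOM
print(mbom)
# [{'part': 'P-3001', 'qty': 2}, {'part': 'P-1001', 'qty': 1}, {'part': 'P-PACK', 'qty': 1}]
```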

The MBOM structure in PLM could then be the information structure to transfer to the ERP system; however, there is more, as Jörg W. Fischer wrote in his provoking post Die MBOM muss weg (The MBOM must go). He rightly points out (in German) that the MBOM is not a structure on its own but a combination of different views based on Assembly Drawings, Process Planning and Material Requirements.

His conclusion:

Calling these structures MBOM is trying to squeeze all three structures into one. That usually doesn't work and then leads to much more emotional discussions in the project. It also costs a lot of money. It is, therefore, better not to use the term MBOM at all.

And indeed, just having an MBOM in your PLM system might help you to prepare some of the manufacturing steps, the needed resources and parts. The MBOM result still has to be localized at the local plant where the manufacturing takes place. And here, the systems used are the ERP system and the MES system.

The main advantage of having the MBOM in the PLM system is the direct relation between specification and manufacturing intent, allowing manufacturing engineering to work collaboratively with engineering in the same environment.

  • The first benefit is fewer iterations and a shorter time to production, thanks to early interaction and manufacturing involvement in the engineering process.
  • The second benefit: product knowledge is centralized in a single system. Consolidating your product knowledge in ERP does not make sense due to plant-specific localization and the missing capabilities to manage iterative engineering processes on parts that do not yet exist.

 

And then came the SBOM, the xBOM

Traditional PLM vendors and implementations kept using xBOM structures as placeholders for related specification data (mechanical designs, electrical, software deliverables, serialized products) – most of the time, related files.

And with this approach, when talking about the digital thread, PLM systems also touch on the concepts of Configuration Management.

I will not go into the details here but look at the two images by clicking on them and see a similar mindset.

It is about the traceability of information in structures and systems. These structures work well in a relatively static and linear product development and delivery environment, as illustrated below:

Engineering change and release processes are based on managing the changes in different structures from the left to the right.

 

And then came software!

Modern connected products are no longer purely mechanical products. The product's functionality no longer depends on the mechanical properties but mainly on the embedded electronics and software used. For example, look at the mechanical design of a telecom transmission tower – its behavior comes mainly from non-mechanical components, and they can change over time. Still, the Bill of Material contains a lot of concrete and steel parts.

The ultimate example is comparing a Tesla (software on wheels) with a traditional car. For modern connected products, electronics and software need to be part of the solution. Software and electronics allow the product to be upgraded over time. Managing these products in the same manner as mechanical products is impossible, inefficient and therefore threatening your company’s future business.

I requote Jan Bosch:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

 

The model-based, connected enterprise

I will not solve the puzzle of the future in this post. You can read my observations in my series: The road to model-based and connected PLM. We need a new infrastructure with at least two modes. One that still serves as a System of Record, storing information in a traditional manner, like a Bill of Materials for the static parts, as not everyone and everything can be connected.

In addition, we need various Systems of Engagement that enable close to real-time interaction between products (systems) and the stakeholders relevant to the engagement scope (multidisciplinary / consumers).

Digital twins are examples of such environments. Currently, these Systems of Engagement often work disconnected from the System of Record due to the lack of understanding of how to connect. (standard connectors? / OSLC?)
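For the curious, here is a small sketch of what an OSLC-style connection could look like: resources are linked over HTTP and RDF instead of copied between systems. The resource URIs are hypothetical; the oslc_rm:elaborates link type comes from the OSLC Requirements Management vocabulary:

```python
# Sketch: OSLC links resources across systems with HTTP + RDF instead of copying data.
# The resource URIs are hypothetical; oslc_rm:elaborates is a real OSLC-RM link type.
from rdflib import Graph, Namespace, URIRef

OSLC_RM = Namespace("http://open-services.net/ns/rm#")

g = Graph()
twin_artifact = URIRef("https://twin.example.com/pump-7/vibration-model")
requirement = URIRef("https://plm.example.com/req/REQ-0042")

# "This digital-twin artifact elaborates that requirement in the System of Record."
g.add((twin_artifact, OSLC_RM.elaborates, requirement))
print(g.serialize(format="turtle"))
```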

Our mission is to explore, as I wrote in my post Time to split PLM and drop our mechanical mindset.

And while I was finalizing this post, I read a motivating post from Jan Bosch again, for all of you working on understanding and pushing the digital transformation in your ecosystem.
The title: Be the protagonist of your life: 15 rules. A starting point for more to come.

 

Conclusion

The BOM is no longer the master of the product lifecycle when it comes to managing connected products, where functionality mainly depends on software. BOM structures with related documents are just one of the extracted baselines from a data-driven, connected enterprise. This traditional PLM infrastructure requires other, non-BOM-driven structures to represent the actual status of a virtual or physical product.
The BOM is not dead, but there is more …

Your thoughts?

In this post, I want to explain why Model-Based Systems Engineering (MBSE) and Sustainability are closely connected. I would claim sustainability in our PLM domain will depend on MBSE.

Can we achieve Sustainability without MBSE? Yes, but it will be costly and slow. And as all businesses want to be efficient and agile, they should consider MBSE.

 

What is MBSE?

The abbreviation MBSE stands for Model-Based Systems Engineering, a specialized way to perform Systems Engineering. In short, the Wikipedia definition:

MBSE is a technical approach to systems engineering that focuses on creating and exploiting domain models as the primary means of information exchange rather than on document-based information exchange.

Model-Based fits in the digital transformation scope of PLM – from a document-based approach to a data-driven, model-based one. In 2018, I focused on facets of the model-based enterprise and related to MBSE in this post: Model-Based: System Engineering (MBSE).

My conclusion in that post was:

Model-Based Systems Engineering might have been considered as a discipline for the automotive and aerospace industry only. As products become more and more complex, thanks to IoT-based applications and software, companies should consider evaluating the value of model-based systems engineering for their products/systems.

I drew this conclusion before I focused on sustainability and systems thinking. Implementing sustainability concepts, like the Circular Economy, requires more complex engineering efforts, justifying a Model-Based Systems Engineering approach. Let's have a look.

If you want to learn more about why we need MBSE, look at this excellent keynote lecture by Zhang Xin Guo at the INCOSE 2018 conference below:

The Mission / the stakeholders

A company might deliver products to the market with the best price/quality ratio and regulatory compliance, as perceived and checked by the market. This approach focuses purely on economic parameters.

There is no need for a systems engineering approach, as the complexity is manageable. The mission is more linear, a “job to do,” and a limited number of stakeholders are involved in this process.

… with sustainability

Once we start to include sustainability in our product's mission, we need a systems engineering approach, as several factors will push for different considerations. The most obvious considerations are the choice of materials and the optimization of the production process (reducing carbon emissions).

However, the repairability/serviceability of the product should be considered with a more extended lifetime vision.

What about upgradeability and reusing components? Will the customer pay for these extra sustainable benefits?

Probably Yes, when your customer has a long-term vision, as the overall lifecycle costs of the product will be lower.

Probably No, if your competitors deliver non-sustainable products much cheaper.

As long as regulations do not hurt traditional business models, there might be no significant change.

However, the change has already started. Higher energy prices will impact the production of specific resources and raise costs. In addition, energy-intensive manufacturing processes will lead to more expensive materials. Combined with rising carbon taxes, this will be a significant driver for companies to reconsider their product offering and manufacturing processes.

The more expensive it becomes to create new products, the more attractive repairable and upgradable products will become. And this brings us to the concept of the circular economy, which is one of the pillars of sustainability.

In short, looking at the diagram – the vertical flow from renewables and finite materials from part to product to product in service leads ultimately to wasted resources if there are no feedback loops. This is the traditional product delivery process that most companies are using.

You can click on the image to the left to zoom in on the details.

The renewable loop on the left side of the diagram is the usage of renewables during production and the use of the product. The more we use renewables instead of fossil fuels, the more sustainable this loop will be. This is the area where engineers should use simulations to find the optimal manufacturing processes and product behavior. Again click on the image to zoom in on the details.

The right side of the loop, related to the materials, is where we see the options for repairable, serviceable, upgradeable, and even further refurbishable and recyclable products, avoiding the leakage of precious materials. This is where mechanical engineers should dominate the activities, focusing on each of the loops and how to enable them in the product. Click on the image to see the relevant loops.

Looking at the circular economy diagram, it is clear that we are no longer talking about a linear process – it has become the implementation of a system. Systems Engineering or MBSE?

 

The benefits of MBSE

Developing products with the circular economy in mind is no longer a “job to do,” a simple linear exercise. Instead, if we walk down the systems engineering V-shape, there are a lot of modeling exercises to perform before we reach the final solution.

To illustrate the benefits of MBSE, let’s walk through the following scenario.

A well-known company sells lighting projects for stadiums and public infrastructure. Their current business model is based on reliable lighting equipment with a competitive price and range of products.

Most of the time, their contracts have clauses about performance/cost and maintenance. The company sells the products when they win the deal and deliver spare parts when needed.

Their current product design is quite linear – without systems engineering.

Now this company has decided to change its business model towards Product as a Service, or in their terminology, LaaS (Lighting as a Service). For a certain amount per month, they will provide lighting to their customers – a stadium, a city, a road infrastructure.

To implement this business model, this is how they used a Model-Based Systems Engineering approach.

Modeling the Mission

Example of a business model

Before even delivering any products, the process starts with describing and analyzing the business model needed for Lighting as a Service.

Then, with modeled estimates of the material costs, there are exercises about the resources required to maintain the service, the potential market, and the possible price range.

It is the first step of using a model to define the mission of the service. After that, the model can be updated, adjusted, and used for a better go-to-market approach when the solution becomes more mature.

Part of the business modeling is also the intention to deliver serviceable and upgradeable products. As the company now owns the entire lifecycle, this is the cheapest way to guarantee a continuous or improved service over time.

Modeling the Functions

Example of a function diagram

Providing Lighting as a Service also means you must be in touch with your installations in real time. Power consumption needs to be measured and analyzed in real-time for (predictive) maintenance, and the light-providing service should be as cheap as possible during operation.

Therefore, LED technology is the most reliable choice, and connectivity functions need to be implemented in the solution. The functional design ensures installation, maintenance and service can be done in a connected manner (cheapest in operation – beneficial for the business).

Modeling the Logical components

As an owner of the solution, the design of the logical components of the lighting solution is also crucial. How to address various lighting demands efficiently? Modularity is one of the first topics to address. With modular components, it is possible to build customer-specific solutions with a reduced engineering effort. However, the work needs to be done by generically designing the solutions and focusing on the interfaces.

Example of a logical diagram

Such a design starts with a logical process and flow diagrams combined with behavior modeling. Without already having a physical definition, we can analyze the components’ behavior within an electrical scheme. Decisions on whether specific scenarios will be covered by hardware or software can be analyzed here. The company can define the lower-level requirements for the physical component by using virtual trade-offs on the logical models.

At this stage, we have used business modeling, functional modeling and logical modeling to understand our solution’s behavior.

Modeling the Physical product

The final stage of the solution design is to implement the logical components into a physical solution. The placement of components and interfaces between the components becomes essential. For the physical design, there are still a lot of sustainability requirements to verify:

  • Repairability and serviceability – are the components reachable and replaceable? Reducing the lifecycle costs of the solution.
  • Upgradeability – are there components that can behave differently due to software choices, or components that can be replaced with improved functionality? Reducing the cost of creating entirely new solutions.
  • Reuse & recyclability – are the materials used in the solution recyclable or reusable? Reducing the cost of new materials or the cost of dumping waste.
  • RoHS/REACH compliance.

The image below from Zhang Xin Guo's presentation nicely demonstrates the iterative steps before reaching a physical product.

Before committing to a hardware implementation, the virtual product can be analyzed, its behavior can be simulated, and its carbon impact can be calculated for the various potential variants.

The manufacturing process and energy usage during operation are also part of the carbon impact calculation. The best-performing virtual solution, including its simulation models, can be chosen for the realization to ensure the most environmentally friendly solution.
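A toy example of such a virtual trade-off: scoring design variants on cost plus an internal carbon price before committing to hardware. All numbers are invented for illustration:

```python
# All numbers invented. The point: compare variants virtually, before hardware exists.
variants = {
    "V1 aluminium housing":  {"kg_co2e": 120.0, "cost_eur": 950.0},
    "V2 recycled aluminium": {"kg_co2e": 70.0,  "cost_eur": 1010.0},
    "V3 steel housing":      {"kg_co2e": 95.0,  "cost_eur": 880.0},
}

CARBON_PRICE = 0.10  # assumed internal carbon price, EUR per kg CO2e

def total_score(v: dict) -> float:
    """Lifecycle cost including an internal price on emissions."""
    return v["cost_eur"] + CARBON_PRICE * v["kg_co2e"]

best = min(variants, key=lambda name: total_score(variants[name]))
print(best)  # 'V3 steel housing' with these numbers; raise CARBON_PRICE enough and V2 wins
```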

 

The digital twin for follow-up

Once the solution has been realized, the company still has a virtual model of the solution. By connecting the physical product’s observed and measured behavior, the virtual side’s modeling can be improved or used to identify improvement candidates – maintenance or upgrades. At this stage, the virtual twin is the actual twin of the physical solution. Without going deeper into the digital twin at this stage, I hope you also realize MBSE is a starting point for implementing digital twins serving sustainability outcomes.

The image below, published by Boeing, illustrates the power of the connected virtual and physical world and the various types of modeling that help to assess the optimal solution.

Conclusion

For sustainability, it all starts with the design. The design decisions for the product contribute roughly 80% of the carbon footprint of the solution. Afterward, optimization is possible only within smaller margins. MBSE is the recommended approach to get a trustworthy understanding and follow-up of the product's environmental impact.

What do you think – can we create sustainable products without MBSE?

 

A month ago, I wrote: It is time for BLM – PLM is not dead, which created the anticipated discussion. It is practically impossible to change a framed acronym. Like CRM and ERP, the term PLM is here to stay.

However, it was also interesting to see that people acknowledge that PLM should have a business scope and deserves a place at the board level.

The importance of PLM at the business level is well illustrated by the discussion related to this LinkedIn post from Matthias Ahrens, referring to the CIMdata roadmap conference CEO discussion.

My favorite quote:

Now it’s ‘lifecycle management,’ not just EDM or PDM or whatever they call it. Lifecycle management is no longer just about coming up with new stuff. We’re seeing more excitement and passion in our customers, and I think this is why.”

But it is not that simple

This is a perfect message for PLM vendors to justify their broad portfolio. However, as they do not focus so much on new methodologies and organizational change, their messages remain at the marketing level.

In the field, there is more and more awareness that PLM has a dual role. Just when I planned to write a post on this topic, Adam Keating, CEO and founder of CoLab, wrote the post System of Record meet System of Engagement.

Read the post and the comments on LinkedIn. Adam points to PLM as a System of Engagement, meaning an environment where the actual work is done all the time. The challenge I see for CoLab, like other modern platforms, e.g., OpenBOM, is how it can become an established solution within an organization. Their challenge is they are positioned in the engineering scope.

I believe for these solutions to become established in a broader customer base, we must realize that there is a need for a System of Record AND System(s) of Engagement.

In my discussions related to digital transformation in the PLM domain, I addressed them as separate, incompatible environments.

See the image below:

Now let’s have a closer look at both of them

What is a System of Record?

For me, PLM has always been the System of Record for product information. In the coordinated manner, engineers were working in their own systems. At a certain moment in the process, they needed to publish shareable information: a document (e.g., PDF) or a BOM table (e.g., Excel). The PLM system would support New Product Introduction processes and Release and Change processes, and it would be the single point of reference for product data.

The reason I use the bin image is that companies, most of the time, do not have an advanced information-sharing policy. If the information is in the bin, the experts will find it. Others might recreate the same information elsewhere, due to a lack of awareness.

Most of the time, engineers did not like PLM systems because of the integrations with their tools. Suddenly, they were losing a lot of freedom due to check-in/check-out, naming conventions, attributes and more. Current PLM systems are good for a relatively stable product, but what happens when the product has a lot of parallel iterations (hardware & software, for example)? How to deal with Work In Progress?

Last week I visited the startup company PAL-V in the context of the Dutch PDM Platform. As you can see from the image, PAL-V is working on the world’s first Flying Car Production Model. Their challenge is to be certified for flying (here, the focus is on the design) and to be certified for driving (here, the focus is on manufacturing reliability/quality).

During the PDM platform session, they showed their current Windchill implementation, which focused on managing and providing evidence for certification. For this type of company, the System of Record is crucial.

Their (mainly) SolidWorks users are trained to work in a controlled environment. The Aerospace and Automotive industries have started this way, which we can see reflected in current PLM systems.

Image: Aras impression of the digital thread

And to finish with a PLM buzzword: modern systems of record provide a digital thread.

 

What is a System of Engagement?

The characteristic of a system of engagement is that it supports the user in real-time. This could be an environment for work in progress. Still, more importantly, all future concepts from MBSE, Industry 4.0 and Digital Twins rely on connected and real-time data.

As I previously mentioned, Digital Twins do not run on documents; they run on reliable data.

A system of engagement is an environment where different disciplines work together, using models and datasets. I described such an environment in my series The road to model-based and connected PLM. The System of Engagement environment must be user-friendly enough for these experts to work.

Due to the different targets of a system of engagement, I believe we have to talk about Systems of Engagement, as there will be several engagement models on a connected (federated) set of data.

Yousef Hooshmand shared the Daimler paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” in that context. Highly recommended to read if you are interested in a potential PLM future infrastructure.

Let’s look at two typical Systems of Engagement without going into depth.

The MBSE System of Engagement

In this environment, systems engineering is performed in a connected manner, building connected artifacts that should be available in real-time, allowing engineers to perform analysis and simulations to construct the optimal virtual solution before committing to physical solutions.

It is an iterative environment. Click on the image for an impression.

The MBSE space will also be the place where sustainability needs to start. Environmental impact – the planet as a stakeholder – should be added to the engineering process. Life Cycle Assessment (LCA), defining the process and material choices, will be fed by external data sources, for example, managed by ecoinvent, Higg and others to come. It is a new, emerging market.

The Digital Twin

In any phase of the product lifecycle, we can consider a digital twin, a virtual data-driven environment to analyze, define and optimize a product or a process. For example, we can have a digital twin for manufacturing, fulfilling the Industry 4.0 dreams.

We can have a digital twin for operation: analyzing, monitoring and optimizing a physical product in the field. These digital twins will only work if they use connected and federated data from multiple sources. Otherwise, the operating costs for such a digital twin will be too high (due to the inefficiency of gathering accurate data).

In the end, you would like to have these digital twins running in a connected manner. To visualize the high-level concept, I like Boeing’s diamond presented by Don Farr at the PDT conference in 2018 – Image below:

Combined with the Daimler paper “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” or the latest post from Oleg Shilovitsky, How PLM Can Build Ontologies?, we can start to imagine a Systems of Engagement infrastructure.

 

You need both

And now the unwanted message for companies – you need both: a System of Record and potentially one or more Systems of Engagement. A System of Record will remain as long as we are not all connected in a blockchain manner. So we will keep producing reports, certificates and baselines to share information with others.

It looks like the Gartner bimodal approach.

An example: If you manage your product requirements in your PLM system as connected objects to your product portfolio, you will and still can generate a product specification document to share with a supplier, a development partner or a certification company.

So do not throw away your current System of Record. Instead, imagine which types of Systems of Engagement your company needs. Most Systems of Engagement might look like a siloed solution; however, remember they are designed for the real-time collaboration of a certain community – designers, engineers, operators, etc.

The real challenge will be connecting them efficiently with your System of Record backbone, preferably using standard interface protocols and standards.

 

The Hybrid Approach

For those of you following my digital transformation story related to PLM, this is the point where the McKinsey report from 2017 becomes relevant again.

 

Conclusion

The concepts are evolving and maturing for a digital enterprise using a System of Record and one or more Systems of Engagement. Early adopters are now needed to demonstrate these concepts to agree on standards and solution-specific needs. It is time to experiment (fast). Where are you in this process of learning?


Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed to the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities made engineering teams happy by giving them more access to their data. But unfortunately, that did not work out, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is somehow a slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the different PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxim Gravel when discussing the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. And as a result of the data-driven approach, a digital PLM environment could be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM).

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management until service and, if relevant, decommissioning. The even bigger challenge is connecting ecosystems to the same infrastructure, in particular suppliers & partners, but also customers. This is the new platform paradigm.

Some years ago, people stated IoT is the new PLM  (IoT is the new PLM – PTC 2017). Or MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don't believe Digital Transformation points to a domain; it is more an ongoing process that most companies have to go through. And because it is so commonly used, it becomes too vague for the specifics of our domain. I liked Monica Schnitger's LinkedIn post: Digital Transformation? Let's talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a “digital transformed” company would look like. What would impact their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

I just finished recording a podcast with Nina Dar while writing this post. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a new future data-driven environment becomes like a System of Engagement. You can simulate assumptions and verify and qualify trade-offs in real-time in this environment. And not only product behavior, but you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are in real-time connected to the business. Still, PLM will have its system of record needs; however, the real value will come from the real-time collaboration.

The traditional PLM consultant should transform into a business consultant, understanding technology. Historically this was the opposite, creating friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single digit, where new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post, the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

Interesting to read is this article from Jan Bosch that I read this morning: Resistance to Change. Read the article as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in System Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think – should we leave the comfortable term PLM behind us for a new frame?

Yes, it is not a typo. Clayton Christensen's famous book, written in 1995, discussed the Innovator's Dilemma: when new technologies cause great firms to fail. This was the challenge two decades ago. Existing prominent companies could become obsolete quickly as they were bypassed by new technologies.

The examples are well known. To mention a few: DEC (Digital Equipment Corporation), Kodak, and Nokia.

Why the innovation dilemma?

This decade, the challenge has become different. All companies are forced to become more sustainable in the next ten years, either pushed by global regulations or by their customers' demands. Besides the priority of reducing greenhouse gas emissions, there is also the need to transform our society from a linear, continuous-growth economy into a circular, doughnut economy.

The circular economy makes the creation, the usage and the reuse of our products more complex as the challenge is to reduce the need for raw materials and avoid landfills.

The circular economy concept – the regular product lifecycle in the middle

The doughnut economy makes the values of an economy more complex, as it is not only about money and growth; human and environmental factors should also be considered.

Doughnut Economics: Trying to stay within the green boundaries

To manage this complexity, I wrote SYSTEMS THINKING – a must-have skill in the 21st century, focusing on the logical part of the brain. In my follow-up post, Systems Thinking: a second thought, I looked at the human challenge. Our brain is not rational and wants to think fast to solve direct threats. Therefore, we have to overcome our old brains to make progress.

An interesting and thought-provoking perspective was shared by Nina Dar in this discussion, with the video below. The 17 Sustainable Development Goals (SDGs) describe what needs to be done. However, we also need the Inner Development Goals (IDGs) and the human side to connect. Watch the movie:

Our society needs to change and innovate; however, we cannot. The Innovation Dilemma.

The future is data-driven and digital.

What is clear to me is that companies developing products and services have only one way to move forward: becoming data-driven and digital.

Why data-driven and digital?

Let's look at something companies might already practice: REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). This European regulation, introduced in 2007, aimed to protect human health and the environment by communicating information on chemicals up and down the supply chain. This would ensure that manufacturers, importers, and their customers are aware of information relating to the health and safety of the products supplied.

The regulation still suffers in execution, as most of the reporting and evaluation of chemicals is done manually. Suppliers report their chemicals in documents, and companies aggregate the chemicals in their summary reports. Then, finally, the authorities have to go through these reports.

Even where the scale of REACH is limited, the manual effort for end-to-end reporting is relatively high. In addition, skilled workers are needed to do the job because reporting is done in a document-based manner.
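The contrast with a data-driven approach is easy to sketch. When substance content is available as data per part, a REACH-style roll-up becomes a simple aggregation instead of reading supplier documents. The parts, substances and masses below are invented; the 0.1% weight-by-weight threshold reflects the typical REACH reporting limit for substances of very high concern:

```python
# Invented parts, substances and masses. The 0.1% w/w threshold reflects the
# typical REACH limit for reporting substances of very high concern (SVHC).
parts = [
    {"part": "P-1001", "substances_g": {"lead": 1.8, "nickel": 1.2}},
    {"part": "P-3001", "substances_g": {"lead": 0.4}},
]
PRODUCT_MASS_G = 1500.0
THRESHOLD = 0.001  # 0.1% weight by weight

totals: dict[str, float] = {}
for p in parts:
    for substance, grams in p["substances_g"].items():
        totals[substance] = totals.get(substance, 0.0) + grams

for substance, grams in totals.items():
    share = grams / PRODUCT_MASS_G
    if share > THRESHOLD:
        print(f"report {substance}: {share:.3%} w/w")  # -> report lead: 0.147% w/w
```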

Life Cycle Assessments (LCA)

Where you might think REACH is relatively simple, the real new challenge for companies is the need to perform Life Cycle Assessments (LCA) for their products. The Wikipedia definition of LCA says:

Life cycle assessment or LCA (also known as life cycle analysis) is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product’s manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave)

This will be a shift in the way companies need to define products. Much more thinking and analysis are required in the early design phases. Before committing to a physical solution, engineers and manufacturing engineers need to simulate and calculate the impact of their design decisions in the virtual world.

This is where the digital twin of the design and the digital twin of the manufacturing process become relevant. And remember: Digital Twins do not run on documents – you need connected data and various types of models to calculate and estimate the environmental impact.
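Schematically (a simplification, not the full ISO 14040 method), an LCA roll-up boils down to summing activity amounts multiplied by impact factors over all lifecycle stages:

```latex
\[
  I_{\text{total}} \;=\; \sum_{s \,\in\, \text{stages}} \; \sum_{i} a_{s,i} \cdot f_i
\]
```

Here a_(s,i) is the amount of material or activity i in lifecycle stage s (cradle, manufacture, distribution, use, grave), and f_i is its impact factor, typically taken from an LCA database. The formula is trivial; the hard part is getting trustworthy, connected values for every a_(s,i) – which is exactly why documents do not scale here.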

LCA done in a document-based manner will make your company too slow and expensive.

I described this needed transformation in my series from last year: The road to model-based and connected PLM – nine posts exploring the technology and concept of a model-based, data-driven PLM infrastructure.

Digital Product Passport (DPP)

The European Commission has published an action plan for the circular economy, one of the most important building blocks of the European Green Deal. One of the defined measures is the gradual introduction of a Digital Product Passport (DPP). As the quality of an LCA depends on trustworthy information about products and materials, the DPP aims to ensure that circular economy metrics become reliable.

This will be a long journey. If you want to catch a glimpse of the complexity, read this Medium article: The digital product passport and its technical implementation related to the DPP for batteries.
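Conceptually – and purely as an invented illustration, as the actual content is still being defined – a DPP is a structured, machine-readable record per product instance, for example:

```python
# Purely invented fields - the real (battery) DPP content is still being defined.
dpp_record = {
    "product_id": "BAT-2030-000123",          # unique identifier per item
    "manufacturer": "ExampleCell GmbH",
    "carbon_footprint_kg_co2e": 87.4,         # declared cradle-to-gate value
    "recycled_content_pct": {"cobalt": 16, "lithium": 6},
    "disassembly_instructions": "https://example.com/bat-123/disassembly",
    "svhc_above_threshold": [],               # links back to REACH-style data
}
```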

The innovation dilemma

Suppose you agree with my conclusion that companies need to change their current product or service development into a data-driven and model-based manner. In that case, the question will come up: where to start?

Becoming data-driven and model-based, of course, is not the business driver. However, this change is needed to be able to perform Life Cycle Assessments and comply with current and future regulations while remaining competitive.

A document-driven approach is a dead-end.

Now let’s look at the real dilemmas by comparing a startup (clean sheet / no legacy) and an existing enterprise (experience with the past/legacy). Is there a winning approach?

The Startup

Having lived in Israel – the nation where almost everyone is a startup – and having worked with startups in the past 10 years, I always get inspired by the energy of the people in startup companies. Most of the time, they have a unique value proposition, and they want to be visible on the market as soon as possible.

This approach is the opposite of systems thinking. It is often a very linear process to deliver this value proposition without exploring the side effects of such an approach.

For example, take the new “green” transportation hype. Many cities have been flooded with “green” scooters and electric bikes to promote transportation as a service. The idea behind this concept is that citizens no longer need to own polluting motorbikes or cars, and transportation means will be shared. Therefore, the city will be cleaner and greener.

However, these “green” vehicles are often designed in the traditional linear way. Is there a repair plan or a plan to recycle the batteries? Is there reuse of the materials used? Most of the time, not. Please, if you have examples contradicting my observations, let me know. I like to hear good news.

When startup companies start to scale, they need experts to help them grow the company. Often these experts are seasoned people, perhaps close to retirement. They will share their experience and what they know best from the past:  traditional linear thinking.

As a result, even though startup companies can start with a clean sheet, their focus on delivering the product or service blocks further thinking. Instead, the seasoned experts will drive the company towards ways of working they know from the past.

Out of curiosity: Do you know or work in a startup that has started with a data-driven and model-based vision from scratch?  Please add the name of this company in the comments, and let’s learn how they did it.

The Existing company

Working in an established company is like being on board a big tanker. Changing its direction takes a clear eye on the target and the navigation skills to get there. Unfortunately, most of the time, these changes take years, as it is impossible to switch the PLM infrastructure and the people's skills within a short time.

From the bimodal approach in 2015 to the hybrid approach for companies, inspired by this 2017 McKinsey article: Toward an integrated technology operating model, I discovered that this is probably the best approach to ensure a change will happen. In this approach – see image – the organization keeps running on its document-driven PLM infrastructure. This type of infrastructure becomes the system of record. Nothing different from what PLM currently is in most companies.

In parallel, you have to start with small groups of people who independently focus on a new product, a new service. Using the model-based approach, they work completely independently from the big enterprise in a data-driven approach. Their environment can be considered the future system of engagement.

The data-driven approach allows all disciplines to work in a connected, real-time manner. Mastering the new ways of working is usually the task of younger employees who are digital natives. These teams can be complemented by experienced workers who act as coaches. However, the coaches will not work in the new environment; they bring business knowledge to the team.

People cannot work in two modes, but organizations can. As you can see from the McKinsey chart, the digital teams will get bigger and more important for the core business over time. In parallel, when their data usage grows, more and more data integration will occur between the two operation modes. Therefore, the old PLM infrastructure can remain a System of Record and serve as a support backbone for the new systems of engagement.

The Innovation Dilemma conclusion

The upcoming ten years will push organizations to innovate their ways of working to become sustainable and competitive. As discussed before, they must learn to work in a data-driven, connected manner. Both startups and existing enterprises have challenges – they need to overcome the “thinking fast and acting slow” mindset. Do you see the change in your company?

 

Note: Before publishing this post, I read this interesting and complementary post from Jan Bosch: Boost your digitalization: instrumentation.

It is in the air – grab it.

 

In my previous post, “My PLM Bookshelf,” on LinkedIn, I shared some of the books that influenced my thinking related to PLM. As you can see in the LinkedIn comments, other people added their recommendations for PLM-related books to get inspired or more knowledgeable.

 

Whereas reading a book is a personal activity, I now want to share with you how to get educated in a more interactive manner related to PLM. In this post, I talk with Peter Bilello, President & CEO of CIMdata. If you haven’t heard about CIMdata and you are active in PLM, there is more to learn on their website HERE. Now let us focus on Education.

CIMdata

Peter, knowing CIMdata from its research, which is valuable for the whole PLM community, I am curious to learn what kind of training CIMdata typically provides to its customers.

Jos, throughout much of CIMdata’s existence, we have delivered educational content to the global PLM industry. With a core business tenet of knowledge transfer, we began offering a rich set of PLM-related tutorials at our North American and pan-European conferences starting in the early 1990s.

Since then, we have expanded our offering to include a comprehensive set of assessment-based certificate programs in a broader PLM sense, covering, for example, systems engineering and digital transformation-related topics. In total, we offer more than 30 half-day classes, all of which can be delivered in person as a custom configuration for a specific client or through public virtual-live or in-person classes. We have certified more than 1,000 PLM professionals since the introduction of this PLM Leadership offering in 2009.

Based on our experience, we recommend that an organization’s professional education strategy and plans address the organization’s specific processes and enabling technologies. This will help ensure that it drives the appropriate and consistent operations of its processes and technologies.

For that purpose, we expanded our consulting offering to include a comprehensive and strategic digital skills transformation framework. This framework provides an organization with a roadmap that can define the skills an organization’s employees need to possess to ensure a successful digital transformation.

In turn, this framework can be used as an efficient tool for the organization’s HR department to define its training and job progression programs that align with its overall transformation.

 

The success of training

We are both promoting the importance of education to our customers. Can you share with us an example where Education really made a difference? Can we talk about ROI in the context of training?

Jos, I fully agree. Over the years, we have learned that education and training are often minimized (i.e., sub-optimized). This is unfortunate and has usually led to failed or partially successful implementations.

In our view, both education and training are needed, along with strong organizational change management (OCM) and a quality assurance program during and after the implementation.

In our terms, education deals with the “WHY” and training with the “HOW”. Why do we need to change? Why do we need to do things differently? And then “HOW” to use new tools within the new processes.

We have seen far too many failed implementations where sub-optimized decisions were made due to a lack of understanding (i.e., a clear lack of education). We have also witnessed training and education being done too early or too late.

This leads to a reduced Return on Investment (ROI).

Therefore a well-defined skills transformation framework is critical for any company that wants to grow and thrive in the digital world. Finally, a skills transformation framework needs to be tied directly to an organization’s digital implementation roadmap and structure, state of the process, and technology maturity to maximize success.

 

Training for every size of the company?

When CIMdata conducts PLM training, is there a difference, for example, when working with a big global enterprise or a small and medium enterprise?

You might think the complexity might be similar; however, the amount of internal knowledge might differ. So how are you dealing with that?

We basically find that the amount of training/education required depends mostly on the implementation scope, meaning the scope of the proposed digital transformation and the current maturity level of the impacted user community.

It is important to measure the current maturity and establish appropriate metrics to measure the success of the training (e.g., are people, once trained, using the tools correctly?).

CIMdata has created a three-part PLM maturity model that allows an organization to understand its current PLM-related organizational, process, and technology maturity.

The three-part PLM maturity model

The PLM maturity model provides an important baseline for identifying and/or developing the appropriate courses for execution.

This also allows us, when we are supporting the definition of a digital skills transformation framework, to understand how the level of internal knowledge might differ within and between departments, sites, and disciplines. All of which help define an organization-specific action plan, no matter its size.

 

Where is CIMdata training different?

Most of the time, PLM implementers also offer training to their prospects or customers. So, where is CIMdata training different?

 

For this, it is important to differentiate between education and training. CIMdata provides education (the why), as well as training and education strategy development and planning.

We don’t provide training on how to use a specific software tool. We believe that is best left to the systems integrator or software provider.

While some implementation partners can develop training plans and educational strategies, they often fall short in helping an organization to effectively transform its user community. Here we believe training specialists are better suited.

 

Digital Transformation and PLM

One of my favorite topics is the impact of digitization in the area of product development. CIMdata introduced the Product Innovation Platform concept to differentiate it from traditional PDM/PLM. Who needs to get educated to understand such a transformation, and what does CIMdata contribute to this understanding?

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

It isn’t just about product development, sales, or enabling a new service experience. It is about maximizing a company’s ROI in applying and leveraging digital as needed throughout the organization. The only way an organization can do this successfully is by taking an end-to-end approach.

The Product Innovation Platform is focused on end-to-end product lifecycle management. Therefore, it must work within the context of other enterprise processes that are focused on the business’s resources (i.e., people, facilities, and finances) and on its transactions (e.g., purchasing, paying, and hiring).

As a result, an organization must understand the interdependencies among these domains. If they don’t, they will ultimately sub-optimize their investment. It is these and other important topics that CIMdata describes and communicates in its education offering.

The Product Innovation Platform in a digital enterprise

More than Education?

As a former teacher, I know that a one-time education, a good book or slide deck, is not enough to get educated. How does CIMdata provide a learning path or coaching path to their customers?

Jos, I fully agree. Sustainability of a change and/or improved way of working (i.e., long-term sustainability) is key to true and maximized ROI. Here I am referring to the sustainability of the transformation, which can take years.

With this, organizational change management (OCM) is required. OCM must be an integral part of a digital transformation program and be embedded into a program’s strategy, execution, and long-term usage. That means training, education, communication, and reward systems all have to be managed and executed on an ongoing basis.

For example, OCM must be executed alongside an organization’s digital skills transformation program. Our OCM services focus on strategic planning and execution support. We have found that most companies understand the importance of OCM but often don’t fully follow through on it.

 

A model-based future?

During the CIMdata Roadmap & PDT conferences, we have often discussed the importance of the Model-Based Systems Engineering methodology as a foundation of a model-based enterprise. What do you see? Is it only the big Aerospace and Defense companies that can afford this learning journey, or should other industries also invest? And if yes, how should they start?

Jos, here I need to step back for a minute. All companies have to deal with increasing complexity for their organization, supply chain, products, and more.

So, to optimize its business, an organization must understand and employ systems thinking and system optimization concepts. Unfortunately, most people think of MBSE as an engineering discipline. This is unfortunate because engineering is only one of the systems of systems that an organization needs to optimize across its end-to-end value streams.

The reality is that all companies can benefit from MBSE, as long as they consider optimization across their specific disciplines, in the context of their products and services and where they exist within their value chain.

MBSE is not just for Aerospace and Defense companies. Still, a lot can be learned from what has already been done there. Leading automotive companies, for example, are implementing and using MBSE to design and optimize semi- and highly-automated vehicles (i.e., systems of systems).

The starting point is understanding your systems of systems environment and where bottlenecks exist.

There should be no doubt: education is needed on MBSE and how MBSE supports the organization’s Model-Based Enterprise requirements.

Published work from the CIMdata-administered A&D PLM Action Group can be helpful, as can various MBE and systems engineering maturity models, such as the one CIMdata utilizes in its consulting work.

Want to learn more?

Thanks, Peter, for sharing your insights. Are there any specific links you want to provide to get educated on the topics discussed? Perhaps some books to read or conferences to visit?

Jos, as you already mentioned:

  • the CIMdata Roadmap & PDT conferences have provided a wealth of insight into this market for more than 25 years.
    [Jos: Search for my blog posts starting with the text: “The weekend after ….”]
  • In addition, there are several blogs, like yours, that are worth following, and websites, like CIMdata’s pages for education and other resources, which are filled with downloadable reading material.
  • Additionally, there are many user conferences from PLM solution providers and third-party conferences, such as those hosted by the MarketKey organization in the UK.

These conferences have taken place in Europe and North America for several years. Information exchange and formal training and education are offered at many of these events. Additionally, they provide an excellent opportunity for networking and professional collaboration.

What I learned

Talking with Peter made me aware of a few things again. First, it is important to differentiate between education and training. Where education is a continuous process, training is an activity that must take place at the right time. Unfortunately, we often mix those two terms and believe that people are educated after having attended a training course.

Secondly, investing in education is as crucial as investing in hard- or software. As Peter mentioned:

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

Systems Thinking is not just an engineering term; it will be a mandate for managing a company, a product and even a planet into the future.

Conclusion

This time a quote from Albert Einstein, supporting my PLM coaching intentions:

“Education is not the learning of facts
but the training of the mind to think.”

 

In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.

My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. Also this time, Matthias Ahrens (HELLA) again shared a relevant but very academic article in this context: how to harmonize company information.

For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.

The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).

This time I want to share my thoughts about the two statements from my introductory post, i.e.:

A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.

A model-based approach with connected datasets

We discussed connected datasets in the previous post; now, let’s explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model, to be more precise, the 3D CAD Model. However, there are many other types of models used related to product development, delivery and operations.

A model can be a:

Physical Model

  • A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model

Conceptual Model

  • A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
  • A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall in this category
  • A CGI (Computer Generated Imagery) or 3D CAD model is probably the most associated model in the mind of traditional PLM practitioners
  • Functional and Logical Models, describing the services and components of a system, are crucial in an MBSE approach

Operational Model

  • A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin’s training performance model is such an operational model.

The list of models above is neither exhaustive nor academically defined. Moreover, some model definitions might overlap; e.g., where would we classify software models or manufacturing models?

All models are a best-so-far approach to describing reality. Based on more accurate data from observations or measurements, the model comes closer to what happens in reality.

A model and its data

Never blame the model when there is a difference between what the model predicts and the observed reality. It is still a model.  That’s why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.

Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.

By using data analysis complementary to the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).

There are two points I would like to highlight for this paragraph:

  • A model is never 100% the same as reality – so don’t worry about deviations. There will always be a difference between the virtual prediction and the physical measurement – most of the time because reality has many more influencing parameters.
  • The more qualified data we use in the model, the closer we get to reality – so focus on accurate (and the right) data for your model. As it is most of the time impossible to fully model a system, focus on the most significant data sources.
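To make this feedback loop tangible, here is a minimal Python sketch. It is only an illustration of the principle, not a digital-twin implementation: the “model” is a simple line fit, and the “physical world” is a function with noise the model does not know about. Each cycle, the virtual prediction is compared with a physical measurement, and the measurement is fed back to improve the model.

```python
# Illustrative sketch only: a virtual model refined by physical measurements.
# The "model" is a simple polynomial fit; real models are far richer, but the
# loop - predict, measure, feed back, re-fit - is the same idea.
import numpy as np

rng = np.random.default_rng(42)

def physical_world(x):
    # Reality always has more influencing parameters than the model (here: noise).
    return 2.0 * x + 1.0 + rng.normal(0.0, 0.5, size=x.shape)

measured_x = np.array([0.0, 1.0, 2.0])
measured_y = physical_world(measured_x)

for cycle in range(3):
    # Fit the virtual model to all data observed so far.
    slope, intercept = np.polyfit(measured_x, measured_y, deg=1)
    predicted = slope * 3.0 + intercept
    observed = physical_world(np.array([3.0]))[0]
    print(f"cycle {cycle}: predicted {predicted:.2f}, observed {observed:.2f}")
    # Feedback loop: the new observation improves the next fit.
    measured_x = np.append(measured_x, 3.0)
    measured_y = np.append(measured_y, observed)
```

The prediction never matches the observation exactly – and that is fine; the point is that every feedback cycle brings the virtual model a little closer to reality.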

The ultimate goal: THE DIGITAL TWIN

The discussion related to data-driven approaches and the usage of models might feel abstract and complex (and it is). However, the term “digital twin” is well known and even used in board rooms.

The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.

My statement and reason for this series of blog posts: Digital Twins do not run on documents; you need a data-driven, model-based infrastructure to benefit efficiently from digital twin concepts.

Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model.
A puzzle every company has to solve, as there is no 100 percent blueprint at this time.

Are Low Code platforms the answer?

I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that by combining datasets from different systems and platforms, we can provide any user with the needed information in real-time. My statement from my introductory post was:

I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?

Let’s look into some of the low-code platform messages mentioned by Low-Code advocates:

You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)

This statement reminded me of the early days of SmarTeam implementations. With a Data Model Wizard, a Form Designer, and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management, and CAD integrations, combined with easy customizations.

The sky was the limit to satisfy end users. There was no need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.

A lot of my activities between 2003 and 2010 were related to fixing the problems created by this flexibility and making sense (again) of customizations. I wrote about this in a 2015 post, The importance of a (PLM) data model, sharing the experiences of “fixing” issues created by too much flexibility.

Think first

The challenge is that an enthusiastic team creates a (low code) solution rapidly. Immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job,  right?

Documentation and a broader visibility are often lacking when implementing such a solution.

For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information you consume is accurate. Perhaps the information was valid when you created the app.

However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
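One pragmatic safeguard – a sketch only, assuming nothing about your specific platform, with all field names hypothetical – is an explicit data contract that the consuming app validates on every read, instead of trusting the producing app forever:

```python
# Minimal sketch (all field names hypothetical): validate consumed product data
# against an explicit contract instead of assuming it stays valid forever.
REQUIRED_FIELDS = {"part_number": str, "revision": str, "mass_kg": float}

def validate_part(record: dict) -> list[str]:
    """Return the list of contract violations; an empty list means usable data."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

# A record that was valid when the app was built may fail after the data owner
# (or the data standard) changes - exactly the co-worker scenario above.
print(validate_part({"part_number": "AX-100", "revision": "B", "mass_kg": "1.2"}))
# -> ['mass_kg: expected float']
```

It does not solve governance, but it makes a broken assumption visible the moment it happens, instead of years later.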

Easy tools have often led to spaghetti, from Clipper (the old days) and Visual Basic (the less old days) to highly customizable systems (like Aras is promoting) and future low-code platforms (and Aras is there again).

However, the strength of being highly flexible is also a weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee a reliable anchoring of the solution.

The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:

There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the “shadow IT” phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don’t work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.

The fundamental difference: from coordinated to connected

For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.

Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead to the design of a modern digital enterprise, not the optimization and connectivity between organizational siloes.
Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.

Next, you need to analyze if these gaps are so significant that it requires a technology change. Probably it does, as historically, systems were not designed to share data horizontally in an organization.

In this context, have a look at Lionel Grealou’s s article for Engineering.com:
Data Readiness in the new age of digital collaboration.

Conclusion

We discussed the crucial relation between models and data. Models only have value if they acquire the right and accurate data (exercise 1).

Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple in these transformational times.

The next and final post in this series will focus on configuration management – a new approach is needed. I don’t have the answers, but I will share some thoughts.

A recommended event with an exciting agenda and a good place to validate and share your thoughts.

I will be there and look forward to meeting you at this conference (unfortunately still virtual).

In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise, drawing the conclusion that it is mission impossible to aim for a single connected environment. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value chain-oriented (OEM, Supplier, Marketplace, Supply Chain hub).

In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:

Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.

 

System of Record and System of Engagement

In the image below, a slide from 2016,  I show a simplified view when discussing the difference between the current, coordinated approach and the future, connected approach.  This picture might create the wrong impression that there are two different worlds – either you are document-driven, or you are data-driven.

In the follow-up of this presentation, I explained that companies need both environments in the future. The most efficient way of working for operations will be infrastructure on the right side, the platform-based approach using connected information.

For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers, disconnected regulatory bodies and probably crucial for configuration management.

The System of Record will probably remain as a capability in every platform or cross-section of platform information. The Systems of Engagement will be the configured real-time environment for anyone involved in active company processes – not only ERP or MES, but all execution.
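To illustrate the relation between the two sides, here is a minimal Python sketch – purely conceptual, with hypothetical data: the System of Engagement is the live, connected data, and a System of Record entry is an immutable, identifiable snapshot (a baseline) derived from it for disconnected parties.

```python
# Conceptual sketch (hypothetical data): deriving a System of Record baseline
# from connected, live datasets. The engagement side keeps changing in real
# time; the record side is a frozen snapshot for suppliers, regulators and CM.
import hashlib
import json
from datetime import datetime, timezone

live_datasets = {  # System of Engagement: connected, changing data
    "ebom": [{"part": "AX-100", "rev": "B", "qty": 2}],
    "requirements": [{"id": "REQ-12", "text": "Max mass 1.2 kg"}],
}

def freeze_baseline(datasets: dict) -> dict:
    """Create an immutable, identifiable snapshot of the connected data."""
    payload = json.dumps(datasets, sort_keys=True)
    return {
        "baseline_id": hashlib.sha256(payload.encode()).hexdigest()[:12],
        "frozen_at": datetime.now(timezone.utc).isoformat(),
        "content": json.loads(payload),
    }

baseline = freeze_baseline(live_datasets)
print(baseline["baseline_id"])  # the "document set" exchanged with disconnected parties
```

The snapshot is what a disconnected supplier or regulatory body receives; the connected datasets keep evolving behind it.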

Introducing SysLM and SLM

This summer, I received a copy of Martin Eigner’s System Lifecycle Management book, which I am currently reading in my spare moments. I have always enjoyed Martin’s presentations, and in many ways, we share similar ideas. Martin, from his profession, has spent more time on the academic aspects of the product and system lifecycle than I have. On the other hand, I have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and for sure, I will come back to it when I have finished.

A first impression: a great and interesting book for all. Martin and I share the same history of data management. Read all about this in his second chapter: Forty Years of Product Data Management.

From PDM via PLM to SysLM is a chapter that everyone should read if you haven’t lived through it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin’s book will be highly complementary, given the content he describes.

There is one point on which I am looking forward to feedback from the readers of this blog:

Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management (SysLM)?

In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company’s offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, would we use the term Solution Lifecycle Management as not only hardware and software might be part of the value proposition?

Martin consistently uses the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that neither abbreviation is unique or explicit enough. SysLM creates confusion with SysML (for dyslectic people or fast readers). SLM already has so many less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.

For the moment, I will use the abbreviation SLM, leaving it in the middle if it is System Lifecycle Management or Solution Lifecycle Management.

 

How to implement both approaches?

In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.

A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.

However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.

Software and services implementations are associated with a data-driven, model-based approach.

The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.

Depending on your company’s value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling disconnected hardware, there is probably no need to change your internal PLM processes that much.

However, when you are moving to a connected business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.

A McKinsey concept I have promoted several times illustrates a potential path – note that the article was written not with a PLM mindset but with a business mindset.

What about Configuration Management?

The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry to guarantee appropriate performance, reduced risk and a lower cost of fixing issues.

The challenge, however, is that configuration management processes are not designed to manage systems or solutions where dynamic updates can be applied, whether or not initiated by the customer.

This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).

For that reason, I am curious to learn more from Martijn Dullaart’s presentation at the upcoming PLM Roadmap/PDT conference. The title of his session: The next disruption, please …

In his abstract for this session, Martijn writes:

From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.

One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes extending the validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes for its hardware.

Next, the systems or solutions in the field will report (or validate) their configuration against validation rules. A topic that requires a long discussion and more than this blog post, potentially a full conference.
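To make the idea a little less abstract, here is a minimal Python sketch – all rules and version numbers are hypothetical, and real version handling would need more than naive string comparison: the hardware baseline is managed by classic CM, while the configuration reported from the field is checked against validation rules.

```python
# Conceptual sketch (hypothetical rules): instead of enumerating every validated
# configuration, classic CM covers the hardware baseline, and field-reported
# software/service configurations are validated against rules.
reported = {"hw_baseline": "HB-2021.2", "sw": "4.3.1", "map_service": "2.0"}

validation_rules = {
    # hardware baseline -> allowed software versions and minimum service level
    "HB-2021.2": {"sw_allowed": {"4.2.0", "4.3.1"}, "map_service_min": "2.0"},
}

def is_valid(config: dict) -> bool:
    rule = validation_rules.get(config["hw_baseline"])
    if rule is None:
        return False  # unknown hardware baseline: escalate to classic CM
    # Note: naive string comparison of versions, for illustration only.
    return (config["sw"] in rule["sw_allowed"]
            and config["map_service"] >= rule["map_service_min"])

print(is_valid(reported))  # True -> this field configuration is acceptable
```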

Therefore, I am looking forward to participating in the CIMdata/PDT Fall conference and picking up the discussions towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.

 

Conclusion

A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company’s value proposition.

If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.

Please, share your thoughts in the comments.

 

So far, I have been discussing PLM experiences and best practices that have changed due to introducing electronic drawings and affordable 3D CAD systems for the mainstream. From vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.

Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.

Now, thanks to digitalization, we can connect and format information in real-time. We can give every stakeholder in the company’s business almost real-time visibility of what is happening (if allowed). We have seen the benefits of platformization, where the advantages come from real-time connectivity within an ecosystem.

Apple, Amazon, Uber and Airbnb are the non-manufacturing-related examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), and with suppliers and customers – all connected through information managed in data elements instead of documents. I call it connected PLM.

Vendors have already shared their PowerPoints, movies, and demos of how the future would look in an ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.

The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.

The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.

In the upcoming posts, I will share with you what I see, what I deduct from that and what I think would be the next step to learn.

I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.

I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (avg. 1500 words) and the ideas and contributions coming from you and me.

What is platformization?

In our day-to-day life, we are more and more used to direct interaction between resellers and services providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.

We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.

The platform is responsible for direct communication between all stakeholders.

The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.

Five potential business platforms

The 5 potential platforms

The ERP platform
is mainly dedicated to the company’s execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect all stakeholders as much as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).

Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform.  A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.

The CRM platform

A platform that mainly focuses on customer-related activities; as you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.

The MES platform
In the past, we had PDM and ERP, and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important, as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.

For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.

The IoT platform

These platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.

However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (non-PLM/ERP-related vendors) are Hitachi and Microsoft.

The Product Innovation Platform (PIP)

All the above platforms would not have a reason to exist if there were no environment where products were invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio, and more.

The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all the previous platforms, the PIP cannot exist without other platforms and requires connectivity with the other platforms where applicable.

Look below at the CIMdata definition of a Product Innovation Platform.

You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavors): Aras, Dassault Systèmes, PTC and Siemens.

Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.

Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.

Standard interfaces or ad-hoc connections?

Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.

However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.

This would be a PIP/ERP platform collaboration scenario.

Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.

The company might decide to build a custom app or dashboard to support such a role, combining real-time data from the relevant platforms using standard interfaces (preferred) or APIs, web services, REST services, microservices (for specialists) and the currently fashionable Low-Code development platforms, which allow users to combine data services from different platforms without being coding experts – see the sketch below.
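As a concrete (and deliberately simplified) example, the Python sketch below shows the engineer scenario mentioned earlier: a small service combines change data from the PIP with stock data from the ERP platform over REST. All endpoints and field names are hypothetical; the point is the pattern – query each platform through its interface instead of replicating its data.

```python
# Conceptual sketch (all endpoints and fields hypothetical): a role-specific
# dashboard combining real-time data from two platforms via REST interfaces.
import requests  # third-party HTTP client (pip install requests)

PIP_URL = "https://pip.example.com/api"  # Product Innovation Platform
ERP_URL = "https://erp.example.com/api"  # ERP platform

def change_impact(change_id: str) -> dict:
    """For one engineering change, fetch affected parts (PIP) and stock (ERP)."""
    change = requests.get(f"{PIP_URL}/changes/{change_id}", timeout=10).json()
    impact = []
    for part in change["affected_parts"]:
        stock = requests.get(f"{ERP_URL}/stock/{part}", timeout=10).json()
        impact.append({"part": part, "on_hand": stock["on_hand"]})
    return {"change": change_id, "impact": impact}

# The engineer sees connected, near real-time information in one view,
# without data being copied from one platform into the other.
print(change_impact("ECO-1042"))
```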

Without going too much into technology, the topics in this paragraph require an enterprise architecture and vision. It is wishful thinking that your existing environment will evolve smoothly into a digital highway for the future by “fixing” demands per user. Your infrastructure is much more likely to end up congested like spaghetti.

In that context, I read an interesting post last week: Low code: A promising trend or Pandora’s box. Have a look and decide for yourself.

I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.

In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.

It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, while still primarily relying on drawings. It will surely take us a generation to switch from document-based engineering to model-based engineering.

Conclusion

In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.

In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.

Your opinion, Your contribution?

 