I am writing this post because one of my PLM peers recently asked me this question: “Is the BOM losing its position?” He was in a discussion with another colleague who told him:

“If you own the BOM, you own the Product Lifecycle”.

This statement made me think of a recent post from Jan Bosch: Product Development fallacy #8: the bill of materials has the highest priority.

Software is increasingly becoming an essential part of the final product, and Jan wrote this article drawing on his expertise in software development. I recommend reading the full post (4 min read) and then browsing through the comments.

If you cannot afford these 10 minutes, here is my favorite quote from the article:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

Where did the BOM focus come from? A historical overview related to the rise (and fall) of the BOM.

 

In the beginning, there was the drawing.

Before the era of computers, there was “THE drawing”, describing assemblies, subassemblies or parts. And on the drawing, you could find the parts list, if relevant. This parts list was the first Bill of Materials, describing the parts/materials shown on the drawing.

 

Next came MRP/ERP

The introduction of the MRP system (Material Requirements Planning) was the first step towards using computers to collect and process the material requirements for a product as data. Entering new materials/parts described on drawings was still a manual activity, as was referring to existing parts on a drawing. Reuse of parts was a manual process based on individual knowledge.
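To make this computational step concrete, here is a minimal sketch (in Python, with hypothetical part numbers and quantities) of what an early MRP run essentially does: exploding a BOM level by level to calculate the gross material requirements for a production order.

```python
# Minimal MRP-style BOM explosion - illustrative only, hypothetical data.
# Each entry: parent part -> list of (child part, quantity per parent).
bom = {
    "PUMP-001": [("HOUSING-010", 1), ("IMPELLER-020", 1), ("BOLT-M8", 6)],
    "HOUSING-010": [("CASTING-011", 1), ("BOLT-M8", 4)],
}

def gross_requirements(part, qty, requirements=None):
    """Recursively roll up the material requirements for one product."""
    if requirements is None:
        requirements = {}
    for child, qty_per in bom.get(part, []):
        requirements[child] = requirements.get(child, 0) + qty * qty_per
        gross_requirements(child, qty * qty_per, requirements)
    return requirements

# Requirements for an order of 50 pumps.
print(gross_requirements("PUMP-001", 50))
# {'HOUSING-010': 50, 'CASTING-011': 50, 'BOLT-M8': 500, 'IMPELLER-020': 50}
```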

In the nineties, MRP evolved into ERP (Enterprise Resource Planning), which included the MRP part and added resource and manufacturing planning and financial reporting.

The ERP system became the most significant IT system, the execution system of the company. As it was the first enterprise system implemented, it was also the first moment we learned about implementation challenges – resistance to change and budget overruns. However, as the ERP system brought visibility to the company’s execution, it became a “must-have” system for management.

The introduction of mainstream 2D CAD did not affect the company’s culture so much. Drawings became electronic drawings, and the methodology of the parts list on the drawing remained.

Sometimes the interaction with the MRP/ERP system was enhanced by an interface – sending the drawing BOM to ERP. The advantage of the interface: no manual transfer of data, reducing typos and BOM errors. The disadvantages at that time: relatively expensive (connectivity between systems was a challenge) and mostly one-directional.

 

And then there was PDM.

In parallel with the introduction of ERP systems, mainstream 3D CAD systems became affordable, particularly SolidWorks, Solid Edge and Inventor. These 3D CAD systems allowed sharing of parts and assemblies in different products, and the PDM database was the first aid to support part reuse, versioning and standardization.

By extracting the parts from the assemblies and subassemblies, it was possible to generate a BOM structure in the PDM system to be transferred or typed into the ERP system. We did not talk about EBOM or MBOM then, as there was only one BOM in the ERP system, and the PDM system was a tool to feed the ERP system.
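As an illustration of that extraction step, the small sketch below (hypothetical assembly structure, not any specific PDM system’s logic) walks a CAD-like assembly tree and produces the multi-level BOM report that could then be transferred or typed into ERP.

```python
# Illustrative sketch: derive a multi-level BOM report from a CAD-like
# assembly tree, as an early PDM system would, before sending it to ERP.
# Part numbers and structure are hypothetical.
assembly = {
    "id": "ASSY-100", "qty": 1, "children": [
        {"id": "SUB-110", "qty": 2, "children": [
            {"id": "PART-111", "qty": 4, "children": []},
        ]},
        {"id": "PART-120", "qty": 1, "children": []},
    ],
}

def bom_lines(node, level=0):
    """Yield (level, part id, quantity) rows for a BOM table."""
    yield level, node["id"], node["qty"]
    for child in node["children"]:
        yield from bom_lines(child, level + 1)

for level, part, qty in bom_lines(assembly):
    print(f"{'  ' * level}{part}  x{qty}")
```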

Many companies still base their processes on this approach. ERP (read SAP nowadays) is the central execution system, and PDM is an external system. You might remember the story and image from my previous post about people, processes and tools. The bad practice example: asking the ERP system to provide a part number when starting to design a part.

 

And then products started to change.

In the early 2000s, I worked with SmarTeam to define the E&E (Electronics and Electrical) template. One of the new concepts was to synchronize all design data coming from different disciplines to a single BOM structure.

It was the time we started to talk about the EBOM: a type of BOM, based on parts, serving as the structure to consolidate all the design data.

The EBOM, most of the time, reflects the design intent in logical groups, and sending the relevant parts in the correct order to the ERP system was a favorite (and expensive) customization for service providers. How do you transfer an engineering BOM view to an ERP system that only understands the manufacturing view?
Note: not all ERP systems have the data model to differentiate between engineering parts and manufacturing parts.

The image below illustrates the challenge and the customer’s perception.

The automated link between the design side (EBOM) and manufacturing side (MBOM) was a mission impossible – too many exceptions for the (spaghetti) code.

 

And then came the MBOM.

The identified issues connecting PDM and ERP led to the concept of implementing the MBOM in the PLM system. The MBOM in PLM is one of the characteristics of a PLM implementation compared to a PDM implementation. In a traditional PLM system, there is an interaction and connection between the EBOM and MBOM. EBOM parts should end up as MBOM parts. This interaction can be supported by automation; however, as both BOMs live in the same system, manual changes remain possible.
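To give an impression of the kind of automation that can support this EBOM-MBOM interaction, here is a deliberately simplified sketch with hypothetical part numbers – not any vendor’s actual logic: engineering items are mapped to their manufacturing counterparts, kits are split, and manufacturing-only items such as packaging are added.

```python
# Simplified, hypothetical EBOM -> MBOM derivation. Real PLM systems apply
# far richer rules (phantoms, alternates, plant-specific restructuring).
ebom = {
    "E-PUMP": [("E-HOUSING", 1), ("E-IMPELLER", 1), ("E-SEAL-KIT", 1)],
}

# Mapping of engineering items to manufacturing items (can be 1:1 or 1:n).
e2m = {
    "E-HOUSING": [("M-HOUSING-PAINTED", 1)],
    "E-IMPELLER": [("M-IMPELLER", 1)],
    "E-SEAL-KIT": [("M-SEAL", 2), ("M-GASKET", 1)],   # kit split into parts
}

def derive_mbom(ebom_parent):
    """Derive an MBOM node: map each EBOM line and add manufacturing-only items."""
    mbom_lines = []
    for e_item, e_qty in ebom[ebom_parent]:
        for m_item, m_qty in e2m[e_item]:
            mbom_lines.append((m_item, e_qty * m_qty))
    mbom_lines.append(("M-PACKAGING-BOX", 1))   # exists only in manufacturing
    return {"M-PUMP": mbom_lines}

print(derive_mbom("E-PUMP"))
```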

The MBOM structure in PLM could then be the information structure to transfer to the ERP system; however, there is more, as Jörg W. Fischer wrote in his provocative post Die MBOM muss weg (The MBOM must go). He rightly points out (in German) that the MBOM is not a structure on its own but a combination of different views based on Assembly Drawings, Process Planning and Material Requirements.

His conclusion:

Calling these structures MBOM is trying to squeeze all three structures into one. That usually doesn’t work and then leads to much more emotional discussions in the project. It also costs a lot of money. It is, therefore, better not to use the term MBOM at all.

And indeed, just having an MBOM in your PLM system might help you prepare some of the manufacturing steps and the needed resources and parts. The MBOM result still has to be localized for the plant where the manufacturing takes place. And here, the systems used are the ERP system and the MES system.

The main advantage of having the MBOM in the PLM system is the direct relation between specification and manufacturing intent, allowing manufacturing engineering to work collaboratively with engineering in the same environment.

  • The first benefit is fewer iterations and a shorter time to production, thanks to early interaction and manufacturing involvement in the engineering process.
  • The second benefit: product knowledge is centralized in a single system. Consolidating your Product Knowledge in ERP does not make sense due to global localization and the missing capabilities to manage iterative engineering processes on parts that do not yet exist.

 

And then came the SBOM, the xBOM

Traditional PLM vendors and implementations kept using xBOM structures as placeholders for related specification data (mechanical designs, electrical, software deliverables, serialized products). Most of the time, related files.

And with this approach, when talking about the digital thread, PLM systems also touch on the concepts of Configuration Management.

I will not go into the details here but look at the two images by clicking on them and see a similar mindset.

It is about the traceability of information in structures and systems. These structures work well in a relatively static and linear product development and delivery environment, as illustrated below:

Engineering change and release processes are based on managing the changes in different structures from the left to the right.

 

And then came software!

Modern connected products are no longer purely mechanical products. The product’s functionality no longer depends on the mechanical properties but mainly on the embedded electronics and software used. For example, look at the mechanical design of a telecom transmission tower – its behavior mostly comes from non-mechanical components, and they can change over time. Still, the Bill of Materials contains a lot of concrete and steel parts.

The ultimate example is comparing a Tesla (software on wheels) with a traditional car. For modern connected products, electronics and software need to be part of the solution. Software and electronics allow the product to be upgraded over time. Managing these products in the same manner as mechanical products is impossible or inefficient, and therefore threatens your company’s future business.

I requote Jan Bosch:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

 

The model-based, connected enterprise

I will not solve the puzzle of the future in this post. You can read my observations in my series: The road to model-based and connected PLM. We need a new infrastructure with at least two modes. One that still serves as a System of Record, storing information in a traditional manner, like a Bill of Materials for the static parts, as not everyone and everything can be connected.

In addition, we need various Systems of Engagement that enable close to real-time interaction between products (systems) and the relevant stakeholders for the engagement scope (multidisciplinary / consumers).

Digital twins are examples of such environments. Currently, these Systems of Engagement often work disconnected from the System of Record due to a lack of understanding of how to connect them (standard connectors? OSLC?).
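To make the idea of a connection tangible, below is a minimal sketch of what such a link could look like: a dataset in a System of Engagement pointing to its counterpart in the System of Record by URI, in the spirit of OSLC-style linked data. The endpoints and property names are invented for illustration; real OSLC works with RDF vocabularies over HTTP.

```python
# Minimal, hypothetical sketch of linking a System of Engagement dataset
# to a System of Record item by URI. All URLs and names are invented.
import json

link = {
    "source": "https://soe.example.com/models/battery-pack/42",      # hypothetical
    "target": "https://sor.example.com/items/PN-100234/revision/B",  # hypothetical
    "linkType": "elaborates",    # the engagement model elaborates the record
    "createdBy": "engineer@example.com",
}

# Persist the link so either side can navigate to the other without
# duplicating data - the essence of a connected (not copied) digital thread.
with open("traceability_links.json", "w") as f:
    json.dump([link], f, indent=2)
```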

Our mission is to explore, as I wrote in my post Time to split PLM and drop our mechanical mindset.

And while I was finalizing this post, I read a motivating post from Jan Bosch again, for all of you working on understanding and pushing the digital transformation in your ecosystem.
The title: Be the protagonist of your life: 15 rules. A starting point for more to come.

 

Conclusion

The BOM is no longer the master of the product lifecycle when it comes to managing connected products, where functionality mainly depends on software. BOM structures with related documents are just one of the extracted baselines from a data-driven, connected enterprise. This traditional PLM infrastructure requires other, non-BOM-driven structures to represent the actual status of a virtual or physical product.
The BOM is not dead, but there is more ………

Your thoughts?

Those who have read my blog posts over the years will have seen the image to the left.

The people, processes and tools slogan points to the best practice of implementing (PLM and CM) systems.

Theoretically, a PLM implementation will move smoothly if the company first agrees on the desired processes and people involved before a system implementation using the right tools.

Too often, companies start from their historical landscape (the tools – starting with a vendor selection) and then try to figure out the optimal usage of their systems. The best example of this approach is the interaction between PDM (PLM) and ERP.

 

PDM and ERP

Historically ERP was the first enterprise system that most companies implemented. For product development, there was the PDM system, an engineering tool, and for execution, there was the ERP system. Since ERP focuses on the company’s execution, the system became the management’s favorite.

The ERP system and its information were needed to run and control the company. Unfortunately, this approach introduced the idea that the ERP system should also be the source of part information, as it was often the first enterprise system for a company. The PDM system was often considered an engineering tool only. And when we talk about a PLM system, who really implements PLM as an enterprise system, or is it still an engineering tool?

This is an example of Tools, Processes, and People – A BAD PRACTICE.

Imagine an engineer who wants to introduce a new part needed for a product to be delivered. In many companies at the beginning of this century, even before starting the exercise, the engineer had to request a part number from the ERP system. This is implementation complexity #1.

Next, the engineer starts developing versions of the part based on the requirements. Ultimately the engineer might come to the conclusion this part will never be implemented. The reserved part number in ERP has been wasted – what to do?

It sounds weird, but this was a reality in discussions on this topic until ten years ago.

Next, as the ERP system could only deal with seven-digit part numbers, what about part number reuse? Reusing part numbers is a considerable risk, as it can lead to errors. With the introduction of PLM systems, there was the opportunity to bridge the gap between engineering and manufacturing. Now it is clear for most companies that the engineer should create the initial part number.

Only when the conceptual part is approved for use in the realization of the product is an exchange with the ERP system needed. Whether or not the same part number is used does not matter, as long as we can map both identifiers between these environments and maintain traceability.
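A minimal sketch of such a mapping, with hypothetical numbering schemes, could look like this – the point is not sharing one number, but being able to trace from one identifier to the other at release time.

```python
# Minimal sketch of identifier mapping between PLM and ERP, using
# hypothetical numbering schemes. The point is traceability, not
# forcing both systems to share one part number.
plm_to_erp = {}

def release_to_erp(plm_id, erp_id):
    """Record the mapping when a conceptual part is approved for realization."""
    plm_to_erp[plm_id] = erp_id

def trace(plm_id):
    """Return the ERP identifier for a released PLM part, if any."""
    return plm_to_erp.get(plm_id, "not yet released to ERP")

release_to_erp("PLM-CONCEPT-0815", "ERP-1002345")
print(trace("PLM-CONCEPT-0815"))   # ERP-1002345
print(trace("PLM-CONCEPT-0999"))   # not yet released to ERP
```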

It took almost 10 years from PDM to PLM until companies agreed on this approach, and I am curious about your company’s status.

Meanwhile, in the PLM world, we have evolved on this topic. The part and the BOM are no longer simple entities. Instead, we often differentiate between EBOM and MBOM, and the parts in those BOMs are not necessarily the same.

In this context, I like Prof. Dr. Jörg W. Fischer‘s framing:
EBOM is the specification, and MBOM is the realization.
(Unfortunately, he writes a lot in German.)

An interesting discussion initiated by Jörg last week was again about the interaction between PLM and ERP. The article is an excellent example of how mainstream enterprises are potentially thinking. PLM = Siemens, ERP = SAP – an illustration of the “tools first” mindset before the ideal process is defined.

There was nothing wrong with that in the early days, as connectivity between different systems was difficult and expensive. Therefore, people with 20 years of experience might still rely on their systems infrastructure instead of the data flow.

But enough about the bad practice – let’s move on to people, processes, (data) and tools.

People, Processes, Data and Tools?

I got inspired by this topic, seeing this post two weeks ago from Juha Korpela, claiming:

Okay, so maybe a hot take, maybe not, but: the old “People, Process, Technology” trinity is one of the most harmful thinking patterns you can have. It leaves out a key element: Data.

His full post was quite focused on data, and I liked the “wrapping post” from Dr. Nicolas Figay here, putting things more in perspective from his point of view. The reply made me think about how this discussion fits into the PLM digital transformation discussion. How would it work in the two major themes I use to explain the digital transformation in the PLM landscape?

For incidental readers of my blog, these are the two major themes I am using:

  1. From Coordinated to Connected, based on the famous diagram from Marc Halpern (image below). The coordinated approach based on documents (files) requires a particular timing (processes) and context (Bills of Information) – it is the traditional and current PLM approach for most companies. On the other hand, the Connected approach is based on connected datasets (here, we talk about data – not files). These connected datasets are available in different contexts, in real-time, to be used by all kinds of applications, particularly modeling applications. Read about it in the series: The road to model-based and connected PLM.
  2. The need to split PLM, thinking in System(s) of Record and Systems of Engagement (example below). The idea behind this split is driven by the observation that companies need various Systems of Record for configuration management, change management, compliance and realization. These activities sound like traditional PLM targets and could still be done in these systems. New in the discussion is the System of Engagement, which focuses on a specific value stream in a digitally connected manner. Here, data is essential. I discussed the coexistence of these two approaches in my post Time to Split PLM – a post on LinkedIn with many discussions and reshares, illustrating that the topic is hot. And I am happy to discuss “split PLM architectures” with all of you.

These two concepts discuss the processes and the tools, but what about the people? Here I came to the conclusion that, to complete the story, we have to imagine three kinds of people. And this will not be new. We have the creators of data, the controllers of data and the consumers of data. Let’s zoom in on their specifics.

 

A new representation?

I am looking for a new simplification of the people, processes, and tools trinity combined with data. I got inspired by the work Don Farr did at Boeing, where he worked on a new visual representation for the model-based enterprise. You might have seen the image on the left before – click on it to see it in detail.

I first wrote about this new representation in my post: The weekend after CIMdata Roadmap / PDT Europe 2018.

Related to Configuration Management, Martijn Dullaart and Martin Haket have also worked on a diagram with their peers to depict the scope of CM and Impact Analysis. The image leads to the post with my favorite quote: Communication is merely an exchange of information, but connections tell the story.

Below I share my first attempt to combine the people, processes and tools trinity with the concepts of document and data, system(s) of record and system(s) of engagement – trying to build the story. See if you recognize the aspects of the discussion above, and feel free to suggest enhancements.

I look forward to your suggestions. Like the understanding that we have to split PLM thinking, as it impacts how we look at implementations.

Conclusion

Digital transformation in the PLM domain is forcing us to think differently. There will still be processes based on people collecting, interpreting and combining information. However, there will also be a new domain of connected data interpreted by models and algorithms, not necessarily depending on processes.

Therefore we need to work on new representations that can be used to tell this combined story. What do you think? How can we improve?

 

In this post, I want to explain why Model-Based Systems Engineering (MBSE) and Sustainability are closely connected. I would claim sustainability in our PLM domain will depend on MBSE.

Can we achieve Sustainability without MBSE? Yes, but it will be costly and slow. And as all businesses want to be efficient and agile, they should consider MBSE.

 

What is MBSE?

The abbreviation MBSE stands for Model-Based Systems Engineering, a specialized way of performing Systems Engineering. In short, the Wikipedia definition:

MBSE is a technical approach to systems engineering that focuses on creating and exploiting domain models as the primary means of information exchange rather than on document-based information exchange.

Model-Based fits in the digital transformation scope of PLM – from a document-based approach to a data-driven, model-based one. In 2018, I focused on facets of the model-based enterprise and related to MBSE in this post: Model-Based: System Engineering (MBSE).

My conclusion in that post was:

Model-Based Systems Engineering might have been considered as a discipline for the automotive and aerospace industry only. As products become more and more complex, thanks to IoT-based applications and software, companies should consider evaluating the value of model-based systems engineering for their products/systems.

I drew this conclusion before I focused on sustainability and systems thinking. Implementing sustainability concepts, like the Circular Economy, requires more complex engineering efforts, justifying a Model-Based Systems Engineering approach. Let’s have a look.

If you want to learn more about why we need MBSE, look at this excellent keynote lecture from Zhang Xin Guo at the INCOSE 2018 conference below:

The Mission / the stakeholders

A company might deliver products to the market with the best price/quality ratio and regulatory compliance, perceived and checked by the market. This approach focuses purely on economic parameters.

There is no need for a systems engineering approach as the complexity is manageable. The mission is more linear, a “job to do,” and a limited number of stakeholders are involved in this process.

… with sustainability

Once we start to include sustainability in our product’s mission, we need a systems engineering approach, as several factors will push for different considerations. The most obvious considerations are the choice of materials and the optimization of the production process (reducing carbon emissions).

However, the repairability/serviceability of the product should be considered with a more extended lifetime vision.

What about upgradeability and reusing components? Will the customer pay for these extra sustainable benefits?

Probably Yes, when your customer has a long-term vision, as the overall lifecycle costs of the product will be lower.

Probably No if your competitors deliver non-sustainable products that are much cheaper.

As long as regulations do not hurt traditional business models, there might be no significant change.

However, the change has already started. Higher energy prices will impact the production of specific resources and raise costs. In addition, energy-intensive manufacturing processes will lead to more expensive materials. Combined with rising carbon taxes, this will be a significant driver for companies to reconsider their product offering and manufacturing processes.

The more expensive it becomes to create new products, the more attractive repairable and upgradable products will become. And this brings us to the concept of the circular economy, which is one of the pillars of sustainability.

In short, looking at the diagram – the vertical flow from renewables and finite materials from part to product to product in service leads ultimately to wasted resources if there are no feedback loops. This is the traditional product delivery process that most companies are using.

You can click on the image to the left to zoom in on the details.

The renewable loop on the left side of the diagram is the usage of renewables during production and the use of the product. The more we use renewables instead of fossil fuels, the more sustainable this loop will be. This is the area where engineers should use simulations to find the optimal manufacturing processes and product behavior. Again click on the image to zoom in on the details.

The right side of the loop, related to the materials, is where we see the options for repairable, serviceable and upgradeable products, and even further, refurbishment and recycling to avoid leakage of precious materials. This is where mechanical engineers should dominate the activities, focusing on each of the loops and how to enable them in the product. Click on the image to see the relevant loops.

Looking at the circular economy diagram, it is clear that we are no longer talking about a linear process – it has become the implementation of a system. Systems Engineering or MBSE?

 

The benefits of MBSE

Developing products with the circular economy in mind is no longer a “job to do,” a simple linear exercise. Instead, if we walk down the systems engineering V-shape, there are a lot of modeling exercises to perform before we reach the final solution.

To illustrate the benefits of MBSE, let’s walk through the following scenario.

A well-known company sells lighting projects for stadiums and public infrastructure. Their current business model is based on reliable lighting equipment with a competitive price and range of products.

Most of the time, their contracts have clauses about performance/cost and maintenance. The company sells the products when they win the deal and deliver spare parts when needed.

Their current product design is quite linear – without systems engineering.

Now this company has decided to change its business model towards Product as a Service, or in their terminology LaaS (Lighting as a Service). For a certain amount per month, they will provide lighting to their customers – a stadium, a city, or a road infrastructure.

To implement this business model, this is how they used a Model-Based Systems Engineering approach.

Modeling the Mission

Example of a business model

Before even delivering any products, the process starts with describing and analyzing the business model needed for Lighting as a Service.

Then, with modeled estimates of the material costs, there are exercises about the resources required to maintain the service, the potential market, and the possible price range.

It is the first step of using a model to define the mission of the service. After that, the model can be updated, adjusted, and used for a better go-to-market approach when the solution becomes more mature.

Part of the business modeling is also the intention to deliver serviceable and upgradeable products. As the company now owns the entire lifecycle, this is the cheapest way to guarantee a continuous or improved service over time.

Modeling the Functions

Example of a function diagram

Providing Lighting as a Service also means you must be in touch with your installations in real time. Power consumption needs to be measured and analyzed in real-time for (predictive) maintenance, and the light-providing service should be as cheap as possible during operation.

Therefore, LED technology is the most reliable choice, and connectivity functions need to be implemented in the solution. The functional design ensures installation, maintenance and service can be done in a connected manner (cheapest in operation – beneficial for the business).
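As a simple illustration of what “measured and analyzed in real time” could mean for this service, here is a toy sketch (hypothetical luminaires, thresholds and readings) that flags fixtures for predictive maintenance when their power consumption drifts above the expected value.

```python
# Illustrative sketch only: flag luminaires for (predictive) maintenance
# when their measured power consumption drifts above the expected value.
# Thresholds and data are hypothetical.
EXPECTED_WATTS = 120.0
DRIFT_LIMIT = 0.15     # 15% above nominal suggests degradation

measurements = {
    "LUM-STADIUM-017": [125.0, 140.2, 145.5, 148.1],
    "LUM-ROAD-204": [118.7, 119.3, 120.1, 119.8],
}

def needs_service(samples, expected=EXPECTED_WATTS, limit=DRIFT_LIMIT):
    """True if the recent average consumption exceeds the drift limit."""
    recent = sum(samples[-3:]) / len(samples[-3:])
    return (recent - expected) / expected > limit

for luminaire, samples in measurements.items():
    print(luminaire, "-> schedule service" if needs_service(samples) else "-> OK")
```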

Modeling the Logical components

As an owner of the solution, the design of the logical components of the lighting solution is also crucial. How to address various lighting demands efficiently? Modularity is one of the first topics to address. With modular components, it is possible to build customer-specific solutions with a reduced engineering effort. However, the work needs to be done by generically designing the solutions and focusing on the interfaces.

Example of a logical diagram

Such a design starts with logical process and flow diagrams combined with behavior modeling. Without already having a physical definition, we can analyze the components’ behavior within an electrical scheme. Decisions on whether specific scenarios will be covered by hardware or software can be analyzed here. The company can define the lower-level requirements for the physical components by using virtual trade-offs on the logical models.
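A virtual trade-off on the logical model can be as simple as a weighted scoring of the alternatives. The sketch below is purely illustrative – criteria, weights and scores are hypothetical – comparing a hardware versus a software implementation of a dimming scenario.

```python
# Toy trade-off sketch on the logical model: compare covering a dimming
# scenario in hardware vs software. Criteria, weights and scores are
# hypothetical - the point is making the trade-off explicit and repeatable.
criteria = {"unit_cost": 0.4, "upgradeability": 0.3, "energy_use": 0.3}

options = {
    "hardware_dimmer": {"unit_cost": 6, "upgradeability": 3, "energy_use": 8},
    "software_dimming": {"unit_cost": 9, "upgradeability": 9, "energy_use": 7},
}

def weighted_score(scores):
    """Sum of criterion scores weighted by their importance."""
    return sum(criteria[c] * scores[c] for c in criteria)

for name, scores in options.items():
    print(f"{name}: {weighted_score(scores):.1f}")
# The higher-scoring option drives the lower-level requirements
# for the physical components.
```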

At this stage, we have used business modeling, functional modeling and logical modeling to understand our solution’s behavior.

Modeling the Physical product

The final stage of the solution design is to implement the logical components into a physical solution. The placement of components and interfaces between the components becomes essential. For the physical design, there are still a lot of sustainability requirements to verify:

  • Repairability and serviceability – are the components reachable and replaceable? This reduces the lifecycle costs of the solution.
  • Upgradeability – are there components that can behave differently due to software choices, or components that can be replaced with improved functionality? This reduces the cost of creating entirely new solutions.
  • Reuse & recyclability – are the materials used in the solution recyclable or reusable, reducing the cost of new materials or the cost of dumping waste?
  • RoHS/ REACH compliance

The image below from Zhang Xin Guo’s presentation nicely demonstrates the iterative steps before reaching a physical product.

Before committing to a hardware implementation, the virtual product can be analyzed, its behavior can be simulated, and its carbon impact can be calculated for the various potential variants.

The manufacturing process and energy usage during operation are also part of the carbon impact calculation. The best-performing virtual solution, including its simulation models, can be chosen for the realization to ensure the most environmentally friendly solution.
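A toy example of such a carbon impact comparison between two virtual variants, combining embodied and operational emissions over the service life – all numbers are hypothetical assumptions, not real LCA data:

```python
# Hypothetical numbers, illustrative only: compare the carbon impact of two
# virtual variants by combining manufacturing (embodied) and operational
# emissions over the expected service life.
SERVICE_LIFE_YEARS = 10
HOURS_PER_YEAR = 4000              # assumed operating hours for the installation
GRID_KG_CO2_PER_KWH = 0.4          # assumed grid emission factor

variants = {
    "variant_A": {"embodied_kg_co2": 850, "power_w": 130},
    "variant_B": {"embodied_kg_co2": 990, "power_w": 105},  # heavier build, efficient LEDs
}

def lifecycle_co2(v):
    """Embodied emissions plus operational emissions over the service life."""
    operational_kwh = v["power_w"] / 1000 * HOURS_PER_YEAR * SERVICE_LIFE_YEARS
    return v["embodied_kg_co2"] + operational_kwh * GRID_KG_CO2_PER_KWH

for name, v in variants.items():
    print(f"{name}: {lifecycle_co2(v):.0f} kg CO2 over {SERVICE_LIFE_YEARS} years")
```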

 

The digital twin for follow-up

Once the solution has been realized, the company still has a virtual model of the solution. By connecting the physical product’s observed and measured behavior, the virtual side’s modeling can be improved or used to identify improvement candidates – maintenance or upgrades. At this stage, the virtual twin is the actual twin of the physical solution. Without going deeper into the digital twin at this stage, I hope you also realize MBSE is a starting point for implementing digital twins serving sustainability outcomes.

The image below, published by Boeing, illustrates the power of the connected virtual and physical world and the various types of modeling that help to assess the optimal solution.

Conclusion

For sustainability, it all starts with the design. The design decisions for the product contribute about 80% to the carbon footprint of the solution. Afterward, optimization is possible only within smaller margins. MBSE is the recommended approach to get a trustworthy understanding and follow-up of the product’s environmental impact.

What do you think – can we create sustainable products without MBSE?

 

This year started for me with a discussion related to federated PLM. A topic that I highlighted as one of the imminent trends of 2022. A topic relevant for PLM consultants and implementers. If you are working in a company struggling with PLM, this topic might be hard to introduce in your company.

Before going into the discussion’s topics and arguments, let’s first describe the historical context.

 

The traditional PLM frame.

Historically, PLM was first framed as a system for engineering to manage their product data. So you could call it PDM first. After that, PLM systems were introduced and used to provide access to product data, upstream and downstream. The most common usage was the relation with manufacturing, leading to EBOM and MBOM discussions.

The traditional ENOVIA PLM backbone

IT landscape simplification often led to an infrastructure of siloed solutions – PLM, ERP, CRM and later, MES. IT was driving the standardization of systems and defining interfaces between systems. System capabilities were leading, not the flow of information.

As many companies are still in this stage, I would call it PLM 1.0

PLM 1.0 systems serve mainly as a System of Record for the organization, where disciplines consolidate their data in a given context, the Bills of Information. The Bill of Information is then the place to connect specification documents, i.e., CAD models, drawings and other documents, providing a Digital Thread.

Aras – Bills of Information creating the Digital Thread
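As a simple illustration, a Bill of Information record can be thought of as a part revision anchoring its specification documents – the sketch below uses hypothetical identifiers and is not tied to any specific PLM system.

```python
# Minimal sketch of a "Bill of Information" record in a System of Record:
# a part revision acting as the anchor that connects its specification
# documents, providing a simple digital thread. Identifiers are hypothetical.
bill_of_information = {
    "part": "PN-100234",
    "revision": "B",
    "status": "Released",
    "documents": [
        {"type": "3D CAD model", "id": "CAD-100234-B.sldprt"},
        {"type": "Drawing", "id": "DRW-100234-B.pdf"},
        {"type": "Test report", "id": "TR-2022-118.pdf"},
    ],
}

def thread(record):
    """Trace from the part revision to every connected specification document."""
    for doc in record["documents"]:
        print(f"{record['part']} rev {record['revision']} -> {doc['type']}: {doc['id']}")

thread(bill_of_information)
```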

The actual engineering work is done with specialized tools, MCAD/ECAD, CAE, Simulation, Planning tools and more. Therefore, each person could work in their discipline-specific environment and synchronize their data to the PLM system in a coordinated manner.

However, this interaction is not easy for some of the end-users. For example, the usability of CAD integrations with the PLM system is constantly debated.

Many of my implementation discussions with customers were in this context. For example, suppose your products are relatively simple, or your company is relatively small. In that case, the opinion is that the System of Record approach is overkill.

That’s why many small and medium enterprises do not see the value of a PLM backbone.

This could be true till recently. However, the threats to this approach are digitization and regulations.

Customers, partners, and regulators all expect more accurate and fast responses on specific issues, preferably instantly. In addition, sustainability regulations might push your company to implement a System of Record.

 

PLM as a business strategy

For the past fifteen years, we have discussed PLM more as a business strategy implemented with business systems and an infrastructure designed for sharing. I choose these words carefully to avoid overloading the expression: PLM as a business strategy.

The reason for this prudence is that, in reality, I have seen many PLM implementations fail due to the ambiguity of PLM as a system or strategy. Many enterprises have previously selected a preferred PLM Vendor solution as a starting point for their “PLM strategy”.

One of the most neglected best practices.

In reality, this means there was no strategy but a hope that, with this impressive set of product demos, the company would find a way to support its business needs. Instead of people, process and then tools to implement the strategy, most of the time it started with the tools, trying to implement the processes and transform the people. That is not really the definition of business transformation.

In my opinion, this is happening because, at the management level, decisions are made based on financials.

Developing a PLM-related business strategy requires management understanding and involvement at all levels of the organization.

This is often not the case; the middle management has to solve the connection between the strategy and the execution. By design, however, the middle management will not restructure the organization. By design, they will collect the inputs from the end users.

And it is clear what end users want – no disruption in their comfortable way of working.

Halfway conclusion:

Rebranding PLM as a business strategy has not really changed the way companies work. PLM systems remain a System of Record mainly for governance and traceability.

To understand the situation in your company, look at who is responsible for PLM.

  • If IT is responsible, then most likely, PLM is not considered a business strategy but more an infrastructure.
  • If engineering is responsible for PLM, then you are still in the early days of PLM – engineering tools to be consulted by others upstream or downstream.

Only when PLM accountability sits at the upper management level might it be a business strategy (assuming the upper management understands the details).

 

Connected is the game changer

Connecting all stakeholders in an engagement has been a game changer in the world. With the introduction of platforms and the smartphone as a connected device, consumers could suddenly benefit from direct responses to desired service requests (Spotify, iTunes, Uber, Amazon, Airbnb, Booking, Netflix, …).

The business change: connecting all stakeholders in real time to deliver rapid results.

The question was: what would be the game changer in PLM? The image below shows the 2014 Accenture description of digital PLM and its potential benefits.

 

Is connected PLM a utopia?

Marc Halpern from Gartner shared in 2015 the slide below that you might have seen many times before. Digital Transformation is really moving from a coordinated to a connected technology, it seems.

The image below gives an impression of an evolution.

I had been following this concept until I was triggered by a 2017 McKinsey publication: “our insights/toward an integrated technology operating model”.

This was the first notion for me that the future should be hybrid: a combination of traditional PLM (System of Record) complemented with teams that work digitally connected. McKinsey called them pods that become product-centric (a multidisciplinary team focusing on a product) instead of discipline-centric (marketing/engineering/manufacturing/service).

In 2019 I wrote the post: The PLM migration dilemma supporting the “shocking” conclusion “Don’t think about migration when moving to data-driven, connected ways of working. You need both environments.”

One of the main arguments behind this conclusion was that legacy product data and processes were not designed to ensure data accuracy and quality on such a level that it could become connected data. As a result, converting documents into reliable datasets would be a costly, impossible exercise with no real ROI.

The second argument was that the outside world, customers, regulatory bodies and other non-connected stakeholders still need documents as standardized deliverables.

The conclusion led to the image below.

Systems of Record (left) and Systems of Engagement (right)

 

Splitting PLM?

In 2021 these thoughts became more mature through various publications and players in the PLM domain.

We saw the rise of Systems of Engagement – I discussed OpenBOM, Colab and potentially Configit in the post: A new PLM paradigm. These systems can be characterized as connected solutions across the enterprise and value chain, focusing on a platform experience for the stakeholders.

These are all environments addressing the needs of a specific group of users as efficiently and as friendly as possible.

A System of Engagement will not fit naturally in a traditional PLM backbone, the System of Record.

Erik Herzog from SAAB Aerospace and Yousef Houshmand, at that time with Daimler, published papers that year related to “Federated PLM” or “The end of monolithic PLM”. They acknowledged that a company needs to focus on more than a single PLM solution. The presentation from Erik Herzog at the PLM Roadmap/PDT conference was interesting because Erik talked about the Systems of Engagement and the Systems of Record. He proposed using OSLC as the standard to connect these two types of PLM.

It was a clear example of an attempt to combine the two kinds of PLM.

And here comes my question: Do we need to split PLM?

When I look at PLM implementations in the field, almost all are implemented as a System of Record, an information backbone provided by a single PLM vendor. The various disciplines deliver their content through interfaces to the backbone (the Coordinated approach).

However, there is low usability or support for multidisciplinary collaboration; the PLM backbone is not designed for that.

Due to the concepts of Model-Based Systems Engineering (MBSE) and Model-Based Definition (MBD), there are now solutions on the market that allow different disciplines to work jointly on connected datasets that can be manipulated using modeling software (1D, 2D, 3D, 4D, …).

These environments, often a mix of software and hardware tools, are the Systems of Engagement and provide speedy results with high quality in the virtual world. Digital Twins run on Systems of Engagement, not on Systems of Record.

Systems of Engagement do not need to come from the same vendor, as they serve different purposes. But how do you explain this to your management, who wants simplicity? I can imagine the IT organization has a better understanding of this concept, as at the end of 2015, Gartner introduced the concept of the bimodal approach.

Their definition:

Mode 1 is optimized for areas that are more well-understood. It focuses on exploiting what is known. This includes renovating the legacy environment so it is fit for a digital world. Mode 2 is exploratory, potentially experimenting to solve new problems. Mode 2 is optimized for areas of uncertainty. Mode 2 often works on initiatives that begin with a hypothesis that is tested and adapted during a process involving short iterations.

No Conclusion – but a question this time:

At the management level, unfortunately, there is most of the time still a “Single PLM” mindset due to a lack of understanding of the business. Clearly, splitting your PLM seems the way forward. IT could be ready for this, but will the business realize this opportunity?

What are your thoughts?

 

Happy New Year to all of you, and may this year be a year of progress in understanding and addressing the challenges ahead of us.

To help us focus, I selected three major domains I will explore further this year. These domains are connected – of course – as nothing is isolated in a world of Systems Thinking. Also, I wrote about these domains in the past because, as usual, nothing happens out of the blue.

Meanwhile, there are a lot of discussions related to Artificial Intelligence (AI), in particular ChatGPT (OpenAI). But can AI provide the answers? I believe not, as AI is mainly about explicit knowledge, the knowledge you can define by (learning) algorithms.

Expert knowledge, often called Tacit knowledge, is the knowledge of the expert, combining information from different domains into innovative solutions.

I started my company, TacIT, in 1999 because I thought (and still think) that Tacit knowledge is the holy grail for companies.

Let’s see with OpenAI how far we get …

 

Digitization of the PLM domain

The PLM domain is suffering from its legacy data (documents), legacy processes (linear – mechanical focus) and legacy people (siloed). The statement is a generalization.

More details can be found in my blog series: The road to model-based and connected PLM.

So why should companies move to a model-based and connected approach for their PLM infrastructure?

There are several reasons why companies may want to move to a model-based and connected approach for their Product Lifecycle Management (PLM) infrastructure:

  • Increased efficiency: A model-based approach allows for creating a digital twin of the product, which can be used to simulate and test various design scenarios, reducing the need for physical prototypes and testing. This can lead to faster and more efficient product development.
  • Improved collaboration: A connected PLM infrastructure allows for better collaboration between different teams and departments, as all product-related information is stored in a central location and can be accessed by authorized personnel. This can improve communication and decision-making within the organization.
  • Enhanced visibility: A model-based PLM system provides a single source of truth for all product-related data, giving management a clear and comprehensive view of the product development process. This can help identify bottlenecks and areas for improvement.
  • Reduced risk: By keeping all product-related information in a centralized location, the risk of data loss or inconsistencies is reduced. This can help ensure that the product is developed in accordance with regulatory requirements and company standards.
  • Increased competitiveness: A model-based and connected PLM infrastructure can help companies bring new products to market faster and with fewer errors, giving them a competitive advantage in their industry.

The text in italics was created by ChatGPT. After three learning cycles, this was the best answer I got. What we are missing in this answer is the innovative and transformative part that modern PLM can bring. Where are the concepts of different ways of working and new business models, both drivers for digitalization in many businesses?

Expert knowledge related to Federated PLM (or killing the PLM monolith) is a topic you will not find through AI. This is, for me, the most interesting part to explore.

We see the need but lack a common understanding of the HOW.

Algorithms will not innovate; for that, you need Tacit intelligence & Curiosity instead of Artificial Intelligence. More exploration of Federated PLM this year.

 

PLM and Sustainability

Last year as part of the PLM Global Green Alliance, we spoke with six different PLM solution providers to understand their sustainability goals, targets, and planned support for Sustainability. All of them confirmed Sustainability has become an important issue for their customers in 2022. Sustainability is on everyone’s agenda.

Why is PLM important for Sustainability?

PLM is important for Sustainability because a PLM helps organizations manage the entire lifecycle of a product, from its conception and design to its manufacture, distribution, use, and disposal. PLM can be important for Sustainability because it can help organizations make more informed decisions about the environmental impacts of their products and take steps to minimize those impacts throughout the product’s lifecycle.

For example, using PLM, an organization can consider the environmental impacts of the materials that are used in a product, the energy consumption of the manufacturing process, the product’s end-of-life disposal, and other factors that may affect its overall Sustainability. By considering these factors early in the design process, organizations can make more sustainable choices that reduce the environmental impact of their products over their lifecycle.

In addition, PLM can help organizations track and measure the Sustainability of their products over time, allowing them to continuously improve and optimize their products for Sustainability. This can be particularly important for organizations that are looking to meet regulatory requirements or consumer demand for more sustainable products.

Again not a wrong answer, but there is no mention of supporting the concepts of the circular economy, the potential of using digital twins at the various lifecycle stages of the products to perform trade-off studies in the virtual world, or of analyzing the product behavior from the physical world in the virtual world for optimization.

This is (again), for me, the most interesting part to explore.

We see the need but lack the sense of urgency on the NOW.

More on Sustainability this year.

 

PLM Education at all levels

Historically, PLM was framed as an “engineering thing” or a system needed for R&D/Engineering. Modern PLM is much broader and, unfortunately, harder to grasp. I support the CIMdata definition:

CIMdata defines PLM as:

  • A strategic business approach that applies a consistent set of business solutions that support the collaborative creation, management, dissemination, and use of product definition information

  • Supporting the extended enterprise (customers, design and supply partners, etc.)

  • Spanning from concept to end of life of a product or plant

  • Integrating people, processes, business systems, and information

And even here, I would say this definition is challenging as the scope is extensive and could touch a lot of activities inside an enterprise. As I wrote last year in my series of posts related to PLM and Education, there is:

Strategy education (what are the trends/potential of digital transformation in the business domain related to the company’s products)

Methodology/ Academic Education (independent from tools, what are the best practices, where they reside, potential lessons learned and pitfalls to avoid)

Organisational Change Management (the Education and support needed to enable a business transformation – modern PLM is a transformation enabler)

By asking ChatGPT for its opinion, we learn the following:

Product Lifecycle Management (PLM) education refers to training and development programs that focus on teaching individuals how to effectively manage the lifecycle of a product within an organization. This can include a wide range of topics, such as product design, development, and engineering; supply chain management; quality control and assurance; and product retirement and disposal.

Some specific types of PLM education that a company may need to provide to its employees include:

  1. Introduction to PLM: This type of training is designed to provide a general overview of PLM concepts and practices and may be suitable for employees who are new to the field or who have limited experience with PLM.
  2. PLM software training: Many companies use specialized PLM software to manage their product lifecycle processes. Employees who will be using this software will need training on how to use it effectively.
  3. Industry-specific PLM training: Depending on the industry in which a company operates, there may be specific PLM practices and regulations that employees need to be familiar with. For example, a company in the aerospace industry may need to provide training on PLM practices that are specific to the aerospace industry.
  4. Leadership and management training: For employees who are responsible for leading and managing PLM projects or teams, additional training in leadership and project management may be necessary.

Overall, the types of PLM education that a company needs will depend on the specific needs and goals of the organization, as well as the roles and responsibilities of its employees.

In general, we see from the answer that strategy and organizational change management beyond the tool are not considered.

This is precisely the area where a PLM Expert can help.

We see the need for Education, but we lack the willingness to invest in it.

 

Conclusion

It was an exciting exercise to combine my blogging thoughts with the answers from OpenAI. I am impressed by the given answers, knowing that the topics discussed about PLM are not obvious. On the other hand, I am not worried that AI will take over the job of the PLM consultant. As I mentioned before, the difference between Explicit Knowledge and Tacit Knowledge is clear, and business transformations will largely depend on the usage of Tacit knowledge.

I am curious about your experiences and will follow the topics mentioned in this post and write about them with great interest.

 

 

 

We are happy to close the year with the first round of the PLM Global Green Alliance’s (PGGA) series: PLM and Sustainability.

We interviewed PLM-related software vendors in this series, discussing their sustainability mission and offering.

We talked with SAP, Autodesk, Dassault Systèmes, Sustaira and Aras and now with PTC. It was an exciting discussion, looking back at their Lifecycle Analysis (LCA) history and ending with a cliffhanger about what’s coming next year.

PTC

The discussion was with Dave Duncan, VP Sustainability at PTC, focusing on industrial Sustainability as well as PTC’s internal footprint reduction programs, joined by James Norman, who globally leads PTC’s Community of Practice for PLM and Design-for-Sustainability.

Interesting to notice in this discussion: listen to the introduction of Dave and James and their history with Sustainability, long before it became a buzzword, and then notice how long it takes until digital thread and digital twin are mentioned – enjoy the 38 minutes of interaction below.


Slides shown during the interview combined with additional company information can be found HERE.

 

What we have learned

  • It was interesting to learn that just before the financial crisis in 2008, PTC invested (together with James Norman) in lifecycle analysis. Unfortunately, a focus on restoring the economy silenced this activity until (as Dave Duncan says) a little more than six months ago, when Sustainability moved into almost the top 3 of every company’s agenda.
  • Regulation and financial reporting are the current drivers for companies to act related to Sustainability.
  • The digital thread combined with the notion of relying on data quality are transformational aspects.
  • Another transformational aspect is connecting sustainability as an integrated part of product development instead of a separate marketing discipline.
  • Early next year, we will learn more about the realization of the PTC Digital Twin.

Want to learn more

Here are some links to the topics discussed in our meeting:

 

Conclusions

It was great to conclude this year with PTC. I hope this series, “The PLM Global Green Alliance meets …”, has given readers following it a good first impression of where PLM-related vendors are heading regarding their support for a sustainable future.

We touched base with them, the leaders, and the experts in their organizations. We discussed the need for data-driven infrastructures, the relation with the circular economy and compliance.

Next year we plan to follow up with them, now looking more into the customer experiences, tools, and methodology used.

 

 

 

 

This week there was an interesting discussion on LinkedIn initiated by Alex Bruskin from Senticore Technologies. I have known Alex for over 20 years, starting from the SmarTeam days and later through encounters in the PLM space. Alex is a real techie on the outside but also a person with a very creative mind to connect technology to business.

You can see his LinkedIn featured posts here to get an impression.

 

Where is PLM @ Startups?

This time Alex shared an observation from an event organized by the Pittsburgh Robotics Network, where he spoke with several startups.

His point, and I quote Alex:

Then, I spoke to a number of presenters there, explaining Senticore capabilities and listening to their situation around engineering/ manufacturing.

– many startups offered an add-on to other platforms => an autonomous module for UAV/helicopter/Vehicle. Some offered robotic components or entire robots (robot-dog).

– all startups use #solidworks , and none use #catia or #nx

– none of them have a PLM system nor an MES. I am 90% certain none of them have ERP, either. They all are apparently using #excel for all these purposes.

– only a handful of them are considering getting a PLM system in the near future.

Read the full post here and the comments below to get a broader insight into the topic.

 

The PLM Doctor knows it all.

The point reminded me of an episode I did together with Helena Gutierrez from Share PLM last year. She asked the same question to the PLM Doctor.

Do you think PLM is only for big corporations or can startups also benefit from it?

You can see the conversation here:

 

Meanwhile, the PLM Doctor is unemployed due to the lack of incoming questions.

When looking at startups, I could see two paths. One is the traditional path based on historical mechanical PLM; the second is a (potential) approach based on understanding the future complexity of the startup offering.

 

There are two paths – path #1

The first evolutionary path, which you might have seen a few times before in my blog posts, is the one depicted by Marc Halpern from Gartner in 2015. At that time, we started discussing Product Innovation Platforms and the new generation of PLM. You can see Marc’s slide below, which is still valid for most situations.

In the slide above, you see the startup company on the left side.

Often the main purpose of a startup company is to be visible on the market with their concept as fast as possible. Startups are often driven by a small group of multifunctional people developing a solution. In this approach, there is no place for people and reflection on processes as they are considered overhead.

Only when you target your solution at a strongly regulated environment, e.g., medical devices or aerospace, do you need to focus on the process too.

Therefore it is logical that most startup companies focus on the tools to develop their solution. A logical path, as what could you do without tools? Next, the choice of the tools will be, most of the time, driven by the team’s experience and available skills in the market.

Again statistics show it is not likely that advanced tools like NX or CATIA will be chosen for the design part. More likely mid-market products like SolidWorks or Autodesk products. And for data management and reporting, the logical tools are the office tools, Excel, Word and Visio.

And don’t forget PowerPoint to sell the solution.

The role of investors is often also here to question investments that are not clearly understood or relevant at that time.

How a startup scales up very much depends on the choices they make for a Repeatable business. This is the moment a company starts to create its legacy. Processes and best practices need to be established, which is why you often see seasoned people joining the company at this stage. These people have proven their skills in the past, and most likely, they are willing to repeat this.

And here comes the risk – experienced people come with a much better holistic overview of the product lifecycle aspects. They know what critical steps are needed to move the company to an Integrated business. These experiences are crucial; however, they should not become the new single standard.

Implementing the past is not a guarantee for success in a digital and connected future.

Implementing their past experience would mean focusing too much on creating a System of Record (PLM 1.0), which is crucial for configuration management, change management and compliance. However, it would also create a productivity dip for those developing the product or solution.

This is the same dilemma that very small and medium enterprises face. They function reasonably well in a Repeatable business. How much should they invest in an Integrated or Collaborating business approach?

Following the evolution path described by Marc Halpern always brings you to the point where technology changes from Coordinated to Connected. This is a challenging and immature topic, which I have discussed in my blog posts and during conferences.

See: The Challenges of a connected ecosystem for PLM or this full series of posts:  The road to model-based and connected PLM.

 

There are two paths – path #2

Another path that startups could follow is a more forward-looking path, understanding that you need a coordinated and connected approach in the long term. For the fastest execution, you would like to work in a multidisciplinary mode in real time, exactly the characteristic of a startup.

However, in path #2, the startup should have a longer-term vision. Instead of choosing the obvious tools, they should focus on their company’s most important value streams. They have the opportunity to select integrated domains that are based on a connected, often model-based approach. Some examples of these integrated domains:

  • An MBSE environment focusing on real-time interaction related to product architecture and solution components (RFLP)
  • A connected product design environment, where a virtual product can be created, analyzed and optimized in real time – connected software might be relevant here.
  • A connected product realization environment where product engineering and suppliers work together in real time.

All three examples are typical Systems of Engagement. The big difference with individual tools is that they already focus on multidisciplinary collaboration based on a data-driven, model-based approach.

In addition, having these systems in place allows the startup company to invest separately in a System of Record(s) environment when scaling up. This could be a traditional PLM system combined with a Configuration Management System or an Asset Management System.

System of Record choices, of course, depend on the industry needs and the usage of the product in the field. We should not consider one system that serves all; it is an infrastructure.

In the image below, you see the concept of this approach described by Erik Herzog from SAAB Aeronautics during the recent PLM Roadmap / PDT Europe conference. You can read more details of this approach in this post: The Week after PLM Roadmap PDT Europe.

Note: SAAB is not a startup; therefore, they must deal with their legacy. They are now working on sustainable business concepts for the future: heterogeneous and federated PLM.

My opinion: The heterogeneous and federated approach is the ultimate target for any enterprise. I already mentioned the importance of connected environments regarding digital twins and sustainability. Material properties, process environmental impacts and product behavior coming from the field will only work efficiently if dealt with in a connected and federated manner.

 

Conclusion

The challenge for startups is that they often start without the knowledge and experience that multidisciplinary collaboration within a value stream is crucial for a connected future. This is a topic that I would like to explore further with startups and peers in my ecosystem. What do you think? What are your questions? Join the conversation.

 

 

In the last few weeks, I thought I had writer's block, as I usually write about PLM-related topics close to my engagements.
Where are the always popular discussions related to EBOM or MBOM? Where is the Form-Fit-Function discussion or the traditional "meaningful numbers" discussion?

These topics always create a lot of interaction and discussion, as many of us have mature opinions.

However, last month I spent most of the time discussing the connection between digital PLM strategies and sustainability. The Russian invasion of Ukraine, leading to high energy prices, combined with several climate disasters this year, has made people aware that 2022 is not a year as usual. It is a year full of events that force us to rethink our current ways of living.

The notion of urgency

Sustainability for the planet and its people currently has all the focus. COP27 gives you the impression that governments are really serious. Are they? Read this post from Kimberley R. Miner, Climate Scientist at NASA, Polar Explorer & Professor.

She doubts whether we really grasp the urgency needed to address climate change. Or are we just playing to be on stage? I share her doubts.

So what to do with my favorite EBOM-MBOM discussions?

Last week I attended an event organized by Dassault Systèmes in the Netherlands for their Dutch/Belgian customers.

The title of the event was: Sustainable innovation for a digital future. I expected a techy event. Click on the image to see the details.

When I asked my grandson, who had just started his Aerospace Engineering studies in Delft (NL) and is learning to work with CAD and PLM tools, to join me, he replied:

“Too many software demos”

It turned out that my grandson was wrong. The keynote speech from Ruud Veltenaar made most of the audience feel uncomfortable. He really pointed to the fact that we are aware of climate change and our impact on the planet, but in a way, we are paralyzed. Nothing new, but confronting and unexpected when going to a customer event.

Ruud’s message: Accept that we are at the end of an existing world order, and we should prepare for a new world order with the right moral leadership. It starts within yourself. Reflect on who you really are, where you are in your life path, and finally, what you want.

It sounds simple, and I can see it helps to step aside and reflect on these points.

Otherwise, you might feel we are in a rat race, as shown below (recommended to watch).

The keynote was the foundation for a day of group and panel discussions on sustainability, learning about the customers' sustainability plans and experiences.

It showed that Dassault Systèmes, with its purpose defined in 2012 (click on the link to see its history), Harmonizing Products, Nature and Life, is ahead of the curve (at least they were for me).

The event was energizing, and my grandson was wrong:
“No software – next time?”

 

The impact of legacies – data, processes & people

For those who haven’t read my previous post, The week after PLM Roadmap / PDT Europe 2022, I wrote about the importance of Heterogeneous and federated PLM, one of the discussions related to data-driven PLM.

Looking back, I have been writing about data-driven PLM since 2014, and few companies have made progress here. Understandable, first of all, due to legacy data, which is not in the right format or quality to support data-driven processes.

However, also here, legacy processes and legacy people are blocking the change. There is no blame here; it is difficult to change. You might have a visionary management team, but then it comes down to the execution of the strategy. The organizational structure and the existing skills of the people create more resistance than progress.

For that reason, I wrote this post in 2015: PLM and Global Warming, where I compared the progress we made within our PLM community with the lack of progress we are making in solving global warming. We know the problem, but we are unable to act because we do not feel the urgency.

This blog post triggered Rich McFall to start the PLM Global Green Alliance together with me in 2018.

 

In my PLM Roadmap / PDT Europe session, Sustainability and Data-driven PLM – the perfect storm, I raised the awareness that we need to speed up. According to scientists, we have 10, perhaps 15, years to implement radical changes before we reach irreversible tipping points.

 

Why PLM and Sustainability?

Sustainability starts with the business strategy. How does your company want to contribute to a more sustainable future? The strategy to follow with probably the most impact is the concept of a circular economy – image below and more info here.

The idea behind the circular economy is to minimize the need for new finite materials (the right side) and to use only renewables for energy delivery. Implementing these principles clearly requires a more holistic design of products and services. Each loop should be analyzed and considered when delivering solutions to the market.

Therefore, a logical outcome of the circular economy would be transforming from selling products to the market towards a product-as-a-service model. In this case, the product manufacturer becomes responsible for the full product lifecycle and its environmental impact.

And here comes the importance of PLM. You can measure and tune your environmental impact during production in your ERP or MES environment. However, 80% of the environmental impact is defined during the design phase, the domain of PLM. All these analyses together are called Life Cycle Analysis or Life Cycle Assessment (LCA), a practice that starts at the moment you begin to think about a product or solution – a specialized systems thinking approach.
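To make this tangible, below is a minimal sketch in Python of how such an LCA roll-up over a BOM could look. All part names, materials and impact factors are hypothetical placeholders, not real LCA data; the point is only that the design-phase choices (material and mass per part) drive most of the totals.

```python
# Minimal LCA roll-up sketch - all parts, materials and impact factors are
# hypothetical placeholders, not real LCA data.

# Impact factors in kg CO2e per kg of material, per lifecycle phase.
IMPACT_FACTORS = {
    "aluminium": {"raw_material": 8.2, "manufacturing": 1.1, "end_of_life": -4.0},
    "steel":     {"raw_material": 1.9, "manufacturing": 0.6, "end_of_life": -0.9},
}

# A flattened (hypothetical) EBOM: part name, material choice and mass.
ebom = [
    {"part": "bracket", "material": "aluminium", "mass_kg": 0.4},
    {"part": "housing", "material": "steel",     "mass_kg": 2.1},
]

def lca_rollup(bom):
    """Sum the impact per lifecycle phase (kg CO2e) over all BOM lines."""
    totals = {}
    for line in bom:
        for phase, factor in IMPACT_FACTORS[line["material"]].items():
            totals[phase] = totals.get(phase, 0.0) + factor * line["mass_kg"]
    return totals

totals = lca_rollup(ebom)
for phase, value in totals.items():
    print(f"{phase:>13}: {value:6.2f} kg CO2e")
print(f"{'total':>13}: {sum(totals.values()):6.2f} kg CO2e")
```

Changing a material or a mass in the EBOM immediately changes the totals, which is exactly why the design phase carries so much of the environmental impact.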

So how to define and select the right options for future products?

 

Virtual products / Digital Twins

This is where sustainability is pushing for digitization of the product lifecycle. Building and analyzing products in the virtual world is much cheaper than working with physical prototypes.

A model-based approach allows companies to deal efficiently with trade-off studies for each solution.

In addition, the choice and the behavior of materials also have an impact. These material properties will come from various databases, some based on hazardous substances, others on environmental parameters. Connecting these databases to the virtual model is crucial to remain efficient.

Imagine you need to manually collect and process these properties whenever studying an alternative. The manual process will be too costly (fewer trade-offs and not finding the optimum) and too slow (time-to-market impact).
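As a sketch of the alternative, assume the material properties can be queried programmatically; here a simple in-memory dictionary stands in for a connected material database, and all values are illustrative. Every design alternative can then be scored automatically instead of collecting the values by hand.

```python
# Hypothetical material data standing in for a connected property database;
# the values are illustrative, not real material data.
MATERIAL_DB = {
    "aluminium":   {"density_kg_m3": 2700, "co2e_per_kg": 8.2, "cost_per_kg": 2.5},
    "steel":       {"density_kg_m3": 7850, "co2e_per_kg": 1.9, "cost_per_kg": 0.8},
    "bio_polymer": {"density_kg_m3": 1240, "co2e_per_kg": 3.1, "cost_per_kg": 4.0},
}

PART_VOLUME_M3 = 0.0005  # volume of the part variant under study (hypothetical)

def score(material, weight_co2=0.7, weight_cost=0.3):
    """Lower is better: weighted sum of CO2e and cost for this part volume."""
    props = MATERIAL_DB[material]
    mass = props["density_kg_m3"] * PART_VOLUME_M3
    return weight_co2 * mass * props["co2e_per_kg"] + weight_cost * mass * props["cost_per_kg"]

# Rank every candidate automatically - no manual collection per trade-off study.
for material in sorted(MATERIAL_DB, key=score):
    print(f"{material:>12}: score {score(material):7.2f}")
```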

That’s why I am greatly interested in all the developments related to a federated PLM infrastructure. A monolithic system cannot be the solution for such a model-based environment. In my terminology, here we need an architecture with systems of engagement combined with system(s) of record.

I will publish more on this topic in the future.

In the previous paragraphs, I wrote about the virtual product environment, which some companies call the virtual twin. However, besides the virtual twin, we also need several digital twins. These digital models allow us to monitor and optimize the production process, which can lead to design changes.

Also, monitoring the product in operation using a digital twin allows us to optimize the performance and execution of the solutions in the field.

The feedback from these digital twins will then help the company to improve the design and calibrate their simulation models. It should be a closed loop. You can find a more recent discussion related to the above image here.
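A minimal sketch of what closing that loop means in practice: field measurements coming back from a digital twin are used to re-calibrate a simulation parameter. The model, the parameter and the numbers are invented for illustration only.

```python
# Closed-loop sketch: calibrate one simulation parameter against field data
# coming back from a digital twin. Model, parameter and numbers are invented.
field_measurements = [(10.0, 21.3), (20.0, 41.1), (30.0, 59.8)]  # (load, observed response)

def simulate(load, stiffness):
    """Toy simulation model: response proportional to the applied load."""
    return stiffness * load

def calibrate(measurements, candidates):
    """Pick the candidate stiffness minimizing the squared error vs the field data."""
    def error(stiffness):
        return sum((simulate(load, stiffness) - observed) ** 2
                   for load, observed in measurements)
    return min(candidates, key=error)

best = calibrate(field_measurements, candidates=[1.8, 1.9, 2.0, 2.1, 2.2])
print(f"Calibrated stiffness: {best}")  # the updated value feeds the next design iteration
```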

 

Our mission

At this moment, sustainability is at the top of my personal agenda and, I hope, of many of yours. However, besides the choices we can make in our personal lives, there is also an area where we, as PLM-interested parties, should contribute: the digitization of the product lifecycle as an enabler for a sustainable business.

Without mature concepts for a connected enterprise, implementing sustainable products and business processes will be a wish, not a strategy. So add digitization to your skillset and use it in the context of sustainability.

Conclusion

It might look like this PLM blog has become an environmental blog. That might be right, as the environmental impact of products and solutions is directly related to product lifecycle management. However, do not worry. In the coming period, I will focus on the aspects and experiences of a connected enterprise. I will leave the easier discussions (EBOM/MBOM/FFF/Smart Numbers) from a coordinated enterprise as they are. There is work to do soon. Your thoughts?

 

 

 

 

 

 

 

I hope you all remained curious after last week's report from day 1 of the PLM Roadmap / PDT Europe 2022 conference in Gothenburg. The networking dinner after day 1 and the Share PLM after-party allowed us to discuss and compare our businesses. Now the highlights of day 2.

 

The Power of Curiosity

We started with a keynote speech from Stefaan van Hooydonk, Founder of the Global Curiosity Institute. It was a well-received opener of the day and an interesting theme concerning PLM.

According to Stefaan, in the previous century, curiosity had a negative connotation. "Curiosity killed the cat" is one of those expressions confirming that mindset. It was all about conformity to the majority and the company, and curiosity was non-conformant.

I would say we have the same mindset with traditional PLM: we all have to work the same way with the same processes.

In the 21st century, modern enterprises stimulate curiosity as we understand that throughout history, curiosity has been the engine of individual, organizational, and societal progress. And in particular, in modern, unpredictable times, curiosity becomes important, for the world, the others around us and ourselves.

As Stefaan describes in his book, the Curiosity Manifesto, organizations and individuals can develop curiosity. Stefaan pushed us to reflect on our personal curiosity behavior.

  • Are we really interested in the person or the topic we do not know or do not like?
  • Are we avoiding curious steps out of fear? Fear of failing, of being judged?

After Stefaan’s curiosity storm, you could see that the audience was inspired to apply it to themselves and their PLM mission(s).

I hope the latter – as here there is a lot to discover.


 

Digital Transformation – Time to roll up your sleeves

In his presentation, Torbjörn Holm, co-founder of Eurostep, addressed one of the bigger elephants in the modern enterprise: how to deal with data?

Thanks to digitization, companies are gathering and storing data, and there seem to be no limits. However, data centers compete with households for electricity from the grid.

Torbjörn also introduced the term "dark data" – the dirty secret of the ICT sector. We store too much data; some research mentions that only 12% of the data stored is critical, and the rest clogs up file servers. Storing unstructured and unused data generates millions of tons of greenhouse gases yearly.

It is time for a data cleanup day, and inspired by Torbjörn’s story, I have already started to clean up my cloud storage. However, I did not touch my backup hard disks as they do not use energy when switched off.

Further, Torbjörn elaborated that companies need to have end-to-end data policies. Which data is required? And in the case of contracted work or suppliers, which data is crucial?

Ultimately, companies that want to benefit from a virtual twin of their asset in operation need processes in place to acquire the correct data and keep it valid. Digital twins do not run on documents; as mentioned in some of my blog posts, they need accurate data.

Torbjörn once more reminded us that the PLCS standard is designed for exactly that purpose.


 

Heterogeneous and federated PLM – is it feasible?

One of the sessions that had most of my attention upfront was the presentation from Erik Herzog, Technical Fellow at Saab Aeronautics, and Jad El-Khoury, Researcher at the KTH Royal Institute of Technology.

Their presentation was closely related to the pre-conference workshop organized by Erik and Eurostep. More about this topic in the future.

Saab, Eurostep and KTH conducted a research project named Helipe to analyze and test a federated PLM architecture. The concept was strongly driven by engineering. The idea is shown in the images below.

First, there are the four main modular engineering environments; in the image, we see the mechanical, electrical, software and engineering environments. The target is to keep these environments as standard as possible towards the outside world so that, later, an environment could be swapped for a better one. Inside an environment, automation should provide optimal performance for the users.

In my terminology, these environments serve as systems of engagement.

The second dimension of this architecture is the traceability layer(s) – the requirements management layer, the configuration item structures, change control and realization structures.

These traceability structures look much like what we have been doing with traditional PLM, CM and ERP systems. In my terminology, they are the systems of record, not meant to directly serve end-users but to provide traceability, baselines for configuration, compliance and more.

The team chose the OSLC standard to realize these capabilities. One of the main reasons is that OSLC is an existing open standard based on linked data, not on replicating data. In this way, a federated environment is created with designated connections between datasets.

Jad El-Khoury demonstrated how to link an existing requirement in Siemens Polarion to a defect in IBM ELM and then create a new requirement in Polarion and link this requirement to the same defect. I never get excited by technical demos; more important to learn is the effort needed to build such an integration and its stability over time. Click on the image for the details.
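For readers unfamiliar with linked data, here is a small sketch of what such a cross-tool link boils down to: a single RDF triple pointing from a requirement resource in one tool to a defect resource in another, without copying either resource. The URIs are invented placeholders, and the property comes from the OSLC RM vocabulary as I understand it, so treat the details as illustrative rather than as the project's actual implementation.

```python
# Illustrative only: a cross-tool traceability link expressed as linked data,
# in the spirit of OSLC. The resource URIs are invented placeholders.
from rdflib import Graph, Namespace, URIRef

OSLC_RM = Namespace("http://open-services.net/ns/rm#")

requirement = URIRef("https://rm-tool.example.com/requirements/REQ-1234")
defect = URIRef("https://cm-tool.example.com/defects/DEF-5678")

g = Graph()
g.bind("oslc_rm", OSLC_RM)

# The link is just a triple: both resources stay in their own tool and are
# connected by reference - no data is replicated between the tools.
g.add((requirement, OSLC_RM.trackedBy, defect))

print(g.serialize(format="turtle"))
```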

The conclusions from the team below give the right indicators, and the last two points seem feasible.

Still, we need more benchmarking in other environments to learn.

I remain curious about this approach as I believe it is heading toward what is necessary for the future, the mix of systems of record and systems of engagement connected through a digital web.

The bold part of the last sentence may be used by marketers.


 

Sustainability and Data-driven PLM – the perfect storm 

For those familiar with my blog (virtualdutchman.com) and my contribution to the PLM Global Green Alliance, it will be no surprise that I am currently combining new ways of working for the PLM domain (digitization) with an even more hot topic, sustainability.

"Even more hot" is perhaps a cynical remark.

In my presentation, I explained that a model-based, data-driven enterprise will be able to use digital twins during the design phase, the manufacturing process planning and twins of products in operation. Each twin has a different purpose.

The virtual product during the design phase does not have a real physical twin yet, so some might say it is not a twin at this stage. The virtual product/twin allows companies to perform trade-offs, verification and validation relatively fast and inexpensively. The power of analyzing this virtual twin enables companies to design products not only in the best price/performance range but, equally important, with the lowest environmental impact during manufacturing and usage in the field.

The virtual world of digital twins – (c) 2018 Boeing – diamond

As the Boeing diamond nicely shows, there is a whole virtual world of digital twins. The manufacturing digital twin allows companies to virtually analyze and determine the most effective manufacturing process, preferably with the lowest environmental impact.

For digital twins from a product in the field, we can analyze its behavior and optimize performance, hopefully with environmental performance indicators in mind.

For a sustainable future, it is clear that we need to implement concepts of the circular economy as the earth does not have enough resources and renewables to support our current consumption behavior and ways of living.

Note: this does not apply equally to everybody on the globe – a quote from the European Environment Agency below:

Europe consumes more resources than most other regions. An average European citizen uses approximately four times more resources than one in Africa and three times more than one in Asia, but half of that of a citizen of the USA, Canada, or Australia

To reduce consumption, one of the recommendations is to switch the business model from owning products to products as a service. In the case of products as a service, the manufacturer becomes the owner of the full product lifecycle. Therefore, the manufacturer will have business reasons to make the products repairable, upgradeable, recyclable and energy-efficient, preferably running on renewables. If not, the product might become too expensive: fossil energy will become more costly as carbon taxes increase, and virgin materials might become too expensive as well.

It is a business change; however, sustainability will push organizations to change faster than we are used to. For example, we learned this week that the peaking energy prices and Russia's current war in Ukraine have led to strong investments in renewables.

As a result, many countries no longer want to depend on Russian energy. The peak of carbon emissions for the world is now expected in 2025 (although we have had a very bad year so far).

Therefore, my presentation concluded that we should use sustainability as an additional driver for our digital transformation in the PLM domain. The planet cannot wait until we slowly change our traditional working methods.

Therefore, the need for digital twins to support sustainability and systems thinking is the perfect storm to speed up our digitization projects.

You can find my presentation, as usual, here on SlideShare and a "spoken" version on our PGGA YouTube channel here.


 

Digitalization for the Development and Industrialization of Innovative and Sustainable Solutions

This session, given by Ola Isaksson, Professor and Product Development & Systems Engineering Design Research Group Leader at Chalmers University, was a great continuation of my part on sustainability. Ola went deeper into the aspects of sustainable products and sustainable business models.

The DSIP project (Digital Sustainability Implementation Package – image above) aims to help companies understand all aspects of sustainable development. Ola mentioned that the evolution of today's products is insufficient to ensure a sustainable outcome. Currently, neither products nor product development practices are adequate, as we do not understand all the aspects.

For example, Ola used the example of electrification, looking at the lithium raw material needed for the batteries. If we take the Nissan Leaf as the point of measure, we would have used up all lithium resources within 50 years.

Therefore, other business models are also required, where product ownership is transferred to the manufacturer. This is one of the 9Rs (or 10), as the image shows, moving from a linear economy towards a circular economy.

Also, as I mentioned in my session, Ola referred to the upcoming regulations forcing manufacturers to change their business model or product design. All these aspects are discussed in the DSIP project, and I look forward to learning about the impact this project has on educating and supporting companies in their sustainability journey.

Click on the image to discover the scope


 

A day 2 summary

We had Bernd Feldvoss, Value Stream Leader PLM Interoperability Standards at Airbus, reporting on the progress of the A&D action group focusing on Collaboration. At this stage, the project team has developed an open-service Collaboration Management System (CMS) web application, providing navigation through the eight-step guidelines and offering the potential to improve OEM-supplier collaboration consistency and efficiency within the A&D community.

We had Henrik Lindblad, Group Leader PLM & Process Support at the European Spallation Source, which is building and will soon be operating the world's most powerful neutron source, enabling scientific breakthroughs in research related to materials, energy, health and the environment. Besides the scientific breakthroughs, this project is also an example of building a virtual twin of the facility from the start, providing a multidisciplinary collaboration space.


 

Conclusion

I left the conference with a lot of positive energy. The curiosity session from Stefaan van Hooydonk energized us all, but equally important for our PLM domain, I saw the trend towards more federated PLM environments, more discussions related to sustainability, and people in 3D again. So far, my takeaways this time. Enough to explore till the next event.

With great pleasure, I am writing this post, part of a tradition that started for me in 2014. Posts starting with "The weekend after …" describe what happened during a PDT conference; later, the event merged with CIMdata, becoming THE PLM event for discussions beyond marketing.

For many of us, this was the first in-person conference since COVID-19 hit in 2020: a 3D (in person) conference instead of a 2D (digital) one. With approximately 160 participants, the conference showed that we wanted to meet and network in person, and the enthusiasm and interaction were great.

The conference’s theme, Digital Transformation and PLM – a call for PLM Professionals to redefine and re-position the benefits and value of PLM, was quite open.

There are many areas where digitization affects the way to implement a modern PLM Strategy.

Now some of my highlights from day one. I needed to filter to stay around a maximum of 1500 words. All the other sessions, including the sponsor vignettes, were informative too and increased the value of this conference.


Digital Skills Transformation -Often Forgotten Critical Element of Digital Transformation

Day 1 started traditionally with the keynote from Peter Bilello, CIMdata's president and CEO. In recent conferences, Peter has focused on explaining CIMdata's critical dozen (image below). If you are unfamiliar with them, there is a webinar on November 10 where you can learn more about them.

All twelve are equally important; it is not a sequence of priorities. This time Peter spent more time on Organisational Change Management (OCM), number 12 of the critical dozen – or, as stated, the Digital Transformation's Achilles heel. Although we always mention that people are important, in our implementation projects they often seem to be the topic that gets the least focus.

We all agree on the statement: People, Process, Tools & Data. Often the reality is that we start with the tools, try to build the processes and push the people into these processes. Is it a coincidence that even CIMdata puts Digital Skills Transformation as number 12? An unconscious bias?

This time, the people’s focus got full attention. Peter explained the need for a digital skills transformation framework to educate, guide and support people during a transformation. The concluding slide below says it all.


Transformation Journey and PLM & PDM Modernization to the Digital Future

The second keynote of the day was from Josef Schiöler, Head of Core Platform Area PLM/PDM from the Volvo Group. Josef and his team have a huge challenge as they are working on a foundation for the future of the Volvo Group.

The challenge is that it will provide the foundation for new business processes and for the various group members, as the image below shows:


As Josef said, it is really the heart of the heart, crucial for the future. Peter Bilello referred to this project as open-heart surgery while the person is still active, as the current business must go on too.

The picture below gives an impression of the size of the operation.

And like any big transformation project also, the Volvo Group has many questions to explore as there is no existing blueprint to use.

To give you an impression:

  • How to manage complex documentation with existing and new technology and solution co-existing?
    (My take: the hybrid approach)
  • How to realize benefits and user adoption with user experience principles in mind?
    (My take: Understand the difference between a system of engagement and a system of record)
  • How to avoid seeing modernization as purely an IT initiative and ensure that end-user value creation is visible while still keeping a focus on finalizing the technology transformation?
    (My take: think hybrid and focus first on the new systems of engagement that can grow)
  • How to efficiently partner with software vendors to ensure vendor solutions fit well in the overall PLM/PDM enterprise landscape without heavy customization?
    (My take: push for standards and collaboration with other similar companies – they can influence a vendor)

Note: My takes are just a starting point of the conversation. There is a discussion in the PLM domain, which I described in my blog post: A new PLM paradigm.

 

The day before the conference, we had a ½ day workshop initiated by SAAB and Eurostep where we discussed the various angles of the so-called Federated PLM.

I will return to that topic soon after some consolidation with the key members of that workshop.


Steering future Engineering Processes with System Lifecycle Management

Patrick Schäfer's presentation was different from what the title would suggest. Patrick is the IT Architect Engineering IT at ThyssenKrupp Presta AG. The company provides steering systems for the automotive industry, which is transforming from mechanical systems towards autonomous driving, e-mobility, car-to-car connectivity, stricter safety and environmental requirements.

The steering system becomes a system that depends on both hardware and software. And as a current user of Agile PLM, the old Eigner PLM software, you can feel Martin Eigner's spirit in the project.

I briefly discussed Martin’s latest book on System Lifecycle Management in my blog post, The road to model-based and connected PLM (part 5).

Martin has always been fighting for a new term for modern PLM, and you can see how conservative we are – for sometimes good reasons.

Still, ThyssenKrupp Presta has the vision to implement a new environment to support systems instead of hardware products. And in addition, they had to work fast to upgrade their current almost obsolete PLM environment to a new supported environment.

The wise path they chose was first focusing on a traditional upgrade, meaning making sure their PLM legacy data became part of a modern (Teamcenter) PLM backbone. Meanwhile, they started exploring the connection between requirements management for products and software, as shown below.

From my perspective, I would characterize this implementation as the coordinated approach creating a future option for the connected approach when the organization and future processes are more mature and known.

A good example of a pragmatic approach.


Digital Transformation in the Domain of Products and Plants at Siemens Energy

Per Soderberg, Head of Digital PLM at Siemens Energy, talked about their digital transformation project that started 6 – 7 years ago. Knowing the world of gas- and steam turbines, it is a domain where a lot of design and manufacturing information is managed in drawings.

The ultimate vision from Siemens Energy is to create an Industrial Metaverse for its solutions as the benefits are significant.

Is this target too ambitious, like GE's 2014 Industrial Transformation with Predix? Time will tell. I am sure you will soon hear more from Siemens Energy; therefore, I will keep it short. An interesting and ambitious program to follow.


Accelerating Digitalization at Stora Enso

Stora Enso is a Finnish company, a leading global provider of renewable solutions in packaging, biomaterials, wooden construction and paper. Their Director of Innovation Services, Kaisa Suutari, shared Stora Enso's digital transformation program, which started six years ago with a 10 million/year budget (some people started dreaming too). Great to have a budget, but where to start?

In a very systematic manner, using an ideas funnel and always starting from the business need, they spend the budget along two paths, shown in the image below.

The interesting approach was the upper path, which Kaisa focused on. Instead of starting with an analysis of how the problem could be addressed, they start by doing, then analyze the outcome and improve.

I am a great fan of this approach, as it significantly reduces the time to maturity. How much time is often wasted in conducting the perfect analysis?

Their Digi Fund process is a fast process to quickly go from idea to concept, to POC and to pilot, the left side of the funnel. After a successful pilot, an implementation process starts small and scales up.

There were so many positive takeaways from this session. Start with an MVP (Minimum Viable Product) to create value from the start. Next, celebrate failure when it happens, as this is the moment you learn. Finally, continue to create measurable value, created by people – the picture below says it all.

It was the second time I was impressed by Stora Enso's innovative approach. During PI PLMx London 2020, Samuli Savo, Chief Digital Officer at Stora Enso, gave us insights into their innovation process. At that time, the focus was a little more on open innovation with startups. See my post: The weekend after PI PLMx London 2020. An interesting approach for other businesses to make their digital transformation business-driven and fun for the people.


 A day-one summary

There was Kyle Hall, who talked about MoSSEC and the importance of this standard in a connected enterprise. MoSSEC (Modelling and Simulation information in a collaborative Systems Engineering Context) is the published ISO standard (ISO 10303-243) for improving the decision-making process for complex products. Standards are a regular topic for this conference, more about MoSSEC here.

There was Robert Rencher, Sr. Systems Engineer, Associate Technical Fellow at Boeing, talking about the progress that the A&D action group is making related to the Digital Thread and Digital Twins. Sometimes they raise more questions than answers as they try to make sense of the marketing definitions and what they mean for their businesses. You can find their latest report here.

There was Samrat Chatterjee, Business Process Manager PLM at the ABB Process Automation division. Their businesses are already quite data-driven; however, by embedding PLM into the organization’s fabric, they aim to improve effectiveness, manage a broad portfolio, and be more modular and efficient.

The day was closed with a CEO Spotlight, hosted by Peter Bilello. This time the CEOs did not come from the big PLM vendors but from complementary companies with their unique value in the PLM domain. Henrik Reif Andersen, co-founder of Configit; Dr. Mattias Johansson, CEO of Eurostep; Helena Gutierrez, co-founder of Share PLM; Javier Garcia, CEO of The Reuse Company; and Karl Wachtel, CEO of XPLM, discussed their various perspectives on the PLM domain.

 

Conclusion

Already so much to say; sorry, I reached the 1500-word target; you should have been there. Combined with the networking dinner after day one, it was a great start to the conference. Are you curious about day 2? Stay tuned, and your curiosity will be rewarded.

 

Thanks to Ewa Hutmacher, Sumanth Madala and Ashish Kulkarni, who shared their pictures of the event on LinkedIn. Clicking on their names will lead you to the relevant posts.

 
