
Last week, I participated in the biannual NEM network meeting, this time hosted by Vestas in Ringkøbing (Denmark).

NEM (North European Modularization) is a network for industrial companies with a shared passion and drive for modular products and solutions.

NEM’s primary goal is to advance modular strategies by fostering collaboration, motivation, and mutual support among its diverse members.

During this two-day conference, there were approximately 80 attendees from around 15 companies, all with a serious interest and experience in modularity. The conference reminded me of the CIMdata Roadmap/PDT conferences, where most of the time a core group of experts meet to share their experiences and struggles.

The discussions are very different from those at a generic PLM or software vendor conference, where you only hear (marketing) success stories.

 

Modularity

When talking about modularity, many people think of Lego, as with Lego bricks, you can build all kinds of products without the need for special building blocks. In essence, this is the concept of modularity.

With modularity, a company tries to reduce the number of custom-made designs by dividing a product into modules with strict interfaces. Modularity aims to offer a wider variety of products to the customer, configured from a narrower assortment of modules, to streamline manufacturing, sourcing and service. Modularity also allows managing changes and new functionality within a module without managing a new product.
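
To make the "strict interfaces" idea tangible, here is a minimal sketch in Python. All names (Interface, Module, the motor and chassis examples) are invented for illustration, not taken from any real PLM tool: two motor variants hide behind one stable interface, so the rest of the product does not care which one is configured.

```python
# A minimal sketch of modularity: a narrow set of modules with strict,
# stable interfaces can be combined into many product variants.
from dataclasses import dataclass


@dataclass(frozen=True)
class Interface:
    name: str      # e.g. "drive-mount"
    version: str   # kept stable across module generations


@dataclass
class Module:
    name: str
    provides: Interface
    requires: tuple = ()


def fits(consumer: Module, supplier: Module) -> bool:
    """A supplier module fits if it provides an interface the consumer requires."""
    return supplier.provides in consumer.requires


# Two motor variants behind one stable interface:
# changes stay inside the module, the product family stays intact.
drive = Interface("drive-mount", "v1")
motor_750 = Module("motor-750W", provides=drive)
motor_1100 = Module("motor-1100W", provides=drive)
chassis = Module("chassis-std", provides=Interface("frame", "v1"), requires=(drive,))

for motor in (motor_750, motor_1100):
    print(chassis.name, "accepts", motor.name, "->", fits(chassis, motor))
```

The variety the customer sees comes from combining modules; the stability engineering needs comes from the frozen interface definitions.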

Moving from ETO (Engineering To Order) to BTO (Build To Order) or even CTO (Configure To Order) is a statement often heard when companies invest in a new PLM system. The idea is that with the CTO model, you reduce the engineering costs and risks for new orders.

With modularity, you can address more variants and options without investing in additional engineering efforts.

How the PLM system supports modularity is an often-heard question. How do you best manage options and variants? The main issue here is that modularity is often considered an R&D effort: R&D must build the modular architecture. An R&D-only focus is a common mistake in the field, similar to PLM. Both PLM and Modularity suffer from the framing that they are about R&D and their tools, whereas in reality, PLM and Modularity are strategies concerning all departments in an enterprise, from sales & marketing, engineering, and manufacturing to customer service.

 

PLM and Modularity

In 2021, I discussed the topic of Modularity with Björn Eriksson & Daniel Strandhammar, who had written their easy-to-read book, The Modular Way, during the COVID-19 pandemic. In a blog post, PLM and Modularity, I discussed the touchpoints with PLM with Daniel. A little later, we had a Zoom discussion with Björn and Daniel, together with some of the readers of the book. You can still find the info here: The Modular Way – a follow-up discussion.

What was clear to me at that time is that Sweden, in particular, is a leading country when it comes to Modularity. Companies like Scania and Electrolux are known for their product modularity.

For me, it was great to learn about the Vestas modularization journey. The Scandinavian region clearly sets the tone. In addition, there are LEGO and IKEA, also famous Scandinavian companies, though with other modularity concepts.

The exciting part of the conference was that all the significant modularity players were present. Hosted by Vestas and with a keynote speech from Leif Östling, a former CEO of Scania, all the ingredients were there for an excellent conference.

 

The NEM network

The conference started with Christian Eskildsen, CEO of the NEM organization, who has a long history of leading modularity at Electrolux. NEM is not only a facilitator for modularity; it also conducts training, certification sessions, and coaching at various levels, as shown below.

Christian mentioned that the NEM LinkedIn group has around 400 followers. I can recommend this LinkedIn group, as the network shares its activities there.

At this moment, you can find there the results of Workstream 7 – The Cost of Complexity.

Peter Greiner, NEM member, presented the details of this result on day 2 of the conference. The workstream team's conclusion was a preliminary estimate suggesting a minimum cost reduction of 2-5% of the Cost Of Goods Sold (COGS) on top of traditional modularization savings. These estimates are based on real-world cases.

Understanding that the benefits relate to the COGS, with a high contribution from the actual material costs, a 2-5% range is significant: for a company with, say, €500 million COGS, it means €10-25 million per year. There is the intention to dig deeper into this topic.

Besides this workstream, there are also other workstreams running or finished. The ones that interest me in the sustainability context are Workstream 1 Modular & Circular and Workstream 10 Modular PLM (Digital Thread).

The NEM network has an active group of members, making it an exciting network to follow and contribute to, as modularity is part of a sustainable future. More on this statement later.

Vestas

The main part of day one was organized by our host, Vestas. Jens Demtröder, Chief Engineer for the Modular Turbine Architecture at Vestas and NEM board member, first introduced the business scope and complexity, and later the future challenges Vestas is dealing with.

First, wind energy is the most cost-competitive source for a green energy system when taking the full environmental impact into the equation, as the image below shows.

From the outside, wind turbines all look the same; perhaps a difference between on-shore and off-shore? No way! There is a substantial evolution in the size and control of the wind turbine, and even more importantly, as the image shows, each country has its own regulations to certify a wind turbine. Vestas has to comply with 80+ different local regulations, and for that reason, modularity is vital to manage all the different demands efficiently.

A big challenge for the future will be the transport and installation of wind turbines.

The components become so big that they need to be assembled on-site, introducing new structural constraints to be solved.

As the image to the left shows, rotor sizes up to 250 m are expected – and what about the transport of the nacelle itself?

Click on this link to get an impression.

The audience also participated in a (windy) walk through the manufacturing site to get an impression of the processes & components – an impression below.

Processes, organization and governance

Karl Axel Petursson, Senior Specialist in Architecture and Roadmap, gave insights into the processes, organization and governance needed for the modularity approach at Vestas.

The modularization efforts are always a balance between strategy and execution, where execution often wins. This focus on execution is something I recognize when discussing modularity with the companies I am coaching.

Vestas also created an organization aligned with the functions it provides, following Conway's law (organizations design systems that mirror their own communication structure), as the image below shows:

With modularity, you will also realize that the modular architecture must rely on stable interfaces between the modules based on clear market needs.

Besides an organizational structure, more and more often a matrix organization, there are also additional roles to set up and maintain a modular approach. As the image below indicates, various roles at Vestas integrate all the functions, some specialized and some more holistic:

These roles are crucial when implementing and maintaining modularity in your organization. It is not just the job of a clever R&D team.

Relying on just a clever R&D team is a misconception I have often discovered in the field: buying one or more tools that support modularity and then letting brilliant engineers do the work. And this is a challenge, as engineers often do not like to be constrained by modular rules when designing a new capability or feature.

For this reason, Vestas has established an Organizational Change Management initiative called Modular Minds to make engineers flourish in the organization.

Modular Minds

Madhuri Srinivasan, Systems Engineering specialist, and Hanh Le, Business Transformation leader, both at Vestas, presented their approach to the 2020 must-win battle for Modularization, aiming with various means, like blogs and podcasts, to educate the organization and create Modular Minds among all Vestas employees.

 

The team is applying the ADKAR model from Prosci to support this change. As you can see from the (clickable) image to the left, ADKAR is the abbreviation of Awareness, Desire, Knowledge, Ability and Reinforcement.

The ADKAR model focuses on driving change at the individual level and achieving organizational results. It is great to see such an approach applied to Modularity, and it would also be valuable in the domain of PLM, as I discussed with Share PLM in my network.

Scania

The 1 ½ hour keynote speech from Leif Östling, supported by Karl-Johan Linghede, was more of an interactive discussion with the audience than a speech. Leif took us back to the origins of Scania, their early collaboration with Toyota to learn the Toyota Way (customer first, respect for people and focus on quality), and the initial research and development together with Modular Management, resulting in the MFD methodology.

It led to the understanding that:

  • The #1 cost driver is the number of parts you need to manage.
  • The #2 crucial point is to have standardized interfaces and keep the flexibility inside the module.

The Scania way

Scania partnered early on with Ericsson to work on the connected vehicle. If you are my age, you will remember that connectivity at that time was not easy. The connected vehicle was the first step of what we would now call a digital twin.

An interesting topic discussed was that Scania has approximately 25 interfaces at Change Level 1, meaning potential changes to these interfaces require a C-level/Executive discussion for approval. This level shows the organization's commitment to keeping modularity operational.

Another benefit mentioned was that the move to electrification of the vehicle was not such a significant change as in many automotive companies. Thanks to the modular structure and the well-defined interfaces, creating an electric truck was not a complete change of the truck design.

The session with Leif and Karl-Johan could easily have taken longer, given the interesting question-and-answer dialogue with the curious audience. It was a great learning moment.

 

Digitization, Sustainability & Modularization

As a PLM person from the PLM Green Global Alliance, I was allowed to give a speech about the winning combination of Digitization, Sustainability and Modularization. You might have seen my recent PLM and Sustainability blog post; this time, a zoom-in on the circular economy and modularity was included.

At this conference, I also focused on how Modularity, when implemented based on model-based and data-driven approaches, becomes a crucial component of the circular economy (image below), enabling a lifecycle analysis per module when defined as model-based (Digital Twin).

My entire presentation is on SlideShare: Digitization, Sustainability & Modularization.

Conclusion

It was the first time I attended a conference purely focused on modularity, and I realized we are all fighting the same battle. Like PLM, which is a strategy and not an engineering system, modularity faces the same challenge: it is a strategy and not an R&D mission. It would be great to see modularity become a part of PLM conferences or Circular Economy events, as there is so much to learn from each other – and we need them all.

 

Are you interested in the future of PLM and the meaning of Digital Threads?

Click on the image to see the agenda and join us for 2 days of discussion & learning.


During my summer holiday in my “remote” office, I had the chance to digest what I recently read, heard, saw and discussed related to the future of PLM.

I noticed this year and last year that many companies are discussing or working on their future PLM. It is time to make progress after COVID, particularly in digitization.

And as most companies are avoiding the risk of a “big bang”, they are exploring how they can improve their businesses in an evolutionary mode.

 

PLM is no longer a system

The most significant change I noticed in my discussions is the growing awareness that PLM is no longer covered by a single system.

More and more, PLM is considered a strategy, with which I fully agree. Therefore, implementing a PLM strategy requires holistic thinking and an infrastructure of different types of systems, digitally connected where possible.

This trend is bad news for the PLM vendors as they continuously work on an end-to-end portfolio where every aspect of the PLM lifecycle is covered by one of their systems. The company’s IT department often supports the PLM vendors, as IT does not like a diverse landscape.

The main question is: every PLM vendor has a rich portfolio on PowerPoint, mentioning all phases of the product lifecycle. However, are these capabilities implementable in an economical and user-friendly manner by actual companies, or do the PLM players need to change their strategy?

A question I will try to answer in this post.

 

The future of PLM

I have discussed several observed changes related to the effects of digitization in my recent blog posts, referencing others who have studied these topics in their organizations.

Some of the posts to refresh your memory are:

To summarize what has been discussed in these posts, here are the main points:

The As Is:

  • The traditional PLM systems are examples of a System of Record, not designed to be end-user friendly but designed to have a traceable baseline for manufacturing, service and product compliance.
  • The traditional PLM systems are tuned to a mechanical product introduction and release process in a coordinated manner, with a focus on BOM governance.
  • The legacy information is stored in BOM structures and related specification files.

System of Record (ENOVIA image 2014)

The To Be:

  • We are no longer talking about a PLM system; a traditional System of Record will be digitally connected to different Systems of Engagement / Domains / Products, which have their own optimized environments for real-time collaboration.
  • The BOM structures remain essential for the hardware part; however, overarching structures are needed to manage software and hardware releases for a product. These structures depend on connected datasets.
  • To support digital twins at the various lifecycle stages (design, manufacturing, operations), product data needs to be based on and consumed by models.
  • A future PLM infrastructure is hybrid, based on a Single Source of Change (SSoC) and an Authoritative Source of Truth (ASoT) instead of a Single Source of Truth (SSoT).

Various Systems of Engagement

 

Related podcasts

I relistened to two podcasts before writing this post, and I think they are a must-listen.

The Peer Check podcast from Colab, episode 17 — The State of PLM in 2022 w/Oleg Shilovitsky. Adam and Oleg have a great discussion about the future of PLM.

Highlights: From System to Platform – the new normal. A Single Source of Truth doesn't work anymore; it is about value streams. People in big companies fear making wrong PLM decisions, which is seen as a significant risk for your career.

There is no immediate need to change the current status quo.

The Share PLM Podcast – Episode 6: Revolutionizing PLM: Insights from Yousef Hooshmand. Yousef talked with Helena and me about proven ways to migrate an old PLM landscape to a modern PLM/business landscape.

Highlights: The term Single Source of Change and the concepts of a hybrid PLM infrastructure, based on his experiences at Daimler and now at NIO. Yousef stresses the importance of having the vision and the executive support to execute.

The time of “big bangs” is over, and Yousef provided links to relevant content, which you can find here in the comments.

 

In addition, I want to point to the experiences provided by Erik Herzog in the Heliple project using OSLC interfaces as the “glue” to connect (in my terminology) the Systems of Engagement and the Systems of Record.

Conclusion of the Heliple-1 project
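
To give a feeling for how such OSLC “glue” can work, below is a minimal, hypothetical sketch: the System of Engagement stores only a link (a URI) to the part mastered in the System of Record and resolves it on demand over HTTP, instead of synchronizing a copy. The URLs, the requirement record and its field names are invented for illustration.

```python
# A minimal sketch of linked-data "glue" between systems, OSLC-style:
# follow a stored URI and fetch the resource where it is mastered.
import requests


def fetch_linked_resource(url: str) -> str:
    """Fetch a linked resource as RDF; the consumer only needs the URI, not a copy."""
    response = requests.get(
        url,
        headers={"Accept": "text/turtle", "OSLC-Core-Version": "2.0"},
        timeout=10,
    )
    response.raise_for_status()
    return response.text


# A requirement in the System of Engagement keeps only a link to the part
# definition mastered in the System of Record - no duplicated payload.
requirement = {
    "id": "REQ-0042",                                            # hypothetical
    "satisfied_by": "https://plm.example.com/parts/PN-100200",   # hypothetical URI
}
print(fetch_linked_resource(requirement["satisfied_by"])[:200])
```

The design choice is that data stays where it is mastered; traceability becomes a web of links rather than a batch interface.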

If you are interested in these concepts and want to learn and discuss them with your peers, more can be learned during the upcoming CIMdata PLM Roadmap / PDT Europe conference.

In particular, look at the agenda for day two if you are interested in this topic.

 

The future for the PLM vendors

If you look at the messaging of the current PLM vendors, none of them is talking about this federated concept.

Their messaging focuses more on the transition from on-premise to the cloud, providing a SaaS offering of their portfolio.

I was slightly disappointed when I saw this article on Engineering.com provided by Autodesk: 5 PLM Best Practices from the Experiences of Autodesk and Its Customers.

The article is tool-centric, with statements that make sense and could have been written by any PLM vendor. However, Best Practice #1, Central Source of Truth Improves Productivity and Collaboration, was the message that struck me. Collaboration comes from connecting people, not from the Single Source of Truth utopia.

I don't believe PLM vendors have to be afraid of rapidly losing their installed base of companies using their PLM as a System of Record. There is so much legacy stored in these systems that might still be relevant. The existence of legacy information, often documents, makes a migration or swap to another vendor almost impossible and unaffordable.

The System of Record is incompatible with data-driven PLM capabilities

I would like to see clearer developments from the PLM vendors, creating a plug-and-play infrastructure for Systems of Engagement. Plug-and-play solutions could be based on a neutral partner collaboration hub like ShareAspace or the Systems of Engagement I discussed recently in my post and interview: The new side of PLM? Systems of Engagement!

Plug-and-play Systems of Engagement require interface standards, and PLM vendors will only move in this direction if customers push for it – the chicken-and-egg discussion. And probably, their initiatives are too fragmented at the moment to lead to a standard. However, don't give up; keep building MVPs to learn and share.

Some people believe AI, with the examples we have seen with ChatGPT, will be the future direction without needing interface standards.

I am curious about your thoughts and experiences in that area and am willing to learn.

Talking about learning?

Besides reading posts and listening to podcasts, I also read an excellent book this summer. Martijn Dullaart, who often participates in PLM and CM discussions, decided to write a book based on the various discussions related to part (re-)identification (numbering, revisioning).

The book: The Essential Guide to Part Re-Identification: Unleash the Power of Interchangeability and Traceability (The Future of Configuration Management).

As Martijn starts in the preface:

“I decided to write this book because, in my search for more knowledge on the topics of Part Re-Identification, Interchangeability, and Traceability, I could only find bits and pieces but not a comprehensive work that helps fundamentally understand these topics”.

I believe the book should become standard literature for engineering schools that deal with PLM and CM, for software vendors and implementers, and, last but not least, for companies that want to improve or better clarify their change processes.

Martijn writes in an easily readable style and uses step-by-step examples to discuss the various options. There are even exercises at the end to use in a classroom or for your team to digest the content.

The good news is that the book is not about the past. You might also know Martijn for our joint discussion, The Future of Configuration Management, together with Maxime Gravel and Lisa Fenwick, on the impact of a model-based and data-driven approach to CM.

I plan to come back soon with a more dedicated discussion with Martijn. Meanwhile, start reading the book. Get your free chapter if needed by following the link at the bottom of this article.

I recommend buying the book as a paperback so you can navigate easily between the diagrams and the text.

Conclusion

The trend for federated PLM is becoming more and more visible as companies start implementing these concepts. The end of monolithic PLM is a threat and an opportunity for the existing PLM Vendors. Will they work towards an open plug-and-play future, or will they keep their portfolios closed? What do you think?

Today I read Rhiannon Gallagher's LinkedIn post: If Murray Isn't Happy, No One Is Happy: Value Your Social Nodes. The story reminded me of a complementary blog post I wrote in 2014, although with a slightly different perspective.

After reviewing my post, I discovered that nine years later, we are still having the same challenges of how to involve people in a business transformation.

People are the most important assets, companies claim, but where do they focus their spending and efforts?

Probably more on building the ideal processes and having the best IT solution.

Organizational Change Management is not in their comfort zone. People like Rhiannon Gallagher, but also, in my direct network, the team from Share PLM, are focusing on this blind spot. Don't forget this part of your digital transformation efforts.

And just for fun, the rest of the post below is the article from 2014. At that time, I was not yet focusing on digital transformation in the PLM domain. That started at the end of 2014, the beginning of 2015.

PLM and Blockers
(read it with 2014 in mind – where were you?)

In the past month (April 2014), I had several discussions related to the complexity of PLM.

  • Why is PLM conceived as complex?
  • Why is it hard to sell PLM internally into an organization?
  • Or, to phrase it differently: “What makes PLM so difficult for normal human beings, as conceptually it is not so complex?”

(2023 addition: PLM is complex (and we have to accept it?) )

 

So what makes it complex? What is behind PLM?

The main concept behind PLM is that people need to share data. It can be around a project, a product, or a plant, through the whole lifecycle. In particular, during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature.

You could decide to wait till everything is mature before sharing it with others (the classical sequential manner). However, the chances of doing it right the first time are low. Several iterations between disciplines will be required before the data is approved.

The more a company works sequentially, the higher the costs of changes and the longer the time to market. Due to the rigidity of this sequential approach, it becomes difficult to respond rapidly to changing customer or market demands.

Therefore, in theory (and it is not only a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on not yet approved data.

 

PLM goes further. It is about the sharing of data, and as it originally started in the early phases of the product lifecycle, the concept of PLM was considered something related to engineering. And to be fair, most of the PLM (CAD-related) vendors have a high focus on the early stages of the lifecycle and have strengthened this idea.

However, sharing can go much further, e.g., early involvement of suppliers (still engineering) or downstream support for after-sales/services (the new acronym SLM – Service Lifecycle Management).

In my recent (2014) blog posts, I discussed the concepts of SLM and the required data model for that.

 

Anticipated sharing

The complexity lies in the word “sharing”. What does sharing mean for an organization where, historically, every person was rewarded for their knowledge instead of for sharing and spreading knowledge? Guarding your knowledge was job protection.

Many so-called PLM implementations have failed to reach the sharing target, as the implementation focus was on storing data per discipline and not necessarily on storing data to become shareable and usable by others. This is a huge difference.

(2023 addition: At that time, all PLM systems were Systems of Record)

Some famous (ERP) vendors claim that if you store everything in their system, you have a “single version of the truth”.

Sounds attractive. However, my garbage bin at home is also a single place where everything ends up, but a garbage bin has not been designed for sharing. Another person has no clue or time to analyze what is inside.

Even data stored in the same system can be hidden from others if the way to find the data has not been anticipated.

 

Data sharing instead of document deliverables

The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose. With some extra effort, you can make the information usable and searchable for others. Typical examples are drawings and document management, where the whole process for a person is focused on delivering a specific document on time. Fine for that purpose, but this document becomes a long-term legacy, as you need to know (or remember) what is inside the document.

A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, therefore providing specific views for different roles in the organization. Sharing has become possible, and it can be done online. Nobody needs to consolidate and extract data from documents (Excel files?) anymore.

(2023 addition: The data-driven PLM infrastructure talking about datasets)
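
A minimal sketch of the difference, with invented field names: when the part is a set of connected data elements instead of a document, each role generates its own view from the same shared dataset.

```python
# A minimal sketch of data sharing: one shared part dataset, many role views.
part = {
    "part_number": "PN-100200",
    "material": "AlMg3",             # material definition
    "mass_kg": 1.25,                 # physical property
    "max_load_n": 900,               # functional property
    "supplier": "ACME Metals",       # logistical property (invented)
    "requirements": ["REQ-0042"],    # linked, not embedded in a document
}


def view(dataset: dict, fields: list) -> dict:
    """Build a role-specific report from the shared dataset."""
    return {field: dataset[field] for field in fields}


print("Engineering view:", view(part, ["part_number", "material", "max_load_n"]))
print("Procurement view:", view(part, ["part_number", "supplier", "mass_kg"]))
```

Nobody has to open, interpret and re-key a document; sharing is designed in from the start.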

This does not fit older generations and departmentally managed business units that are rewarded only for their individual efficiency.

Here is an extract of a LinkedIn discussion from 2014, where the two extremes are visible. Unfortunately (or perhaps good), LinkedIn does not keep everything online. There is already so much “dark data” on the internet.

Joe stating:

“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”

And on the other side, Kais stated:

“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”

Readers from my blog will understand I am very much aligned with Kais, and PLM guys have a hard time convincing Joe of the benefits of PLM (I did not try).

 

Making the change happen

Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas of what would be possible. However, the outside world is hard to convince, and here the challenge is organizational change management. Who will support you, and who will work against you?

BLOCKERS: I read an interesting article in IndustryWeek from John Dyer with the title: What Motivates Blockers to Resist Change?

John describes the various types of blockers, and when reading the article combined with my PLM twisted brain, I understood again that this is one of the reasons why PLM is perceived as complex – you need to change, and there are blockers:

Blocker (noun): Someone who purposefully opposes any change (improvement) to a process for personal reasons

“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.”

The problem with blockers

The combination of business change and the existence of blockers is one of the biggest risks for companies to go through a business transformation. By the way, this is not only related to PLM; it is related to any required change in business.

Some examples:

A company I worked with was eager to study its path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase, they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers for a change – so no PLM. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.

A second example is Nokia. Nokia was famous for the way it was able to transform its business in the past. How come they did not see the smartphone and touch screens coming? Apparently, based on several articles published recently, it was Nokia's internal culture and the feeling of superiority from dominating the market that made it impossible to switch. The technology was known, and the concepts were there; however, the (middle) management was full of blockers.

Two examples where blockers had a huge impact on the company.

Conclusion:

Staying in business and remaining competitive is crucial for companies. In particular, the changes currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.


In the past few weeks, together with Share PLM, we recorded and prepared a few podcasts to be published soon. As you might have noticed, for Season 2, our target is to discuss the human side of PLM and PLM best practices, and less the technology side. Meaning:

  • How to align and motivate people around a PLM initiative?
  • What are the best practices when running a PLM initiative?
  • What are the crucial skills you need to have as a PLM lead?

And as there are already many success stories to learn from on the internet, we also challenged our guests to share the moments where they got their experience.

As the famous quote says:

Experience is what you get when you don’t get what you expect!

We recently published our episode with Antonio Casaschi from Assa Abloy, a Swedish company you might have never noticed, although their products and services are a part of your daily life.

It was a discussion close to my heart. We discussed the various aspects of PLM. What makes a person a PLM professional? And if you have no time to listen to these 35 minutes, read and scan the recording transcript on the transcription tab.

At 0:24:00, Antonio mentioned the concept of the Proof of Concept, as he had good experiences with them in the past. The remark triggered me to share some observations that a Proof of Concept (POC) is an old-fashioned way to drive change within organizations. Not discussed in this podcast, but based on my experience, companies have been using Proofs of Concept to win time, as they were afraid to make a decision.

 

A POC to gain time?

 Company A

When working with a well-known company in 2014, I learned they were planning approximately ten POCs per year to explore new ways of working or new technologies. As the POCs followed an annual scheme, the evaluation at the end of the year was often very discouraging.

Most of the time, the conclusion was: “Interesting, we should explore this further” / “What are the next POCs for the upcoming year?”

There was no commitment to follow-up; it was more of a learning exercise not connected to any follow-up.

Company B

During one of the PDT events, a company presented its two-year POC with three leading PLM vendors, exploring supplier collaboration. I understood the PLM vendors had invested much time and resources to support this POC, expecting a big deal. However, the team mentioned it was an interesting exercise, and they learned a lot about supplier collaboration.

And nothing happened afterward ………

In 2019

At the 2019 Product Innovation Conference in London, when discussing Digital Transformation within the PLM domain, I shared in my conclusion that the POC was mainly a waste of time as it does not push you to transform; it is an option to win time but is uncommitted.

My main reason for not pushing a POC is that it is more of a limited feasibility study.

  • First, a POC often pushes people and processes into the technical capabilities of the systems used. A focus starting from technology is the opposite of what I have been advocating for a long time: first, focus on the value stream – people and processes – and then study which tools and technologies support these demands.
  • Second, the POC approach often blocks innovation, as the incumbent system providers will claim the desired capabilities will come (soon) within their systems – a safe bet.

 

The Minimum Viable Product approach (MVP)

With the awareness that we need to work differently and benefit from digital capabilities also came the term Minimum Viable Product or MVP.

The abbreviation MVP is not to be confused with the minimum valuable products or most valuable players.

There are two significant differences with the POC approach:

  • You admit the solution does not exist anywhere – so it cannot be purchased or copied.
  • You commit to the fact that this new approach is the right direction to take and agree that the lack of a perfect-fit solution should not block you from starting for real.

These two differences highlight the main challenges of digital transformation in the PLM domain. Digital transformation is a learning process – it takes time for organizations to acquire and master the needed skills. And secondly, it cannot be a big bang; I have often referred to the 2017 McKinsey article: Toward an integrated technology operating model. Image below.

We will soon hear more about digital transformation within the PLM domain during the next episode of our SharePLM podcast. We spoke with Yousef Hooshmand, currently working for NIO, a Chinese multinational automobile manufacturer specializing in designing and developing electric vehicles, as their PLM data lead.

You might have discovered Yousef earlier when he published his paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”. It is highly recommended to read the paper if you are interested in a potential future PLM infrastructure. I wrote about this whitepaper in 2022: A new PLM paradigm, discussing the upcoming Systems of Engagement on top of a System of Record infrastructure.

To align our terminology with Yousef’s wording, his domains align with the Systems of Engagement definition.

As we discovered and discussed with Yousef, technology is not the blocking issue to start. You must understand the target infrastructure well and where each domain's activities fit. Yousef mentions that there is enough literature about this topic, and I can refer to the SAAB conference paper: Genesis – an Architectural Pattern for Federated PLM.

For a less academic impression, read my blog post, The week after PLM Roadmap / PDT Europe 2022, where I share the highlights of Erik Herzog’s presentation: Heterogeneous and Federated PLM – is it feasible?

There is much to learn and discover which standards will be relevant, as both Yousef and Erik mention the importance of standards.

The podcast with Yousef (soon to be found HERE) was not so much about organizational change management and people.

However, Yousef mentioned the most crucial success factor for the transformation project he supported at Daimler: C-level support, trust and understanding of the approach, knowing it will take many years – an unavoidable journey if you want to remain competitive.

 

And with the journey aspect comes the importance of the Minimum Viable Product. You are starting a journey with an end goal in mind (the top of the mountain), and step by step (from base camp to base camp), people will be better supported in their day-to-day activities thanks to digitization.

A POC would not help you make the journey; perhaps a small POC can help you understand what it takes to cross a barrier.

 

Conclusion

The concept of POCs is outdated in a fast-changing environment where technology is not necessarily the blocking issue. Developing practices and new architectures and using best-fit standards is the future. Embrace the Minimum Viable Product approach. Are you already doing so?

 

Last week I enjoyed visiting LiveWorx 2023 on behalf of the PLM Green Global Alliance. PTC had invited us to understand their sustainability ambitions and meet with the relevant people from PTC, partners, customers and several of my analyst friends. It felt like a reunion.

In addition, I used the opportunity to understand better their Velocity SaaS offering with OnShape and Arena. The almost 4-day event, with approximately 5000 attendees, was massive and well-organized.

So many people were excited that this was again an in-person event after four years.

With PTC’s broad product portfolio, you could easily have a full agenda for the whole event, depending on your interests.

I was personally pleased to have a relatively full schedule focusing purely on Sustainability, leaving all the other beautiful end-to-end concepts for another time.

Here are some of my observations.

Jim Heppelmann's keynote

The primary presentation of such an event is the keynote from PTC’s CEO. This session allows you to understand the company’s key focus areas.

My takeaways:

  • Need for Speed: Software-driven innovation, or as Jim said, Software is eating the BOM, reminding me of my recent blog post: The Rise and Fall of the BOM. Here Jim was referring to the integration with ALM (CodeBeamer) and IoT to have full traceability of products. However, including Software also requires agile ways of working.
  • Need for Speed: Agile ways of working – the OnShape and Arena offerings are examples of agile working methods. A SaaS solution is easy to extend with suppliers or other stakeholders. PTC calls this their Velocity offering, typical Systems of Engagement, and I spoke later with people working on this topic. More in the future.
  • Need for Speed: Model-based digital continuity – a theme I have discussed in my blog post too. Here Jim explains the interaction between Windchill and ServiceMax, both Systems of Record for product definition and Operation.
  • Environmental Sustainability: introducing Catherine Kniker, PTC’s Chief Strategy and Sustainability Officer, announcing that PTC has committed to Science Based Targets, pledging near-term emissions reductions and long-term net-zero targets – see image below and more on Sustainability in the next section.
  • A further investment in a SaaS architecture, announcing CREO+ as a SaaS solution supporting dynamic multi-user collaboration (a System of Engagement)
  • A further investment in the partnership with Ansys fits the needs of a model-based future where modeling and simulation go hand in hand.

You can watch the full session Path to the Future: Products in the Age of Transformation here.

 

Sustainability

The PGGA spoke with Dave Duncan and James Norman last year about PTC’s sustainability initiatives. Remember: PLM and Sustainability: talking with PTC. Therefore, Klaus Brettschneider and I were happy to meet Dave and James in person just before the event and align on understanding what’s coming at PTC.

We agreed there is no “sustainability super app”; it is more about providing an open, digital infrastructure to connect data sources at any time of the product lifecycle, supporting decision-making and analysis. It is all about reliable data.

 

Product Sustainability 101

On Tuesday, Dave Duncan gave a great introductory session, Product Sustainability 101, addressing business drivers and technical opportunities. Dave started by explaining the business context, aiming at greenhouse gas (GHG) reduction based on science-based targets, and describing the content of Scope 1 (direct emissions), Scope 2 (purchased energy) and Scope 3 (value chain) emissions.

The image above, which came back in several presentations later that week, nicely describes the mapping of lifecycle decisions and operations in the context of the GHG protocol.

 

Design for Sustainability (DfS)

On Wednesday, I started with a session moderated by James Norman titled Design for Sustainability: Harnessing Innovation for a Resilient Future. The panel consisted of Neil D’Souza (CEO Makersite), Tim Greiner (MD Pure Strategies), Francois Lamy (SVP Product Management PTC) and Asheen Phansey (Director ESG & Sustainability at PagerDuty). You can find the topic discussed below:

Some of the notes I took:

  • No specific PLM modules are needed; LCA needs to become an additional practice for companies, relying on a connected infrastructure.
  • Where to start? First, understand the current baseline based on data collection – what is your environmental impact? Next, decide where to act first.
  • The importance of Design for Service – many companies design products for easy delivery, not for service. Being able to service products better will extend their lifetime, therefore reducing their environmental impact (manufacturing/decommissioning).
  • There is a value chain for carbon data. In addition, suppliers significantly impact reaching net zero, as many OEMs have an Assemble To Order process, and most of the emissions occur during part manufacturing.

 

DfS: an example from Cummins

Next, on Wednesday, I attended the session from David Genter from Cummins, who presented their Design for Sustainability (DfS) project.

Dave started by sharing their 2030 sustainability goals:

  • On facilities and operations: a 50% reduction in GHG emissions, a 30% reduction in water usage, a 25% reduction in waste and a 50% reduction in organic compound emissions
  • Reducing Scope 3 emissions for new products by 25%
  • In general, reducing Scope 3 emissions by 55M metric tons.

The benefits for products were documented using a standardized scorecard (example below) to ensure the benefits are real and not based on wishful thinking.

Many motivated people wanted to participate in the project, and the ultimate result demonstrated that DfS has both business value for Cummins and the environment.

The project has been very well described in this whitepaper: How Cummins Made Changes to Optimize Product Designs for the Environment – a recommended case study to read.

 

Tangible Strategies for Improving Product Sustainability

The session was a dialogue between Catherine Kniker and Dave Duncan, discussing the strategies to move forward with Sustainability.

They reiterated the three areas where we as a PLM community can improve: Material choice and usage, Addressing Energy Emissions and Reducing Waste. And it is worth addressing them all, as you can see below – it is not only about carbon reduction.

It was an informative dialogue going through the different aspects of where we, as an engineering/ PLM community, can contribute. You can watch their full dialog here: Tangible Strategies for Improving Product Sustainability.

 

Conclusion

It was encouraging to see that at an event such as LiveWorx, you could learn about Sustainability and discuss Sustainability with the audience and PTC partners. And as I mentioned before, we need to learn to measure (data-driven / reliable data), and we need to be able to work in a connected infrastructure (digital thread) to allow design, simulation, validation and feedback to go hand in hand. It requires adopting a business strategy, not just a tactical solution. With the PLM Green Global Alliance, we are looking forward to following up on these topics.

NOTE: PTC covered the expenses associated with my participation in this event but did not in any way influence the content of this post – I made my tour fully independent through the conference and got encouraged by all the conversations I had.

 

I am writing this post because one of my PLM peers recently asked me this question: “Is the BOM losing its position?” He was in discussion with another colleague who told him:

“If you own the BOM, you own the Product Lifecycle”.

This statement made me think of a recent post from Jan Bosch: Product Development fallacy #8: the bill of materials has the highest priority.

Software is increasingly becoming an essential part of the final product, and with his expertise in software development, Jan wrote this article. I recommend reading the full post (4 min read) and then browsing through the comments.

If you cannot afford these 10 minutes, here is my favorite quote from the article:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

Where did the BOM focus come from? A historical overview related to the rise (and fall) of the BOM.

 

In the beginning, there was the drawing.

Before the era of computers, there was “THE drawing”, describing assemblies, subassemblies or parts. And on the drawing, you could find the parts list, if relevant. This parts list was the first Bill of Material, describing the parts/materials shown on the drawing.

 

Next came MRP/ERP

The introduction of the MRP system (Material Requirement Planning) was the first step in which, by using computers, people could collect and process the material requirements for one system as data.

Entering new materials/parts described on drawings was still a manual process, as well as referring to existing parts on the drawing. Reuse of parts was a manual process based on individual knowledge.

In the nineties, MRP evolved into ERP (Enterprise Resource Planning), which included the MRP part and added resource and manufacturing planning and financial reporting.

The ERP system became the most significant IT system, the execution system of the company. As it was the first enterprise system implemented, it was also the first moment we learned about implementation challenges – people, change and budget overruns. However, as the ERP system brought visibility to the company's execution, it became a “must-have” system for management.

The introduction of mainstream 2D CAD did not affect the company’s culture so much. Drawings became electronic drawings, and the methodology of the parts list on the drawing remained.

Sometimes the interaction with the MRP/ERP system was enhanced by an interface – sending the drawing BOM to ERP. The advantage of the interface: no manual transfer of data, reducing typos and BOM errors. The disadvantages at that time: relatively expensive (connectivity between systems was a challenge) and mostly one-directional.

 

And then there was PDM.

In parallel with the introduction of ERP systems, mainstream 3D CAD systems became affordable, particularly SolidWorks, Solid Edge and Inventor. These 3D CAD systems allow sharing of parts and assemblies in different products, and the PDM database was the first aid to support part reuse, versioning and standardization.

By extracting the parts from the assemblies and subassemblies, it was possible to generate a BOM structure in the PDM system to be transferred or typed into the ERP system. We did not talk about EBOM or MBOM then, as there was only one BOM in the ERP system, and the PDM system was a tool to feed the ERP system.
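
Conceptually, that generation step is a simple roll-up of the assembly tree, as this sketch shows (the assembly structure and part names are invented for illustration):

```python
# A minimal sketch of generating a flat BOM from a CAD assembly structure:
# walk the tree and accumulate quantities per leaf part.
from collections import Counter

assembly = {
    "bike": {"frame": 1, "wheel-asm": 2},
    "wheel-asm": {"rim": 1, "spoke": 32, "hub": 1},
}


def roll_up(item: str, qty: int, bom: Counter) -> None:
    """Recursively accumulate leaf-part quantities through all subassemblies."""
    children = assembly.get(item)
    if children is None:       # a leaf part: count it
        bom[item] += qty
        return
    for child, child_qty in children.items():
        roll_up(child, qty * child_qty, bom)


flat_bom = Counter()
roll_up("bike", 1, flat_bom)
print(flat_bom)   # Counter({'spoke': 64, 'rim': 2, 'hub': 2, 'frame': 1})
```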

Many companies still have based their processes on this approach. ERP (read SAP nowadays) is the central execution system, and PDM is an external system. You might remember the story and image from my previous post about people, processes and tools. The bad practice example: Asking the ERP system to provide a part number when starting to design a part.

 

And then products started to change.

In the early 2000s, I worked with SmarTeam to define the E&E (Electronics and Electrical) template. One of the new concepts was to synchronize all design data coming from different disciplines to a single BOM structure.

It was the time we started to talk about the EBOM: a type of BOM serving as the structure to consolidate all the design data, based on parts.

The EBOM, most of the time, reflects the design intent in logical groups, and sending the relevant parts in the correct order to the ERP system was a favorite, expensive customization for service providers. How do you transfer an engineering BOM view to an ERP system that only understands the manufacturing view?
Note: not all ERP systems have the data model to differentiate between engineering parts and manufacturing parts.

The image below illustrates the challenge and the customer’s perception.

The automated link between the design side (EBOM) and manufacturing side (MBOM) was a mission impossible – too many exceptions for the (spaghetti) code.
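
A hypothetical sketch of why this automation turned into spaghetti: the happy path is a one-to-one copy, but every manufacturing reality becomes another hard-coded exception. The parts and rules below are invented, but the pattern is what those customizations ended up maintaining.

```python
# A minimal sketch of a naive EBOM-to-MBOM conversion and its creeping exceptions.
ebom = [
    {"part": "housing", "qty": 1},
    {"part": "pcb-asm", "qty": 1},    # designed as one logical unit
    {"part": "screw-m4", "qty": 6},
]


def to_mbom(ebom_lines: list) -> list:
    mbom = []
    for line in ebom_lines:
        if line["part"] == "pcb-asm":
            # exception 1: assembled at a supplier, collapses into a buy part
            mbom.append({"part": "pcb-asm-bought", "qty": line["qty"]})
        else:
            mbom.append(dict(line))
    # exception 2: manufacturing consumables that never appear in the EBOM
    mbom.append({"part": "thermal-paste", "qty": 1})
    return mbom   # every new product family tends to add more branches here


print(to_mbom(ebom))
```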

 

And then came the MBOM.

The identified issues connecting PDM and ERP led to the concept of implementing the MBOM in the PLM system. The MBOM in PLM is one of the characteristics of a PLM implementation compared to a PDM implementation. In a traditional PLM system, there is an interaction and connection between the EBOM and MBOM. EBOM parts should end up as MBOM parts. This interaction can be supported by automation, however, as it is in the same system, still leaving manual changes possible.

The MBOM structure in PLM could then be the information structure to transfer to the ERP system; however, there is more, as Jörg W. Fischer wrote in his provoking post Die MBOM muss weg (The MBOM must go). He rightly points out (in German) that the MBOM is not a structure on its own but a combination of different views based on assembly drawings, process planning and material requirements.

His conclusion:

Calling these structures MBOM is trying to squeeze all three structures into one. That usually doesn't work and then leads to much more emotional discussions in the project. It also costs a lot of money. It is, therefore, better not to use the term MBOM at all.

And indeed, just having an MBOM in your PLM system might help you prepare some of the manufacturing steps and the needed resources and parts. The MBOM result still has to be localized at the plant where the manufacturing takes place. And here, the systems used are the ERP system and the MES system.

The main advantage of having the MBOM in the PLM system is the direct relation between specification and manufacturing intent, allowing manufacturing engineering to work collaboratively with engineering in the same environment.

  • The first benefit is fewer iterations and a shorter time to production, thanks to early interaction and manufacturing involvement in the engineering process.
  • The second benefit is that product knowledge is centralized in a single system. Consolidating your product knowledge in ERP does not make sense due to global localization and the missing capabilities to manage iterative engineering processes on non-existing parts.

 

And then came the SBOM, the xBOM

Traditional PLM vendors and implementations kept using xBOM structures as placeholders for related specification data (mechanical designs, electrical, software deliverables, serialized products). Most of the time, related files.

And with this approach, talking about digital thread, PLM systems also touch on the concepts of Configuration Management.

I will not go into the details here but look at the two images by clicking on them and see a similar mindset.

It is about the traceability of information in structures and systems. These structures work well in a relatively static and linear product development and delivery environment, as illustrated below:

Engineering change and release processes are based on managing the changes in different structures from the left to the right.

 

And then came software!

Modern connected products are no longer just mechanical products. The product's functionality no longer depends on the mechanical properties but mainly on the embedded electronics and software used. For example, look at the mechanical design of a telecom transmission tower – its behavior mainly comes from non-mechanical components, and they can change over time. Still, the Bill of Material contains a lot of concrete and steel parts.

The ultimate example is comparing a Tesla (software on wheels) with a traditional car. For modern connected products, electronics and software need to be part of the solution. Software and electronics allow the product to be upgraded over time. Managing these products in the same manner as mechanical products is impossible, inefficient and therefore threatening your company’s future business.

I requote Jan Bosch:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

 

The model-based, connected enterprise

I will not solve the puzzle of the future in this post. You can read my observations in my series: The road to model-based and connected PLM. We need a new infrastructure with at least two modes. One that still serves as a System of Record, storing information in a traditional manner, like a Bill of Materials for the static parts, as not everyone and everything can be connected.

In addition, we need various Systems of Engagement that enable close to real-time interaction between products (systems) and the relevant stakeholders for the engagement scope (multidisciplinary / consumers).

Digital twins are examples of such environments. Currently, these Systems of Engagement often work disconnected from the System of Record due to the lack of understanding of how to connect. (standard connectors? / OSLC?)

Our mission is to explore, as I wrote in my post Time to split PLM and drop our mechanical mindset.

And while I was finalizing this post, I read a motivating post from Jan Bosch again, for all of you working on understanding and pushing the digital transformation in your ecosystem.
The title: Be the protagonist of your life: 15 rules. A starting point for more to come.

 

Conclusion

The BOM is no longer the master of the product lifecycle when it comes to managing connected products, where functionality mainly depends on software. BOM structures with related documents are just one of the extracted baselines in a data-driven, connected enterprise, which requires other, non-BOM-driven structures to represent the actual status of a virtual or physical product.
The BOM is not dead, but there is more ………

Your thoughts?

Those who have read my blog posts over the years will have seen the image to the left.

The people, processes and tools slogan points to the best practice of implementing (PLM and CM) systems.

Theoretically, a PLM implementation will move smoothly if the company first agrees on the desired processes and people involved before a system implementation using the right tools.

Too often, companies start from their historical landscape (the tools – starting with a vendor selection) and then try to figure out the optimal usage of their systems. The best example of this approach is the interaction between PDM(PLM) and ERP.

 

PDM and ERP

Historically, ERP was the first enterprise system that most companies implemented. For product development, there was the PDM system, an engineering tool; for execution, there was the ERP system. Since ERP focuses on the company’s execution, it became management’s favorite system.

The ERP system and its information were needed to run and control the company. Unfortunately, this approach introduced the idea that the ERP system should also be the source of part information, as it was often the first enterprise system in a company. The PDM system was often considered an engineering tool only. And when we talk about a PLM system: who really implements PLM as an enterprise system, or is it still an engineering tool?

This is an example of Tools, Processes, and People – A BAD PRACTICE.

Imagine an engineer who wants to introduce a new part needed for a product to be delivered. In many companies at the beginning of this century, even before starting the exercise, the engineer had to request a part number from the ERP system. This is implementation complexity #1.

Next, the engineer starts developing versions of the part based on the requirements. Ultimately, the engineer might conclude that this part will never be implemented. The reserved part number in ERP has been wasted – what to do?

It sounds weird, but this was a reality in discussions on this topic until ten years ago.

Next, as the ERP system could only deal with 7-digit numbers, what about part number reuse? Reused part numbers carry a considerable risk of errors. With the introduction of PLM systems, there was the opportunity to bridge the gap between engineering and manufacturing. Now it is clear to most companies that the engineer should create the initial part number.

Only when the conceptual part is approved for use in the realization of the product is an exchange with the ERP system needed. Whether we use the same part number or not does not matter, as long as we can map both identifiers between these environments and have traceability.
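As a thought experiment, the minimum you need is a cross-reference that preserves traceability between the two identifiers. Below is a hypothetical sketch; the identifiers, class and attribute names are invented for illustration, not taken from any PLM or ERP product.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: the engineer creates the part in PLM; only when the
# part is approved for realization does it receive an ERP identifier.
# The cross-reference keeps the traceability between both environments.

@dataclass
class PartXRef:
    plm_id: str                    # created by engineering at concept stage
    erp_id: Optional[str] = None   # assigned only after approval

    def release_to_erp(self, erp_id: str) -> None:
        """Map the approved PLM part to its ERP counterpart exactly once."""
        if self.erp_id is not None:
            raise ValueError(f"{self.plm_id} is already mapped to {self.erp_id}")
        self.erp_id = erp_id

part = PartXRef(plm_id="PLM-000123")  # concept phase: ERP is not involved yet
# ... versions and iterations; if the part is dropped, no ERP number is wasted
part.release_to_erp("7012345")        # approval: ERP assigns its own number
print(part)  # PartXRef(plm_id='PLM-000123', erp_id='7012345')
```

Note how, in this sketch, a part that never leaves the concept phase never touches the ERP numbering range – exactly the waste problem described above.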

It took almost ten years, from PDM to PLM, before companies agreed on this approach, and I am curious about your company’s status.

Meanwhile, in the PLM world, we have evolved on this topic. The part and the BOM are no longer simple entities. Instead, we often differentiate between EBOM and MBOM, and the parts in those BOMs are not necessarily the same.

In this context, I like Prof. Dr. Jörg W. Fischer‘s framing:
EBOM is the specification, and MBOM is the realization.
(Unfortunately, he writes mostly in German.)
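In data terms, the framing could look like the sketch below: one EBOM (the specification) can be realized by different MBOMs per plant, and the parts do not have to match one-to-one. All part numbers and structures are invented for illustration.

```python
# Hypothetical sketch of the EBOM/MBOM framing: the EBOM specifies what the
# product must be; each MBOM describes how a specific plant realizes it.

ebom = {  # engineering view: the specification
    "ASSY-100": ["BRACKET-10", "COVER-20"],
}

mbom_plant_a = {  # plant A buys the bracket and adds process materials
    "ASSY-100": ["BUY-BRACKET-10A", "COVER-20", "GLUE-05"],
}

mbom_plant_b = {  # plant B makes the bracket from raw stock
    "ASSY-100": ["SHEET-STEEL-2MM", "COVER-20", "SCREW-M4"],
}

for plant, mbom in [("A", mbom_plant_a), ("B", mbom_plant_b)]:
    extra = set(mbom["ASSY-100"]) - set(ebom["ASSY-100"])
    print(f"Plant {plant} realization adds (vs. specification): {sorted(extra)}")
```

The point of the sketch: both plants satisfy the same specification while their realizations differ, which is why a one-to-one EBOM-to-MBOM mapping is too naive.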

An interesting discussion initiated by Jörg last week was again about the interaction between PLM and ERP. The article is an excellent example of how mainstream enterprises potentially think: PLM = Siemens, ERP = SAP – an illustration of the “tools first” mindset before the ideal process is defined.

There was nothing wrong with that in the early days, as connectivity between different systems was difficult and expensive. Therefore, people with 20 years of experience might still rely on their systems infrastructure instead of the data flow.

But enough about the bad practice – let’s move on to People, Processes, (Data) and Tools.

People, Processes, Data and Tools?

I got inspired by this topic after seeing this post two weeks ago from Juha Korpela, claiming:

Okay, so maybe a hot take, maybe not, but: the old “People, Process, Technology” trinity is one of the most harmful thinking patterns you can have. It leaves out a key element: Data.

His full post was quite focused on data, and I liked the “wrapping post” from Dr. Nicolas Figay here, putting things more in perspective from his point of view. The reply made me think about how this discussion fits into the PLM digital transformation discussion. How would it work in the two major themes I use to explain digital transformation in the PLM landscape?

For incidental readers of my blog, these are the two major themes I am using:

  1. From Coordinated to Connected, based on the famous diagram from Marc Halpern (image below). The coordinated approach based on documents (files) requires a particular timing (processes) and context (Bills of Information) – it is the traditional and current PLM approach for most companies. On the other hand, the Connected approach is based on connected datasets (here, we talk about data – not files). These connected datasets are available in different contexts, in real-time, to be used by all kinds of applications, particularly modeling applications. Read about it in the series: The road to model-based and connected PLM.
  2. The need to split PLM, thinking in System(s) of Record and Systems of Engagement (example below). The idea behind this split is driven by the observation that companies need various Systems of Record for configuration management, change management, compliance and realization. These activities sound like traditional PLM targets and could still be done in these systems. New in the discussion is the System of Engagement, which focuses on a specific value stream in a digitally connected manner. Here, data is essential. I discussed the coexistence of these two approaches in my post Time to Split PLM. A post on LinkedIn with many discussions and reshares illustrates that the topic is hot. And I am happy to discuss “split PLM architectures” with all of you.

These two concepts discuss the processes and the tools, but what about the people? Here I came to the conclusion that, to complete the story, we have to imagine three kinds of people. And this will not be new: we have the creators of data, the controllers of data and the consumers of data. Let’s zoom in on their specifics.

 

A new representation?

Looking for a new simplification of the people, processes and tools trinity combined with data, I got inspired by the work Don Farr did at Boeing, where he worked on a new visual representation for the model-based enterprise. You might have seen the image on the left before – click on it to see it in detail.

I first wrote about this new representation in my post: The weekend after CIMdata Roadmap / PDT Europe 2018.

Related to Configuration Management, Martijn Dullaart and Martin Haket have also worked on a diagram with their peers to depict the scope of CM and Impact Analysis. The image leads to the post with my favorite quote: Communication is merely an exchange of information, but connections tell the story.

Below I share my first attempt to combine the people, processes and tools trinity with the concepts of document and data, system(s) of record and system(s) of engagement – trying to build the story. See if you recognize the aspects of the discussion above, and feel free to propose enhancements.

I look forward to your suggestions. Like the understanding that we have to split PLM thinking, as it impacts how we look at implementations.

Conclusion

Digital transformation in the PLM domain is forcing us to think differently. There will still be processes based on people collecting, interpreting and combining information. However, there will also be a new domain of connected data interpreted by models and algorithms, not necessarily depending on processes.

Therefore we need to work on new representations that can be used to tell this combined story. What do you think? How can we improve?

 

In the last few weeks, I thought I had writer’s block, as I usually write about PLM-related topics close to my engagements.
Where are the always popular discussions related to EBOM or MBOM? Where is the Form-Fit-Function discussion or the traditional “meaningful numbers” discussions?

These topics always create a lot of interaction and discussion, as many of us have mature opinions.

However, last month I spent most of my time discussing the connection between digital PLM strategies and sustainability. With the Russian invasion of Ukraine leading to high energy prices, combined with several climate disasters this year, people are aware that 2022 is not a usual year. It is a year full of events that force us to rethink our current ways of living.

The notion of urgency

Sustainability for the planet and its people has all the focus currently. COP27 gives you the impression that governments are really serious. Are they? Read this post from Kimberley R. Miner, Climate Scientist at NASA, Polar Explorer & Professor.

She doubts whether we really grasp the urgency needed to address climate change, or whether we are just playing at being on stage. I agree with her doubts.

So what to do with my favorite EBOM-MBOM discussions?

Last week I attended an event organized by Dassault Systèmes in the Netherlands for their Dutch/Belgian customers.

The title of the event was: Sustainable innovation for a digital future. I expected a techy event. Click on the image to see the details.

I asked my grandson, who had just started his Aerospace Engineering studies in Delft (NL), learning to work with CAD and PLM tools, to join me. He replied:

“Too many software demos”

It turned out that my grandson was wrong. The keynote speech from Ruud Veltenaar made most of the audience feel uncomfortable. He really pointed to the fact that we are aware of climate change and our impact on the planet, but in a way, we are paralyzed. Nothing new, but confronting and unexpected when going to a customer event.

Ruud’s message: Accept that we are at the end of an existing world order, and we should prepare for a new world order with the right moral leadership. It starts within yourself. Reflect on who you really are, where you are in your life path, and finally, what you want.

It sounds simple, and I can see it helps to step aside and reflect on these points.

Otherwise, you might feel we are in a rat race, as shown below (recommended to watch).

The keynote was the foundation for a day of group and panel discussions on sustainability, learning from customers about their sustainability plans and experiences.

It showed that Dassault Systèmes, with its purpose defined in 2012 (click on the link to see its history), Harmonizing Products, Nature and Life, is ahead of the curve (at least they were for me).

The event was energizing, and my grandson was wrong:
“No software – next time?”

 

The impact of legacies – data, processes & people

For those who haven’t read my previous post, The week after PLM Roadmap / PDT Europe 2022, I wrote about the importance of Heterogeneous and federated PLM, one of the discussions related to data-driven PLM.

Looking back, I have been writing about data-driven PLM since 2014, and few companies have made progress here. Understandable: first of all due to legacy data, which is not in the right format or of the right quality to support data-driven processes.

However, also here, legacy processes and legacy people are blocking the change. There is no blame here; it is difficult to change. You might have a visionary management team, but then it comes down to the execution of the strategy. The organizational structure and the existing people skills are creating more resistance than progress.

For that reason, I wrote the post PLM and Global Warming in 2015, where I compared the progress we have made within our PLM community with the lack of progress we are making in solving global warming. We know the problem, but we are unable to act because we do not feel the urgency.

This blog post triggered Rich McFall to start the PLM Global Green Alliance together with me in 2018.

 

In my PLM Roadmap / PDT Europe session, Sustainability and Data-driven PLM – the perfect storm, I raised awareness that we need to speed up. According to scientists, we have 10, perhaps 15, years to implement radical changes before we reach irreversible tipping points.

 

Why PLM and Sustainability?

Sustainability starts with the business strategy. How does your company want to contribute to a more sustainable future? The strategy with probably the most impact is the concept of a circular economy (image below, more info here).

The idea behind the circular economy is to minimize the need for new finite materials (the right side) and to use only renewables for energy delivery. Implementing these principles clearly requires a more holistic design of products and services. Each loop should be analyzed and considered when delivering solutions to the market.

Therefore, a logical outcome of the circular economy would be transforming from selling products to the market towards a product-as-a-service model. In this case, the product manufacturer becomes responsible for the full product lifecycle and its environmental impact.

And here comes the importance of PLM. You can measure and tune your environmental impact during production in your ERP or MES environment. However, 80% of the environmental impact is defined during the design phase, the domain of PLM. All these analyses together are called Life Cycle Analysis or Life Cycle Assessment (LCA), a practice that starts the moment you begin to think about a product or solution – a specialized systems thinking approach.
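To illustrate why the design phase dominates the footprint, here is a deliberately oversimplified cradle-to-gate calculation: once material and mass are chosen, most of the impact is fixed. The emission factors below are invented placeholders; real LCA tools cover many more lifecycle stages and use validated datasets.

```python
# Oversimplified cradle-to-gate sketch: the footprint is largely fixed by
# design choices (material, mass). Emission factors are invented
# placeholders, not real LCA data.

EMISSION_FACTORS = {  # kg CO2e per kg of material (hypothetical values)
    "steel": 2.0,
    "aluminium": 9.0,
    "recycled_aluminium": 1.0,
}

def footprint(bom: list) -> float:
    """Sum mass [kg] x emission factor over all BOM lines."""
    return sum(mass * EMISSION_FACTORS[material] for material, mass in bom)

design_a = [("steel", 12.0), ("aluminium", 3.0)]
design_b = [("steel", 12.0), ("recycled_aluminium", 3.0)]

print(f"Design A: {footprint(design_a):.1f} kg CO2e")  # 51.0
print(f"Design B: {footprint(design_b):.1f} kg CO2e")  # 27.0
```

Even this toy example shows the lever: a single material substitution decided at design time nearly halves the footprint, something no optimization during production could achieve.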

So how to define and select the right options for future products?

 

Virtual products / Digital Twins

This is where sustainability is pushing for digitization of the product lifecycle. Building and analyzing products in the virtual world is much cheaper than working with physical prototypes.

The importance of a model-based approach here is that it allows companies to deal efficiently with trade-off studies for each solution.

In addition, the choice and the behavior of materials also have an impact. These material properties will come from various databases, some based on hazardous substances, others on environmental parameters. Connecting these databases to the virtual model is crucial to remain efficient.

Imagine you need to collect and process these properties manually whenever studying an alternative. The manual process would be too costly (fewer trade-offs, not finding the optimum) and too slow (time-to-market impact).
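A connected lookup removes that bottleneck. The sketch below is only a stand-in for querying real hazardous-substance and environmental databases; all database names, fields and values are hypothetical.

```python
# Hypothetical sketch: pull material properties from several (stand-in)
# databases automatically, so every trade-off study starts from the same
# connected data instead of a manual copy-paste exercise.

HAZMAT_DB = {"lead_alloy": {"restricted": True}}         # substances register
ENV_DB = {"lead_alloy": {"co2e_per_kg": 1.6},            # environmental params
          "bio_polymer": {"co2e_per_kg": 0.4}}

def material_profile(material: str) -> dict:
    """Combine properties from multiple sources into one profile."""
    return {
        "material": material,
        "restricted": HAZMAT_DB.get(material, {}).get("restricted", False),
        "co2e_per_kg": ENV_DB.get(material, {}).get("co2e_per_kg"),
    }

for candidate in ["lead_alloy", "bio_polymer"]:
    profile = material_profile(candidate)
    verdict = "rejected (restricted)" if profile["restricted"] else "candidate"
    print(candidate, "->", verdict, "| CO2e/kg:", profile["co2e_per_kg"])
```

With such a connected lookup, every design alternative is screened against the same sources in seconds, which is what makes many trade-off iterations affordable.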

That’s why I am greatly interested in all the developments related to a federated PLM infrastructure. A monolithic system cannot be the solution for such a model-based environment. In my terminology, here we need an architecture with systems of engagement combined with system(s) of record.

I will publish more on this topic in the future.

In the previous paragraphs, I wrote about the virtual product environment, which some companies call the virtual twin. However, besides the virtual twin, we also need several digital twins. These digital models allow us to monitor and optimize the production process, which can lead to design changes.

Also, monitoring the product in operation using a digital twin allows us to optimize the performance and execution of the solutions in the field.

The feedback from these digital twins will then help the company improve the design and calibrate its simulation models. It should be a closed loop. You can find a more recent discussion related to the above image here.

 

Our mission

At this moment, sustainability is at the top of my personal agenda, and I hope it is on yours too. However, besides the choices we can make in our personal lives, there is also an area where we, as PLM-interested parties, should contribute: the digitization of the product lifecycle as an enabler for sustainable business.

Without mature concepts for a connected enterprise, implementing sustainable products and business processes will be a wish, not a strategy. So add digitization to your skillset and use it in the context of sustainability.

Conclusion

It might look like this PLM blog has become an environmental blog. That might be right, as the environmental impact of products and solutions is directly related to product lifecycle management. However, do not worry. In the upcoming time, I will focus on the aspects and experiences of a connected enterprise. I will leave the easier discussions (EBOM/MBOM/FFF/Smart Numbers) from a coordinated enterprise as they are. There is work to do in the short term. Your thoughts?

With great pleasure, I am writing this post, part of a tradition that started for me in 2014. Posts starting with “The weekend after …” describe what happened during a PDT conference; later, the event merged with CIMdata’s, becoming THE PLM event for discussions beyond marketing.

For many of us, this conference was the first one in person since COVID-19 hit in 2020: a 3D (in-person) conference instead of a 2D (digital) one. With approximately 160 participants, this conference showed that we want to meet and network in person; the enthusiasm and interaction were great.

The conference’s theme, Digital Transformation and PLM – a call for PLM Professionals to redefine and re-position the benefits and value of PLM, was quite open.

There are many areas where digitization affects the way to implement a modern PLM Strategy.

Now some of my highlights from day one. I needed to filter to stay around a maximum of 1500 words. All the other sessions, including the sponsor vignettes, were informative too and increased the value of this conference.


Digital Skills Transformation – Often Forgotten Critical Element of Digital Transformation

Day 1 started traditionally with the keynote from Peter Bilello, CIMdata’s president and CEO. In recent conferences, Peter has focused on explaining CIMdata’s critical dozen (image below). If you are unfamiliar with them, there is a webinar on November 10 where you can learn more.

All twelve are equally important; it is not a sequence of priorities. This time, Peter spent more time on Organizational Change Management (OCM), number 12 of the critical dozen – or, as stated, the Achilles heel of digital transformation. Although we always mention that people are important, in our implementation projects they often seem to be the topic that gets the least focus.

We all agree on the statement: People, Process, Tools & Data. Often, the reality is that we start with the tools, try to build the processes and push the people into these processes. Is it a coincidence that even CIMdata puts Digital Skills Transformation at number 12? An unconscious bias?

This time, the people’s focus got full attention. Peter explained the need for a digital skills transformation framework to educate, guide and support people during a transformation. The concluding slide below says it all.


Transformation Journey and PLM & PDM Modernization to the Digital Future

The second keynote of the day was from Josef Schiöler, Head of Core Platform Area PLM/PDM from the Volvo Group. Josef and his team have a huge challenge as they are working on a foundation for the future of the Volvo Group.

The challenge is that it must provide the foundation for new business processes across the various group members, as the image below shows:


As Josef said, it is really the heart of the heart, crucial for the future. Peter Bilello referred to this project as open-heart surgery while the person is still active, as the current business must go on too.

The picture below gives an impression of the size of the operation.

And like any big transformation project, the Volvo Group has many questions to explore, as there is no existing blueprint to use.

To give you an impression:

  • How to manage complex documentation with existing and new technologies and solutions co-existing?
    (My take: the hybrid approach)
  • How to realize benefits and user adoption with user experience principles in mind?
    (My take: Understand the difference between a system of engagement and a system of record)
  • How to avoid seeing modernization as a pure IT initiative and ensure that end-user value creation is visible while still keeping a focus on finalizing the technology transformation?
    (My take: think hybrid and focus first on the new systems of engagement that can grow)
  • How to efficiently partner with software vendors to ensure vendor solutions fit well in the overall PLM/PDM enterprise landscape without heavy customization?
    (My take: push for standards and collaboration with other similar companies – they can influence a vendor)

Note: My takes are just a starting point of the conversation. There is a discussion in the PLM domain, which I described in my blog post: A new PLM paradigm.

 

The day before the conference, we had a half-day workshop initiated by SAAB and Eurostep, where we discussed the various angles of so-called Federated PLM.

I will return to that topic soon after some consolidation with the key members of that workshop.


Steering future Engineering Processes with System Lifecycle Management

Patrick Schäfer‘s presentation was different from what the title suggested. Patrick is the IT Architect Engineering IT at ThyssenKrupp Presta AG. The company provides steering systems for the automotive industry, which is transforming from mechanical systems towards autonomous driving, e-mobility, car-to-car connectivity, stricter safety, and environmental requirements.

The steering system is becoming a system depending on hardware and software. And as current users of Agile PLM, the old Eigner PLM software, you can feel Martin Eigner’s spirit in the project.

I briefly discussed Martin’s latest book on System Lifecycle Management in my blog post, The road to model-based and connected PLM (part 5).

Martin has always been fighting for a new term for modern PLM, and you can see how conservative we are – sometimes for good reasons.

Still, ThyssenKrupp Presta has the vision to implement a new environment to support systems instead of hardware products. In addition, they had to work fast to upgrade their current, almost obsolete PLM environment to a new, supported environment.

The wise path they chose was to focus first on a traditional upgrade, making sure their PLM legacy data became part of a modern (Teamcenter) PLM backbone. Meanwhile, they started exploring the connection between requirements management for products and software, as shown below.

From my perspective, I would characterize this implementation as the coordinated approach, creating a future option for the connected approach once the organization and future processes are more mature and known.

A good example of a pragmatic approach.


Digital Transformation in the Domain of Products and Plants at Siemens Energy

Per Soderberg, Head of Digital PLM at Siemens Energy, talked about their digital transformation project, which started 6 to 7 years ago. Knowing the world of gas and steam turbines, it is a domain where a lot of design and manufacturing information is managed in drawings.

The ultimate vision from Siemens Energy is to create an Industrial Metaverse for its solutions as the benefits are significant.

Is this target too ambitious, like GE’s 2014 Industrial Transformation with Predix? Time will tell, and I am sure you will soon hear more from Siemens Energy; therefore, I will keep it short. An interesting and ambitious program to follow.


Accelerating Digitalization at Stora Enso

Stora Enso is a Finnish company, a leading global provider of renewable solutions in packaging, biomaterials, wooden construction and paper. Their Director of Innovation Services, Kaisa Suutari, shared Stora Enso’s digital transformation program, which started six years ago with a 10 million/year budget (some people started dreaming too). Great to have a budget, but where to start?

In a very systematic manner, using an ideas funnel and always starting from the business need, they spent the budget along two paths, shown in the image below.

The interesting part was the upper path, which Kaisa focused on. Instead of starting with an analysis of how the problem could be addressed, they start by doing, then analyze the outcome and improve.

I am a great fan of this approach, as it significantly reduces the time to maturity. After all, how much time is often wasted on conducting the perfect analysis?

Their Digi Fund process is a fast process to go quickly from idea to concept, to POC and to pilot – the left side of the funnel. After a successful pilot, an implementation process starts small and scales up.

There were so many positive takeaways from this session. Start with an MVP (Minimum Viable Product) to create value from the start. Next, celebrate failure when it happens, as this is the moment you learn. Finally, continue to create measurable value driven by people – the picture below says it all.

It was the second time I was impressed by Stora Enso’s innovative approach. During PI PLMx 2020 London, Samuli Savo, Chief Digital Officer at Stora Enso, gave us insights into their innovation process. At that time, the focus was a little more on open innovation with startups. See my post: The weekend after PI PLMx London 2020. An interesting approach for other businesses to make their digital transformation business-driven and fun for the people.


A day-one summary

There was Kyle Hall, who talked about MoSSEC and the importance of this standard in a connected enterprise. MoSSEC (Modelling and Simulation information in a collaborative Systems Engineering Context) is the published ISO standard (ISO 10303-243) for improving the decision-making process for complex products. Standards are a regular topic at this conference; more about MoSSEC here.

There was Robert Rencher, Sr. Systems Engineer, Associate Technical Fellow at Boeing, talking about the progress the A&D action group is making related to the Digital Thread and Digital Twins, sometimes raising more questions than answers as they try to make sense of the marketing definitions and what they mean for their businesses. You can find their latest report here.

There was Samrat Chatterjee, Business Process Manager PLM at the ABB Process Automation division. Their businesses are already quite data-driven; however, by embedding PLM into the organization’s fabric, they aim to improve effectiveness, manage a broad portfolio, and be more modular and efficient.

The day was closed with a CEO Spotlight, moderated by Peter Bilello. This time, the CEOs did not come from the big PLM vendors but from complementary companies with their own unique value in the PLM domain. Henrik Reif Andersen, co-founder of Configit; Dr. Mattias Johansson, CEO of Eurostep; Helena Gutierrez, co-founder of Share PLM; Javier Garcia, CEO of The Reuse Company; and Karl Wachtel, CEO of XPLM, discussed their various perspectives on the PLM domain.

 

Conclusion

Already so much to say; sorry, I have reached the 1500-word target – you should have been there. Combined with the networking dinner after day one, it was a great start to the conference. Are you curious about day 2? Stay tuned, and your curiosity will be rewarded.

 

Thanks to Ewa Hutmacher, Sumanth Madala and Ashish Kulkarni, who shared their pictures of the event on LinkedIn. Clicking on their names will lead you to the relevant posts.

 

As human beings, we believe in the truth. We claim the truth. During my holiday in Greece, the question was, did the Greek Prime Minister tell the truth about the internal spy scandal?

In general, we can say politicians never speak the real truth, and some countries are trying to make sure there is only one single source of truth – their truth. The concept of a Single Source Of Truth (SSOT) is difficult to maintain in politics.

On social media like Twitter and Facebook, people claim their own truth. Unfortunately, without any scientific background, people believe they know better than professionals, cherry-picking messages and statistics or even claiming non-existing facts.

This is nicely described by the Dunning-Kruger effect. Unfortunately, this trend will not disappear.

If you want to learn more about the impact of social media, read this long article from The Atlantic:  Why the Past 10 Years of American Life Have Been Uniquely Stupid. Although the article is about the US, the content is valid for all countries where social media are still allowed.

The PLM and CM domain is the only place where people still rely on the truth defined by professionals. Manufacturing companies depend on reliable information to design, validate, manufacture and support their products. Compliance and safe products require an accurate and stable product definition based on approved information. Therefore, the concept of SSOT is crucial along the product lifecycle.

The importance may vary depending on the product type. The difference in complexity between an airplane and a plastic toy, for example. It is all about the risk and impact of a failure caused by the product.

During my holiday, the SSOT discussion was sparked on LinkedIn by Adam Keating, and the article starts with:

The “Single Source of Truth (SSOT)” wasn’t built for you. It was built for software vendors to get rich. Not a single company in the world has a proper SSOT.

A bit provocative, as there is nothing wrong with software vendors being profitable. Profitability guarantees long-term support of the software solution. Remember the PLM consolidation around 2006, when SmarTeam and MatrixOne (Dassault Systèmes), and Agile and Eigner & Partner (Oracle) were acquired, disappeared or switched to maintenance mode.

Therefore, it makes sense to have a profitable business model or perhaps a real open-source business model.

Still, the rest of the discussion was interesting, particularly the LinkedIn comments. Adam mentioned the Authoritative Source of Truth (ASOT) as the future. And although this concept is becoming more and more visible in the PLM domain, I believe we need both. So, let’s have a look at these concepts.

 

Truth 1.0 – SSOT

Historically, manufacturing companies stored the truth in documents, first paper-based, later in electronic file formats and databases.

The truth consists of drawings, part lists, specifications, and other types of information.

Moreover, the information is labeled with revisions and versions for identification.

By keeping track of the related information through documents or part lists with significant numbers, a person in the company could find the correct corresponding information at any stage of the lifecycle.

Later, by storing all the information in a central (PLM) system, the impression might be created that this system is the Single Source Of Truth – the system Adam Keating agitated against in his LinkedIn post.

For many companies, though, the ERP system has been the SSOT (and still is); all relevant engineering information was copied into the ERP system as attached files. Documents are the authoritative, legal pieces of information that a company shares with suppliers, authorities, or customers. They can reside in PLM but also in ERP. Therefore, you need an infrastructure to manage the “truth.”

Note: The Truth 1.0 story is very much a hardware story.

Even for hardware, ensuring a consistent single version of the truth for each product remains difficult. In theory, the design specifications should match the manufacturing definition. The reality, however, shows that this is often not the case. Issues discovered during the manufacturing process are fixed in the plant by redlining the drawing, which is not always processed back by engineering.

As a result, Engineering and Manufacturing might have a different version of what they consider the truth.

The challenge for a service engineer in the field is often to discover the real truth. So the “truth” might not always be in the expected place – no guaranteed Single Source Of Truth.

Configuration Management is a discipline connected to PLM to ensure that the truth is managed so that as-specified, as-manufactured, and as-delivered information is labeled and documented unambiguously. In other words, you could say Configuration Management (CM) is aiming for the Single Source Of Truth for a product.
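In data terms, you could picture CM’s job as comparing labeled baselines along the lifecycle. A minimal, hypothetical sketch (all identifiers and revisions are invented):

```python
# Hypothetical sketch: CM compares labeled baselines so that as-specified
# and as-manufactured information cannot silently diverge.

as_specified    = {"PUMP-42": "rev B", "SEAL-7": "rev A"}
as_manufactured = {"PUMP-42": "rev B", "SEAL-7": "rev C"}  # plant redline!

def deviations(spec: dict, actual: dict) -> list:
    """Report every item whose actual revision differs from the specification."""
    return [f"{item}: specified {rev}, built {actual.get(item)}"
            for item, rev in spec.items() if actual.get(item) != rev]

for line in deviations(as_specified, as_manufactured):
    print("DEVIATION:", line)   # SEAL-7: specified rev A, built rev C
```

The sketch shows exactly the redlining scenario above: the plant’s fix is visible only if both baselines are labeled and compared, which is what CM practices enforce.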

If you want to read more about the relation between PLM and CM, read this post: PLM and Configuration Management (CM), where I speak with Martijn Dullaart about the association between the two.

Martijn has his blog mdux.net and is the Lead Architect for Enterprise Configuration Management at our Dutch pride ASML. Martijn is also Chairperson I4.0 Committee IPX Congress.

Summarizing: the Single Source Of Truth 1.0 concept is document-based and relies on CM practices, which require skilled people and the right methodology. Some industries require Truth 1.0.

Others take the risk of working without solid CM practices; the PLM system might create the impression of an SSOT, but it will not be the case, even for hardware only.

Truth 2.0 – ASOT

Products have become more complex, mainly due to the combination of electronics and software. Their different lifecycles and speed of change are hard to maintain using the traditional PLM approach of an SSOT.

It will be impossible to maintain an SSOT, particularly if it is based on documents.

As CM is the discipline that ensures data consistency, it is important to look into the future of CM. At the end of last year, I discussed this topic with three CM thought leaders: Martijn Dullaart, Maxime Gravel and Lisa Fenwick, who shared what they believe will change. Read and listen here: The future of Configuration Management.


From the discussion, it became clear that managing all the details is impossible; still, you need an overarching baseline to identify the severity and impact of a change along the product lifecycle.

New methodologies can be developed for this, as reliable data can be used in algorithms to analyze a change impact. This brings us to the digital thread. According to the CIMdata definition used in the A&D digital twin phase 2 position paper:

The digital thread provides the ability for a business to have an Authoritative Source of Truth (ASOT), which is information available and connected in a core set of the enterprise systems across the lifecycle and supplier networks.

The definition implies that, in the end, a decision is made on data from the most reliable, connected source. There might be different data in other locations; however, this information is less reliable. Updating or fixing this information does not make sense, as the effort and cost of fixing it would be too high for no benefit.
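Read in implementation terms, the definition could translate into something like the sketch below: per attribute, the enterprise designates which connected source counts as authoritative, and decisions use that value even when stale copies exist elsewhere. The system names, flags and values are my own invention, purely for illustration.

```python
# Hypothetical sketch of the ASOT idea: several systems hold a copy of the
# same attribute, but only the designated authoritative source is used for
# decisions. Stale copies are tolerated, not "fixed" at any cost.

SOURCES = {
    "weight_kg": [
        ("PLM (released model)", 41.8, True),   # connected + authoritative
        ("ERP (copied field)",   41.5, False),  # stale copy, left as-is
        ("Spreadsheet",          40.0, False),  # least reliable
    ],
}

def authoritative_value(attribute: str):
    """Return the value from the source flagged as authoritative."""
    for source, value, is_authoritative in SOURCES[attribute]:
        if is_authoritative:
            return source, value
    raise LookupError(f"No authoritative source defined for {attribute}")

source, value = authoritative_value("weight_kg")
print(f"Decide on weight_kg = {value} (from: {source})")
```

Note the design choice embedded in the sketch: the stale copies are not synchronized or deleted, matching the observation above that fixing them brings cost without benefit.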

Obviously, we need reliable data to implement the various types of digital twins.

As I am intrigued by the power of the brain – its strengths and weaknesses – the concept of ASOT can also be found in our brains. Daniel Kahneman’s book Thinking, Fast and Slow talks about the two systems/modes our brain uses. The fast one (System 1 – low energy usage) could be the imaginary SSOT, whereas the slow one (System 2 – high energy required) is the ASOT. The brain needs both, and I believe the same holds in our PLM domain.

A new PLM Paradigm

In this context, there is a vivid discussion about the System of Record and Systems of Engagement. I wrote about it in June (post: A new PLM paradigm); other authors name it differently, but all express a similar concept. Have a look at these recent articles and statements from:

  • Authentise – The challenge of cross-discipline collaboration …….
  • Beyond PLM – When is the right time to change your PLM system + discussion
  • Colab – The Single Source Of Truth wasn’t built for you …….
  • Fraunhofer institute – Killing the PLM Monolith – the Emergence of cloud-native System Lifecycle Management (SysLM)
  • SAAB Group – Don’t mix the tenses. Managing the Present and the Future in an MBSE context
  • Yousef Hooshmand – From a Monolithic PLM Landscape to a Federated Domain and Data Mesh

If you want to learn more about these concepts and discuss them with some of the experts in this domain, come to the upcoming PLM Roadmap & PDT Europe conference on 18-19 October in Gothenburg, Sweden. Have a look at the final agenda here.

Register before September 12 to benefit from a 15% Early Bird discount, which you can spend on the dinner after day 1. I look forward to discussing the SSOT/ASOT topics there.


Conclusion

The Single Source Of Truth (SSOT) and the Authoritative Source of Truth (ASOT) are terms illustrating that the traditional PLM paradigm is changing thanks to digitization and connected stakeholders. The change is in the air; now, the experience has to come. So be part of the change and discuss with us.

 
