
Another year passed, and as usual, I took the time to look back. I always feel that things are going so much slower than expected. But that’s reality – there is always friction, and in particular, in the PLM domain, there is so much legacy we cannot leave behind.

It is better to plan what we can do in 2024 to be prepared for the next steps or, if we are lucky, even get the next steps in progress.

In this post, I will discuss four significant areas of attention (AI – DATA – PEOPLE – SUSTAINABILITY) in alphabetical order, not by priority.

Here are some initial thoughts. In the upcoming weeks I will elaborate further on them and look forward to your input.

 

AI (Artificial Intelligence)

Where would I be without talking about AI?

When you look at the image below, the Gartner Hype Cycle for AI in 2023, you see the potential coming on the left, with Generative AI at the peak.

Part of the hype comes from the availability of generative AI tools in the public domain, allowing everyone to play with them or use them. Some barriers are gone, but what does it mean? Many AI tools can make our lives easier, and there is certainly no threat if our job does not depend on standard practices.

 

AI and People

When I was teaching physics in high school, it was during the introduction of the pocket calculator, which replaced the slide rule. You needed to be skilled to use the slide rule; now there was a device that gave immediate answers. Was this bad for the pupils?

If you do not know the slide rule, it was an example of new technology replacing old tools, providing more time for other details. Click on the image or read more about the slide rule here on Wikipedia.

Or would you ask ChatGPT about the slide rule today? Does generative AI mean the end of Wikipedia? Or does generative AI need the common knowledge of sites like Wikipedia?

AI can empower people in legacy environments when working with disconnected systems. AI will be a threat to people and companies that rely on people and processes to bring information together without adding value. These activities will disappear soon, and you must consider using this innovative approach.

During the recent holiday period, there was an interesting discussion about why companies are reluctant to change and implement better solution concepts. Initially launched by Alex Bruskin here on LinkedIn, the debate spilled over into the topic of TECHNICAL DEBT, well addressed here by Lionel Grealou.

Both articles and the related discussions in the comments are recommended reading to follow and learn.

 

AI and Sustainability

Similar to the introduction of Bitcoin using blockchain technology, some people are warning about the vast energy consumption required for training and interacting with Large Language Models (LLMs), as Sasha Luccioni explains in her interesting TED talk when addressing sustainability.

She proposes that tech companies should be more transparent on this topic; the size and the type of the LLM matter, as the indicative picture below illustrates.

Carbon Emissions of LLMs compared

In addition, I found an interesting article discussing the pros and cons of AI related to Sustainability. The image below from the article Risks and Benefits of Large Language Models for the Environment illustrates nicely that we must start discussing and balancing these topics.

To conclude, in discussing AI related to sustainability, I see the significant advantage of using generative AI for ESG reporting.

ESG reporting is currently a very fragmented activity for organizations, based on (marketing) people’s goodwill, and currently, these reports are not always evidence-based.

 

Data

The transformation from a coordinated, document-driven enterprise towards a hybrid coordinated/connected enterprise using a data-driven approach became increasingly visible in 2023. I expect this transformation to grow faster in 2024 – the momentum is here.

We saw last year that the discussions related to Federated PLM nicely converged at the PLM Roadmap / PDT Europe conference in Paris. I shared most of the topics in this post: The week after PLM Roadmap / PDT Europe 2023. In addition, there is now the Heliple Federated PLM LinkedIn group with regular discussions planned.

In addition, if you read Jan Bosch’s reflection on 2023 here, he mentions (quote):

… 2023 was the year where many of the companies in the center became serious about the use of data. Whether it is historical analysis, high-frequency data collection during R&D, A/B testing or data pipelines, I notice a remarkable shift from a focus on software to a focus on data. The notion of data as a product, for now predominantly for internal use, is increasingly strong in the companies we work with

I am a big fan of Jan’s postings; coming from the software world, he describes the same issues that we have in the PLM world, except that he does not carry as much hardware legacy and can therefore act faster.

An interesting illustration of the slow pace toward a data-driven environment is the revival of the PLM and ERP integration discussion. Prof. Jörg Fischer and Martin Eigner contributed to the broader debate of a modern enterprise infrastructure, based not on systems (PLM, ERP, MES, etc.) but more on the flow of data through the lifecycle and an organization.

It is a great restart of the debate, showing we should care more about data semantics and the flow of information.

The articles The Future of PLM & ERP: Bridging the Gap – An Epic Battle of Opinions! and Is part master in PLM and ERP equal or not?, combined with the comments on these posts, are a must-read to follow this change towards a more connected flow of information.

While writing this post, Andreas Lindenthal expanded the discussion with his post: PLM and Configuration Management Best Practices: Part Traceability and Revisions. Again, thanks to data-driven approaches, there is extended support for the entire product lifecycle. Product Lifecycle Management, Configuration Management and AIM (Asset Information Management) have come together.

PLM and CM are more and more overlapping, as I discussed some time ago with Martijn Dullaart, Maxime Gravel and Lisa Fenwick in The future of Configuration Management. This topic will be “hot” in 2024.

 

People

From the people’s perspective towards AI, DATA and SUSTAINABILITY, there is a noticeable divide between generations.  Of course, for the sake of the article, I am generalizing, assuming most people do not like to change their habits or want to reprogram themselves.

Unfortunately, we have to adapt our skills as our environment is changing. Most of my generation was brought up with the single source of truth idea, documented and supported by scientific papers.

In my terminology, information processing takes place in our heads by combining all the information we learned or collected through documents/books/newspapers – the coordinated approach.

For people living in this mindset, AI can become a significant threat, as their brain is no longer needed to make a judgment, and they are not used to differentiating between facts and fake news, as they were never trained to do so.

The same is valid for practices like the model-based approach, working data-centric, or considering sustainability. It is not in the DNA of the older generations and, therefore, hard to change.

The older generation is mostly part of an organization’s higher management, so we are returning to the technical debt discussion.

Later generations that grew up as digital natives are used to almost real-time interaction, and when applied consistently in a digital enterprise, people will benefit from the information available to them in a rich context – in my terminology – the connected approach.

AI is a blessing for people living in this mindset as they do not need to use old-fashioned methods to acquire information.

“Let ChatGPT write my essay.”

However, their challenge could be what I would call “processing time”. Just because data is available does not necessarily mean it is the correct information. For that reason, it remains important to spend time digesting the impact of the information you are reading – don’t click “Like” based on the title; read the full article and then decide.

Experience is what you get, when you don’t get what you expect.

meaning you only become experienced if you learn from failures.

 

Sustainability

Unfortunately, sustainability is not only the last topic in alphabetical order; when you look at the image below, you see that discussions related to sustainability are in slight decline at C-level at the moment.

I share this observation in my engagements when discussing sustainability with the companies I interact with.

The PLM software and services providers are all on a trajectory of providing tools and an infrastructure to support a transition to a more circular economy and better traceability of materials and carbon emissions.

In the PLM Global Green Alliance, we talked with Aras, Autodesk, Dassault Systèmes, PTC, SAP, Sustaira, TTPSC (Green PLM) and more to come in 2024. The solution offerings in the PLM domain are available to start with; now it is up to the people and processes.

For sure, AI tools will help companies get a better understanding of their sustainability efforts. As mentioned before, AI could help companies understand their environmental impact and build more accurate ESG reports.

Next, being DATA-driven will be crucial.  As discussed during the latest PLM Roadmap/PDT Europe conference: The Need for a Governance Digital Thread.

And regarding PEOPLE, the good news is that younger generations want to take care of their future. They are in a position to choose the company to work for or influence companies by their consumer behavior. Unfortunately, climate disasters will remind us continuously in the upcoming decades that we are in a critical phase.

With the PLM Global Green Alliance, we strive to bring people together with a PLM mindset, sharing news and information on how to move forward to a sustainable future.

Mark Reisig (CIMdata – moderator for Sustainability & Energy) and Patrice Quencez (CIMPA – moderator for the Circular Economy) joined the PGGA last year and you will experience their inputs this year.

 

Conclusion

As you can see from this long post, there is so much to learn. The topics described are all current, and each topic requires education and experience (successes & failures) combined with an understanding of the technology concepts. Make sure you consider all of them, as focusing on a single topic will not make you move forward faster – they are all related. Please share your experiences this year – Happy New Year of Learning.

 

Last week, I shared my first impressions from my favorite conference, in the post: The weekend after PLM Roadmap/PDT Europe 2023, where most impressions could be classified as traditional PLM and model-based.

There is nothing wrong with conventional PLM, as there is still much to do within this scope. A model-based approach for MBSE (Model-Based Systems Engineering) and MBD (Model-Based Definition) and efficient supplier collaboration are not topics you solve by implementing a new system.

Ultimately, to have a business-sustainable PLM infrastructure, you need to structure your company internally and connect to the outside world with a focus on standards to avoid a vendor lock-in or a dead end.

In short, this is what I described so far in The weekend after… part 1.

Now, let’s look at the relatively new topics for this audience.

Enabling the Marketing, Engineering & Manufacturing Digital Thread

Cyril Bouillard, the PLM & CAD Tools Referent at the Mersen Electrical Protection (EP) business unit, shared his experience implementing an end-to-end digital backbone from marketing through engineering and manufacturing.

Cyril showed the benefits of a modern PLM infrastructure that is not CAD-centric and focused on engineering only. The advantage of this approach is a seamlessly integrated flow of PLM and PIM (Product Information Management).

I wrote about this topic in 2019: PLM and PIM – the complementary value in a digital enterprise. Combining the concepts of PLM and PIM in an integrated, connected environment could also provide a serious benefit when collaborating with external parties.

Another benefit Cyril demonstrated was the integration of RoHS compliance with the BOM in an integrated environment. In my session, I also addressed integrated RoHS compliance as a stepping stone to efficiency in future compliance needs.

Read more later or in this post:  Material Compliance – as a stepping-stone towards Life Cycle Assessment (LCA)

Cyril concluded with some lessons learned.

Data quality is essential in such an environment, and there are significant time savings implementing the connected Digital Thread.

 

Meeting the Challenges of Sustainability in Critical Transport Infrastructures

Etienne Pansart, head of digital engineering for construction at SYSTRA, explained how they address digital continuity with PLM throughout the built assets’ lifecycle.

Etienne’s story was related to the complexity of managing a railway infrastructure, which is a linear and vertical distribution at multiple scales; it needs to be predictable and under constant monitoring; it is a typical system-of-systems network, and on top of that, maintenance and operational conditions need to be kept up to date.

Regarding railway assets – a railway needs renewal every two years, bridges are designed to last a hundred years, and train stations should support everyday use.

When you complain about disturbances, you might now have a little more respect (depending on your country). However, on top of these challenges, Etienne also talked about the additional difficulties expected due to climate change: floods, fire, earth movements, and droughts, all of which will influence the availability of the rail infrastructure.

In that context, Etienne talked about the MINERVE project – see image below:

As you can see from the main challenges, there is an effort of digitalization for both the assets and a need to provide digital continuity over the entire asset lifecycle. This is not trivial in an environment with many different partners and suppliers, each delivering a part of the information.

Etienne explained in more detail how they aim to establish digital twins and MBSE practices to build and maintain a data-driven, model-based environment.

Having digital twins allows much more granular monitoring and more accurate design decisions, mainly related to sustainability, without the need to study the physical world.

His presentation was again a proof point that through digitalization and digital twins, the traditional worlds of Product Lifecycle Management and Asset Information Management become part of the same infrastructure.

And it may be clear that in such a collaboration environment, standards are crucial to connect the various stakeholders’ data sources – Etienne mentioned ISO 16739 (IFC), IFC Rail, and ISO 19650 (BIM) as obvious standards, but also ISO 10303 (PLCS) to support the digital thread leveraged by OSLC.

I am curious to learn more about the progress of such a challenging project – having worked with the high-speed railway project in the Netherlands in 1995 – no standards at that time (BIM did not exist) – mainly a location reference structure with documents. Nothing digital.

 

The connected Digital Thread

The theme of the conference was The Digital Thread in a Heterogeneous, Extended Enterprise Reality, and in the next section, I will zoom in on some of the inspiring sessions for the future, where collaboration or information sharing is all based on a connected Digital Thread – a term I will explain in more depth in my next blog post.

 

Transforming the PLM Landscape:
The Gateway to Business Transformation

Yousef Hooshmand‘s presentation was the highlight of this conference for me.

Yousef is the PLM Architect and Lead for the Modernization of the PLM Landscape at NIO, and he was previously active in the IT landscape transformation at Daimler, about which he published the paper: From a monolithic PLM landscape to a federated domain and data mesh.

If you read my blog or follow Share PLM, you might have seen the reference to Yousef’s work before, or, more recently, you can hear the full story at the Share PLM Podcast: Episode 6: Revolutionizing PLM: Insights.

It was the first time I met Yousef in 3D after several virtual meetings, and his passion for the topic made it hard to fit in the assigned 30 minutes.

There is so much to share on this topic, and part of it we already did before the conference in a half-day workshop related to Federated PLM (more on this in the following review).

First, Yousef started with the five steps of the business transformation at NIO, where long-term executive commitment is a must.

His statement: “If you don’t report directly to the board, your project is not important”, caused some discomfort in the audience.

As the image shows, a business transformation should start with a systematic description and analysis of which business values and objectives should be targeted, where they fit in the business and IT landscape, what the measures are and how they can be tracked or assessed, and ultimately, what we need as tools and technology.

In his paper From a Monolithic PLM Landscape to a Federated Domain and Data Mesh, Yousef described the targeted federated landscape in the image below.

And now some vendors might say, we have all these domains in our product portfolio (or we have slides for that) – so buy our software, and you are good.

And here Yousef added his essential message, illustrated by the image below.

Start by delivering the best user-centric solutions (in an MVP manner – days/weeks – not months/years). Next, be data-centric in all your choices and ultimately build an environment ready for change. As Yousef mentioned: “Make sure you own the data – people and tools can leave!”

And to conclude the report on his passionate plea for Federated PLM:

“Stop talking about the Single Source of Truth, start Thinking of the Nearest Source of Truth based on the Single Source of Change”.
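
To make this Single Source of Change idea more tangible, here is a minimal, hypothetical sketch (all names invented for illustration): each domain keeps its nearest source of truth as a local view and subscribes to change events published by the domain that owns the change. A real federated landscape would use messaging middleware and proper event schemas instead of this toy bus.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable

@dataclass
class ChangeEvent:
    part_id: str    # identifier of the changed part (illustrative)
    revision: str   # new revision label
    source: str     # the domain owning this change - the Single Source of Change

class ChangeBus:
    """Toy publish/subscribe bus standing in for real messaging middleware."""
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[ChangeEvent], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: ChangeEvent) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

bus = ChangeBus()

# Manufacturing keeps its nearest source of truth: a local view of released parts.
manufacturing_view = {}
bus.subscribe("part.revised",
              lambda e: manufacturing_view.update({e.part_id: e.revision}))

# Engineering owns part revisions and publishes the change once.
bus.publish("part.revised", ChangeEvent("P-1001", "B", "engineering"))
print(manufacturing_view)  # {'P-1001': 'B'} - manufacturing stays in sync
```

The point of the pattern is that consumers never query one central truth; they stay consistent by following the stream of changes coming from the owning domain.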

 

Heliple-2 PLM Federation:
A Call for Action & Contributions

A great follow-up on Yousef’s session was Erik Herzog‘s presentation about the final findings of the Heliple 2 project, where SAAB Aeronautics, together with Volvo, Eurostep, KTH, IBM and Lynxwork, is investigating a new way of federated PLM, using an OSLC-based, heterogeneous, linked product lifecycle environment.

Heliple stands for HEterogeneous LInked Product Lifecycle Environment

The image below, which I shared several times before, illustrates the mindset of the project.

Last year, during the previous conference in Gothenburg, Erik introduced the concept of federated PLM – read more in my post: The week after PLM Roadmap / PDT Europe 2022 – mentioning two open issues to be investigated: operational feasibility (is it maintainable over time?) and realisation effectivity (is it affordable and maintainable at a reasonable cost?).

As you can see from the slide below, the results were positive and encouraged SAAB to continue on this path.

One of the points to mention was that during this project, Lynxwork was used to speed up the development of the OSLC adapter, reducing costs, time and needed skills.

After this successful effort, Erik and several others who joined us at the pre-conference workshop agreed that this initiative is worth being tested, discussed and exposed outside Sweden.

Therefore, the Federated PLM Interest Group was launched to bring together people worldwide who want to contribute to this concept with their experiences and tools.

A first webinar from the group is already scheduled for December 12th at 4 PM CET – you can join and register here.

 

More to come

Given the length of this blog post, I want to stop here.

Topics to share in the next post are related to my contribution at the conference, The Need for a Governance Digital Thread, where I addressed the need for federated PLM capabilities, given the upcoming regulations and practices related to sustainability, which require a connected Digital Thread.

I want to combine this post with the findings that Mattias Johansson, CEO of Eurostep, shared in his session: Why a Digital Thread makes a lot of sense, goes beyond manufacturing, and should be standards-based.

There are some interesting findings in these two presentations.

And there was the introduction of AI at the conference, with some experts’ talks and thoughts. Perhaps at this stage, it is too high on Gartner’s hype cycle to go into details. It will surely be THE topic of discussion or interest, as you must have noticed.

The recent turmoil at OpenAI is an example of that. More to come for sure in the future.

 

Conclusion

The PLM Roadmap/PDT Europe conference was significant for me because I discovered that companies are working on concepts for a data-driven infrastructure for PLM and are (working on) implementing them. The end of monolithic PLM is visible, and companies need to learn to master data using ontologies, standards and connected digital threads.


During my summer holiday in my “remote” office, I had the chance to digest what I recently read, heard,  saw and discussed related to the future of PLM.

I noticed this and last year that many companies are discussing or working on their future PLM. It is time to make progress after COVID, particularly in digitization.

And as most companies are avoiding the risk of a “big bang”, they are exploring how they can improve their businesses in an evolutionary mode.

 

PLM is no longer a system

The most significant change I noticed in my discussions is the growing awareness that PLM is no longer covered by a single system.

More and more, PLM is considered a strategy, with which I fully agree. Therefore, implementing a PLM strategy requires holistic thinking and an infrastructure of different types of systems, where possible, digitally connected.

This trend is bad news for the PLM vendors as they continuously work on an end-to-end portfolio where every aspect of the PLM lifecycle is covered by one of their systems. The company’s IT department often supports the PLM vendors, as IT does not like a diverse landscape.

The main question is: “Every PLM Vendor has a rich portfolio on PowerPoint mentioning all phases of the product lifecycle. However, are these capabilities implementable in an economical and user-friendly manner by actual companies, or do PLM players need to change their strategy?”

A question I will try to answer in this post.

 

The future of PLM

I have discussed several observed changes related to the effects of digitization in my recent blog posts, referencing others who have studied these topics in their organizations.

Some of the posts to refresh your memory are:

To summarize what has been discussed in these posts, here are the main points:

The As Is:

  • The traditional PLM systems are examples of a System of Record, not designed to be end-user friendly but designed to have a traceable baseline for manufacturing, service and product compliance.
  • The traditional PLM systems are tuned to a mechanical product introduction and release process in a coordinated manner, with a focus on BOM governance.
  • The legacy information is stored in BOM structures and related specification files.

System of Record (ENOVIA image 2014)

The To Be:

  • We are not talking about a PLM system anymore; a traditional System of Record will be digitally connected to different Systems of Engagement / Domains / Products, which have their own optimized environment for real-time collaboration.
  • The BOM structures remain essential for the hardware part; however, overarching structures are needed to manage software and hardware releases for a product. These structures depend on connected datasets.
  • To support digital twins at the various lifecycle stages (design, manufacturing, operations), product data needs to be based on and consumed by models.
  • A future PLM infrastructure is hybrid, based on a Single Source of Change (SSoC) and an Authoritative Source of Truth (ASoT) instead of a Single Source of Truth (SSoT) – see the sketch after this list.
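
To illustrate the last two bullets, below is a minimal sketch (all identifiers hypothetical) of such an overarching structure: a product release does not absorb everything into one BOM but references connected datasets – a hardware BOM baseline in PDM and software builds in ALM – each mastered by its own system.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRef:
    """Reference to a dataset that lives in another system (System of Engagement)."""
    system: str  # e.g. "PDM" or "ALM" - where the data is mastered
    uri: str     # stable identifier of the dataset in that system

@dataclass
class ProductRelease:
    """Overarching structure: one release points to hardware and software baselines."""
    product_id: str
    release: str
    hardware_bom: DatasetRef
    software_builds: list[DatasetRef] = field(default_factory=list)

release = ProductRelease(
    product_id="X200",
    release="2024.1",
    hardware_bom=DatasetRef("PDM", "pdm://x200/ebom/rev-C"),
    software_builds=[DatasetRef("ALM", "alm://x200/firmware/3.4.1")],
)
```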

Various Systems of Engagement

 

Related podcasts

I re-listened to two podcasts before writing this post, and I think they are a must-listen.

The Peer Check podcast from CoLab, episode 17 – The State of PLM in 2022 w/ Oleg Shilovitsky. Adam and Oleg have a great discussion about the future of PLM.

Highlights: From System to Platform – the new normal. A Single Source of Truth doesn’t work anymore – it is about value streams. People in big companies fear making wrong PLM decisions, which is seen as a significant risk for your career.

There is no immediate need to change the current status quo.

The Share PLM Podcast – Episode 6: Revolutionizing PLM: Insights from Yousef Hooshmand.  Yousef talked with Helena and me about proven ways to migrate an old PLM landscape to a modern PLM/Business landscape.

Highlights: The term Single Source of Change and the existing concepts of a hybrid PLM infrastructure based on his experiences at Daimler and now at NIO. Yousef stresses the importance of having the vision and the executive support to execute.

The time of “big bangs” is over, and Yousef provided links to relevant content, which you can find here in the comments.

 

In addition, I want to point to the experiences provided by Erik Herzog in the Heliple project using OSLC interfaces as the “glue” to connect (in my terminology) the Systems of Engagement and the Systems of Record.

Conclusion of the Heliple-1 project
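
As a flavor of what this OSLC “glue” looks like, here is a minimal sketch using the Python rdflib library: a requirement in a System of Engagement is linked to a part revision in a System of Record by a typed RDF link, so no data is copied between the systems. The URIs are hypothetical, and I use the OSLC Requirements Management vocabulary as an example; a real OSLC setup exposes such resources over HTTP with service discovery.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS

OSLC_RM = Namespace("http://open-services.net/ns/rm#")  # OSLC Requirements Mgmt vocabulary

g = Graph()
g.bind("oslc_rm", OSLC_RM)
g.bind("dcterms", DCTERMS)

# Hypothetical resources in two different systems.
requirement = URIRef("https://soe.example.com/requirements/REQ-42")  # System of Engagement
part = URIRef("https://sor.example.com/parts/P-1001/rev-B")          # System of Record

g.add((requirement, DCTERMS.title, Literal("Max operating temperature 85 C")))
# The link itself is the data: only a reference travels, not a copy of the part.
g.add((requirement, OSLC_RM.elaboratedBy, part))

print(g.serialize(format="turtle"))
```

The design choice is that each system stays the master of its own resources; the federation only maintains the links between them.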

If you are interested in these concepts and want to learn and discuss them with your peers, more can be learned during the upcoming CIMdata PLM Roadmap / PDT Europe conference.

In particular, look at the agenda for day two if you are interested in this topic.

 

The future for the PLM vendors

If you look at the messaging of the current PLM Vendors, none of them is talking about this federated concept.

Their messaging is more focused on the transition from on-premises to the cloud, providing a SaaS offering of their portfolio.

I was slightly disappointed when I saw this article on Engineering.com provided by Autodesk: 5 PLM Best Practices from the Experiences of Autodesk and Its Customers.

The article is tool-centric, with statements that make sense and could be written by any PLM Vendor. However, Best Practice #1: Central Source of Truth Improves Productivity and Collaboration was the message that struck me. Collaboration comes from connecting people, not from the Single Source of Truth utopia.

I don’t believe PLM Vendors have to be afraid of rapidly losing their installed base of companies using their PLM as a System of Record. There is so much legacy stored in these systems that might still be relevant. The existence of legacy information, often documents, makes a migration or swap to another vendor almost impossible and unaffordable.

The System of Record is incompatible with data-driven PLM capabilities

I would like to see clearer developments from the PLM Vendors, creating a plug-and-play infrastructure for Systems of Engagement. Plug-and-play solutions could be based on a neutral partner collaboration hub like ShareAspace or the Systems of Engagement I discussed recently in my post and interview: The new side of PLM? Systems of Engagement!

Plug-and-play systems of engagement require interface standards, and PLM Vendors will only move in this direction if customers are pushing for that, and this is the chicken-and-egg discussion. And probably, their initiatives are too fragmented at the moment to come to a standard. However, don’t give up; keep building MVPs to learn and share.

Some people believe AI, with the examples we have seen with ChatGPT, will be the future direction without needing interface standards.

I am curious about your thoughts and experiences in that area and am willing to learn.

Talking about learning?

Besides reading posts and listening to podcasts, I also read an excellent book this summer. Martijn Dullaart, often participating in PLM and CM discussions, had decided to write a book based on the various discussions related to part (re-)identification (numbering, revisioning).

The book: The Essential Guide to Part Re-Identification: Unleash the Power of Interchangeability and Traceability (The Future of Configuration Management).

As Martijn starts in the preface:

“I decided to write this book because, in my search for more knowledge on the topics of Part Re-Identification, Interchangeability, and Traceability, I could only find bits and pieces but not a comprehensive work that helps fundamentally understand these topics”.

I believe the book should become standard literature for engineering schools that deal with PLM and CM, for software vendors and implementers, and, last but not least, for companies that want to improve or better clarify their change processes.

Martijn writes in an easily readable style and uses step-by-step examples to discuss the various options. There are even exercises at the end to use in a classroom or for your team to digest the content.

The good news is that the book is not about the past. You might also know Martijn for our joint discussion, The Future of Configuration Management, together with Maxime Gravel and Lisa Fenwick, on the impact of a model-based and data-driven approach to CM.

I plan to come back soon with a more dedicated discussion with Martijn. Meanwhile, start reading the book. Get your free chapter if needed by following the link at the bottom of this article.

I recommend buying the book as a paperback so you can navigate easily between the diagrams and the text.

Conclusion

The trend for federated PLM is becoming more and more visible as companies start implementing these concepts. The end of monolithic PLM is a threat and an opportunity for the existing PLM Vendors. Will they work towards an open plug-and-play future, or will they keep their portfolios closed? What do you think?

Last week I had the opportunity to discuss the topic of Systems of Engagement in the context of the more extensive PLM landscape.

I spoke with Andre Wegner from Authentise and their product Threads, MJ Smith from CoLab and Oleg Shilovitsky from OpenBOM.

I invited all three of them to discuss their background, their target customers, the significance of real-time collaboration outside discipline siloes, how they connect to existing PLM systems (Systems of Record), and finally, whether a company culture plays a role.

Listen to this almost 45 min discussion here  (save the m4a file first) or watch the discussion below on YouTube.

 

What I learned from this conversation

  • Systems of Engagement are bringing value to small enterprises but also as complementary systems to traditional PLM environments in larger companies.
  • Thanks to their SaaS approach, they are easy to install and use to fulfill a need that would take weeks/months to implement in a traditional PLM environment. They can be implemented at a department level or by connecting a value chain of people.
  • Due to their real-time collaboration capabilities, these systems provide fast and significant benefits.
  • Systems of Engagement represent the trend that companies want to move away from monolithic systems and focus on working with the correct data connected to the users. A topic I will explore in a future blog post.

I am curious to learn what you pick up from this conversation – are we missing other trends? Use the comments to this post.

 

Related to the company:
Visit Authentise.com

Related to the product:
Learn more about Collaborative Threads

Related to the reported benefits:
– Surgical robotics R&D team tracks 100% of their decisions and saves 150 hours in the first two weeks… doubling the effective size of their team:

Related to the company:
Visit Colabsoftware.com

Related to the product:
Raise the bar for your design conversations

Related to the reported benefits:
– How Mainspring used CoLab to achieve a 50% cost reduction redesign in half the time
– How Ford Pro Accelerated Time to Market by 30%

Related to the company:
Visit openbom.com

Related to the product:
Global Collaborative SaaS Platform For Industrial Companies

Related to reported benefits:
– OpenBOM makes the OKOS team 20% more efficient by helping to reduce inventory errors, costs, and streamlining supplier process
– VarTech Systems Optimizes Efficiency by Saving Two Hours of Engineering Time Daily with OpenBOM

 

Conclusion

I believe that Systems of Engagement are important for the digital transformation of a company.

They allow companies to learn what it means to work in a SaaS environment, potentially outside traditional company borders but with a focus on a specific value stream.

Thanks to their rapid deployment times, they help the company to grow its revenue even when the existing business is under threat due to newcomers.

The diagram below says it all. What are your favorite Systems of Engagement?

Hot from the press

Don’t miss the latest episode of the Share PLM podcast with Yousef Hooshmand – the discussion is very much connected to this topic.

In the past few weeks, together with Share PLM, we recorded and prepared a few podcasts to be published soon. As you might have noticed, for Season 2, our target is to discuss the human side of PLM and PLM best practices and less the technology side.  Meaning:

  • How to align and motivate people around a PLM initiative?
  • What are the best practices when running a PLM initiative?
  • What are the crucial skills you need to have as a PLM lead?

And as there are always many success stories to learn from on the internet, we also challenged our guests to share the moments where they got experience.

As the famous quote says:

Experience is what you get when you don’t get what you expect!

We recently published our episode with Antonio Casaschi from Assa Abloy, a Swedish company you might have never noticed, although their products and services are a part of your daily life.

It was a discussion close to my heart. We discussed the various aspects of PLM. What makes a person a PLM professional? And if you have no time to listen to these 35 minutes, read and scan the recording transcript on the transcription tab.

At 0:24:00, Antonio mentioned the concept of the Proof of Concept, as he had good experiences with them in the past. The remark triggered me to share some observations that a Proof of Concept (POC) is an old-fashioned way to drive change within organizations. Not discussed in this podcast, but based on my experience, companies have been using Proof of Concepts to buy time, as they were afraid to make a decision.

 

A POC to gain time?

 Company A

When working with a well-known company in 2014, I learned they were planning approximately ten POCs per year to explore new ways of working or new technologies. As the POCs were based on an annual time scheme, the evaluation at the end of the year was often very discouraging.

Most of the time, the conclusion was: “Interesting, we should explore this further” / “What are the next POCs for the upcoming year?”

There was no commitment to follow up; it was more of a learning exercise without consequences.

Company B

During one of the PDT events, a company presented its two-year POC with the three leading PLM vendors, exploring supplier collaboration. I understood the PLM vendors had invested much time and resources to support this POC, expecting a big deal. However, the team mentioned it was an interesting exercise, and they learned a lot about supplier collaboration.

And nothing happened afterward…

In 2019

At the 2019 Product Innovation Conference in London, when discussing Digital Transformation within the PLM domain, I shared in my conclusion that the POC was mainly a waste of time, as it does not push you to transform; it is an option to buy time, but it is uncommitted.

My main reason for not pushing a POC is that it is more of a limited feasibility study.

  • First, a POC often pushes people and processes into the technical capabilities of the systems used. A focus starting from technology is the opposite of what I have been advocating for a long time: first, focus on the value stream – people and processes – and then study which tools and technologies support these demands.
  • Second, the POC approach often blocks innovation, as the incumbent system providers will claim the desired capabilities will come (soon) within their systems – a safe bet.

 

The Minimum Viable Product approach (MVP)

With the awareness that we need to work differently and benefit from digital capabilities also came the term Minimum Viable Product or MVP.

The abbreviation MVP is not to be confused with the minimum valuable products or most valuable players.

There are two significant differences with the POC approach:

  • You admit the solution does not exist anywhere – so it cannot be purchased or copied.
  • You commit to the fact that this new approach is the right direction to take, and you agree that the lack of a perfect-fit solution should not block you from starting for real.

These two differences highlight the main challenges of digital transformation in the PLM domain. Digital Transformation is a learning process – it takes time for organizations to acquire and master the needed skills. And secondly, it cannot be a big bang, and I have often referred to the 2017 article from McKinsey: Toward an integrated technology operating model. Image below.

We will soon hear more about digital transformation within the PLM domain during the next episode of our SharePLM podcast. We spoke with Yousef Hooshmand, currently working for NIO, a Chinese multinational automobile manufacturer specializing in designing and developing electric vehicles, as their PLM data lead.

You might have discovered Yousef earlier when he published his paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”. It is highly recommended to read the paper if you are interested in a potential future PLM infrastructure. I wrote about this whitepaper in 2022 in A new PLM paradigm, discussing the upcoming Systems of Engagement on top of a Systems of Record infrastructure.

To align our terminology with Yousef’s wording, his domains align with the Systems of Engagement definition.

As we discovered and discussed with Yousef, technology is not the blocking issue to start. You must understand the target infrastructure well and where each domain’s activities fit. Yousef mentions that there is enough literature about this topic, and I can refer to the SAAB conference paper: Genesis -an Architectural Pattern for Federated PLM.

For a less academic impression, read my blog post, The week after PLM Roadmap / PDT Europe 2022, where I share the highlights of Erik Herzog’s presentation: Heterogeneous and Federated PLM – is it feasible?

There is much to learn, and we have to discover which standards will be relevant, as both Yousef and Erik mention the importance of standards.

The podcast with Yousef (soon to be found HERE) was not so much about organizational change management and people.

However, Yousef mentioned the most crucial success factor for the transformation project he supported at Daimler: C-level support, trust and understanding of the approach, knowing it will take many years – an unavoidable journey if you want to remain competitive.

 

And with the journey aspect comes the importance of the Minimum Viable Product. You are starting a journey with an end goal in mind (top of the mountain), and step by step (from base camp to base camp), people will be better supported in their day-to-day activities thanks to digitization.

A POC would not help you make the journey; perhaps a small POC could help you understand what it takes to cross a barrier.

 

Conclusion

The concept of POCs is outdated in a fast-changing environment where technology is not necessarily the blocking issue. Developing practices and new architectures and using the best-fit standards is the future. Embrace the Minimum Viable Product approach. Are you?

 

Two weeks ago, I shared my post: Modern PLM is (too) complex on LinkedIn, and apparently, it was a topic that touched many readers. Almost a hundred likes, fifty comments and six shares. Not the usual thing you would expect from a PLM blog post.

In addition, the article led to offline discussions with peers, giving me an even better understanding of what people think. Here is a summary of the various talks.

 

What is PLM?

In particular, since the inception of Product Lifecycle Management, software vendors have battled with the various PLM definitions.

Initially, PLM was considered an engineering tool for product development, with an extensive potential set of capabilities supported by PowerPoint. Most companies actually implemented a collaborative PDM system at that time and named it PLM.

Was PLM really understood? Look at the infamous Autodesk CEO Carl Bass’s anti-PLM rap from 2007. Next, in 2012, Autodesk introduced its PLM solution called Autodesk PLM 360 as one of the first cloud solutions.

Only with growing connectivity and enterprise information sharing did the definition of PLM start to change.

PLM became a product information backbone serving downstream deployment with product data – the traditional Teamcenter, Windchill and ENOVIA implementations are typical examples of this phase.

With a digitization effort taking place in the non-PLM domain, connecting product development, design and delivery data to a company’s digital business became necessary. You could say, and this is the CIMdata definition:

PLM is a strategic business approach that applies a consistent set of business solutions that support the collaborative creation, management, dissemination, and use of product definition information. PLM supports the extended enterprise (customers, design and supply partners, etc.)

I agree with this definition; perhaps 80 % of our PLM community does. But how many times have we been trapped again in the same thinking: PLM is a system.

The most recent example is the post from Oleg Shilovitsky last week where he claims: Discover why OpenBOM reigns supreme in the world of PLM! 

Nothing wrong with that, as software vendors will always tweak definitions as they need marketing to make a profit, but PLM is not a system.

My main point is that PLM is a “vague” community label with many interpretations. Software vendors have the most significant marketing budget to push their unique definitions. However, also various practitioners in the field have their interpretations.

And maybe Martin Haket’s comment to the post says it all (partly quote):

I’m a bit late to this discussion, but in my opinion, the complexity is mainly due to the fact that the ownership of the processes and data models underlying PLM are not properly organized. ‘Everybody’ in the company is allowed to mix in the discussion and have their opinion; legacy drives departments to undesirable requirements leading to complex implementations.

My intermediate conclusion: Our legacy and lack of a single definition of PLM make it complex.

 

The PLM professional

On LinkedIn, there are approximately 14,000 PLM consultants in my first and second levels of connections. This number indicates that the label “PLM Consultant” has a specific recognition.

During my “PLM is complex” discussion, I noticed Roger Tempest’s Professional PLM White paper and started the dialogue with him.

Roger Tempest is one of the co-founders of the PLM Interest Group. He has been trying to create a baseline for a foundational PLM certification with several others. We discussed the challenges of getting the PLM Professional recognized as an essential business role. Can we certify the PLM professional the same way as a certified Configuration Manager or certified Project Manager?

I shared my thoughts with Roger, claiming that our discipline is too vague and diverse and that finding a common baseline is hard.

Therefore, we are curious about your opinion too. Please tell us in the comments to this post what you think about recognizing the PLM professional and what skills should be the minimum. What are the basics of a PLM professional?

In addition, I participated in some of the SharePLM podcast recordings with PLM experts from the field (follow us here). I raised the PLM professional question either during the podcast or during the preparation of the after-party. Also, there was no single unique answer.

So much is part of PLM: people (culture, skills), processes & data, and tools & infrastructures (architectures, standards), combined with execution (waterfall/agile?).

My intermediate conclusion: The broadness of PLM makes it complex to have a common foundation.

 

More about complexity

PEOPLE: Let’s zoom in on the aspects of complexity, starting from the People, Processes, Data and Tools discussion. The first thing mentioned is “the people”; organizations usually claim: “the most important assets in our organization are the people”.

However, people are usually the last dimension considered in business changes. Companies start with the tools, try to build the optimal processes and finally push the people into that framework by training, incentives or just force.

The reason for this approach is that dealing with people is complex. People have their beliefs, their legacy and their motivation. And if people do not feel connected to the business (change), they will become an obstacle to change – look at the example below from my 2014 PI Apparel presentation:

To support the importance of people, I am excited to work with Share PLM and the Season 2 podcast series.

In these episodes, we talk with successful PLM experts about their lessons learned during PLM implementation. You will discover it is a learning process, and connecting to people in different cultures is essential. As it is a learning process, you will find it takes time and human skills to master this complexity.

Often human skills are called “soft skills”, but actually, they are “vital skills”!

 

PROCESSES: Regarding the processes part, this is another challenging topic. Often we try to simplify processes to make them workable (sounds like a good idea). With many seasoned PLM practitioners coming from the mechanical product development world, it is not a surprise that many proposed PLM processes are BOM-centric – building on PDM and ERP capabilities.

In my post: The rise and fall of the BOM? I started with this quote from Jan Bosch:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

Today’s organization and product complexity does not allow us to keep the processes simple to remain competitive. In that context, have a look at Erik Herzog’s comment on PLM complexity:

I believe a contributing factor to making PLM complex lies in our tendency to make too many simplifications. Do we understand a simple thing such as configuration change management in incremental development? At least in my organization, there is room for improvement.

In the comment, Erik also provided a link to his conference paper: Introducing the 4-Box Development Model, describing the potential interaction between Systems Engineering and Configuration Management. The topic might be too complex for your current company; however, it illustrates that you cannot generalize and simplify PLM overall.

In addition to Erik’s comments, I want to mention again that we can change our business processes thanks to a modern, connected, data-driven infrastructure. From coordinated to connected working with a mix of Systems of Engagement (new) and Systems of Record (traditional). There are no solid best practices yet, but the real PLM geeks are becoming visible.

TOOLS & DATA: When discussing the future: From Coordinated to Connected, there has always been a discussion about the legacy.

Should we migrate the legacy data and systems and replace them with new tools and data models? Or are there other options? The interaction of tools and data is often the domain of Enterprise Solution Architects. The Solution Architect’s role becomes increasingly important in a modern, data-driven company, and several of them are pretty active in PLM – if you know how to find them, because they are not in the PLM mainstream.

This week, we made a SharePLM podcast recording with Yousef Hooshmand. I wrote about his paper “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” last year, as Yousef describes the complex process, at that time working at Daimler, of slowly replacing the old legacy infrastructure with a new, modern, user/role-centric, data-driven infrastructure.

Watch out for this recording to be published soon as Yousef shares various provoking experiences. Not to provoke our community but to create the awareness that a transformation is possible when you have the right long-term vision, strategy and C-level support.

 

Fighting complexity

And then there are people trying to fight complexity by describing their best practices. There was the launch of Martijn Dullaart’s book: The essential guide to Part Re-Identification. Martijn mentioned that he took the time to write his book based on all our PLM and CM community interactions instead of writing a series of blog posts, which you can still find on his MDUX site. I plan to read this book this summer too and hopefully come back with Martijn and others in a discussion about the book.

Note: We have CM people involved in many of the PLM discussions. I think they are fighting complexity similar to others in the PLM domain. However, they have the benefit that their role, Configuration Manager, is recognized and supported by a commercial certification organization (the Institute of Process Excellence – IpX).

While completing this post, I read this article from Oleg Shilovitsky: PLM User Groups and Communities. At first glance, you might think that PLM User Groups and Communities might be the solution to address the complexity.

And I think they do; within most PLM vendors, there are orchestrated User Groups and Communities. Depending on your tool vendor, you will find like-minded people supported by vendor experts. Are they reducing the complexity? Probably not, as they come at the end of the People, Processes, Data and Tools discussion. You are already working within a specific boundary.

Based on my experience as a core PLM Global Green Alliance member, I think PLM-neutral communities are not viable. There is very little interaction in this community, currently with 686 members, although the topics are very relevant. Yes, people want to consume and learn, but making time available to share is, unfortunately, impossible when not financially motivated. Sharing opinions, yes, but working on topics: we are too busy.

 

Conclusion

The term PLM seems adequate to identify a group with a common interest (and skills?). Due to the broad scope and aspects, it is impossible to create a standard job description for the PLM professional, and we must learn to live with that – see my arguments.

What do you think?

Are you ready to discuss the complexity of PLM with your peers?

Last week I enjoyed visiting LiveWorx 2023 on behalf of the PLM Global Green Alliance. PTC had invited us to understand their sustainability ambitions and meet with the relevant people from PTC, partners, customers and several of my analyst friends. It felt like a reunion.

In addition, I used the opportunity to better understand their Velocity SaaS offering with OnShape and Arena. The almost four-day event, with approximately 5000 attendees, was massive and well-organized.

So many people were excited that this was again an in-person event after four years.

With PTC’s broad product portfolio, you could easily have a full agenda for the whole event, depending on your interests.

I was personally pleased to have a relatively full schedule focusing purely on Sustainability, leaving all these other beautiful end-to-end concepts for another time.

Here are some of my observations.

Jim Heppelmann’s keynote

The primary presentation of such an event is the keynote from PTC’s CEO. This session allows you to understand the company’s key focus areas.

My takeaways:

  • Need for Speed: Software-driven innovation, or as Jim said, Software is eating the BOM, reminding me of my recent blog post: The Rise and Fall of the BOM. Here Jim was referring to the integration with ALM (CodeBeamer) and IoT to have full traceability of products. However, including Software also requires agile ways of working.
  • Need for Speed: Agile ways of working – the OnShape and Arena offerings are examples of agile working methods. A SaaS solution is easy to extend with suppliers or other stakeholders. PTC calls this their Velocity offering, typical Systems of Engagement, and I spoke later with people working on this topic. More in the future.
  • Need for Speed: Model-based digital continuity – a theme I have discussed in my blog posts too. Here, Jim explained the interaction between Windchill and ServiceMax, both Systems of Record, for product definition and operation respectively.
  • Environmental Sustainability: introducing Catherine Kniker, PTC’s Chief Strategy and Sustainability Officer, announcing that PTC has committed to Science Based Targets, pledging near-term emissions reductions and long-term net-zero targets – see image below and more on Sustainability in the next section.
  • A further investment in a SaaS architecture, announcing CREO+ as a SaaS solution supporting dynamic multi-user collaboration (a System of Engagement)
  • A further investment in the partnership with Ansys fits the needs of a model-based future where modeling and simulation go hand in hand.

You can watch the full session  Path to the Future: Products in the Age of Transformation here.

 

Sustainability

The PGGA spoke with Dave Duncan and James Norman last year about PTC’s sustainability initiatives. Remember: PLM and Sustainability: talking with PTC. Therefore, Klaus Brettschneider and I were happy to meet Dave and James in person just before the event and align on understanding what’s coming at PTC.

We agreed there is no “sustainability super app”; it is more about providing an open, digital infrastructure to connect data sources at any time of the product lifecycle, supporting decision-making and analysis. It is all about reliable data.

 

Product Sustainability 101

On Tuesday, Dave Duncan gave a great introductory session, Product Sustainability 101, addressing Business Drivers and Technical Opportunities. Dave started by explaining the business context aiming at greenhouse gas (GHG) reduction based on science-based targets, describing the content of Scope 1, Scope 2 and Scope 3 emissions (Scope 1: direct emissions from a company’s own operations; Scope 2: emissions from purchased energy; Scope 3: emissions from the up- and downstream value chain).

The image above, which came back in several presentations later that week, nicely describes the mapping of lifecycle decisions and operations in the context of the GHG protocol.

 

Design for Sustainability (DfS)

On Wednesday, I started with a session moderated by James Norman titled Design for Sustainability: Harnessing Innovation for a Resilient Future. The panel consisted of Neil D’Souza (CEO Makersite), Tim Greiner (MD Pure Strategies), Francois Lamy (SVP Product Management PTC) and Asheen Phansey (Director ESG & Sustainability at PagerDuty). You can find the topic discussed below:

Some of the notes I took:

  • No specific PLM modules are needed; LCA needs to become an additional practice for companies, relying on a connected infrastructure.
  • Where to start? First, understand the current baseline based on data collection – what is your environmental impact? Next, decide where to start.
  • The importance of Design for Service – many companies design products for easy delivery, not for service. Being able to service products better will extend their lifetime, therefore reducing their environmental impact (manufacturing/decommissioning).
  • There is a value chain for carbon data. In addition, suppliers significantly impact reaching net zero, as many OEMs have an Assemble-To-Order process, and most of the emissions occur during part manufacturing – see the sketch after this list.
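
To illustrate that last point, here is a minimal sketch with invented numbers: a product’s cradle-to-gate baseline is dominated by the emission factors of its purchased parts (supplier / Scope 3 data), with the OEM’s own assembly energy on top.

```python
# Hypothetical supplier-provided emission factors per purchased part (kg CO2e).
part_emissions_kg_co2e = {
    "housing": 12.4,
    "motor": 48.0,
    "pcb": 21.7,
}

assembly_energy_kwh = 3.5      # OEM's own assembly energy (Scope 1/2), illustrative
grid_factor_kg_per_kwh = 0.4   # grid emission factor, illustrative

footprint = (sum(part_emissions_kg_co2e.values())
             + assembly_energy_kwh * grid_factor_kg_per_kwh)
print(f"Baseline product footprint: {footprint:.1f} kg CO2e")  # 83.5 kg CO2e
```

Even in this toy example, the purchased parts account for over 98% of the total, which is why supplier data quality drives the Scope 3 discussion.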

 

DfS: an example from Cummins

Next, on Wednesday, I attended the session from David Genter from Cummins, who presented their Design for Sustainability (DfS) project.

Dave started by sharing their 2030 sustainability goals:

  • On facilities and operations: a 50% reduction of GHG emissions, reducing water usage by 30%, waste by 25% and organic compound emissions by 50%
  • Reducing Scope 3 emissions for new products by 25%
  • In general, reducing Scope 3 emissions by 55M metric tons.

The benefits for products were documented using a standardized scorecard (example below) to ensure the benefits are real and not based on wishful thinking.

Many motivated people wanted to participate in the project, and the ultimate result demonstrated that DfS has business value both for Cummins and for the environment.

The project has been very well described in this whitepaper: How Cummins Made Changes to Optimize Product Designs for the Environment – a recommended case study to read.

 

Tangible Strategies for Improving Product Sustainability

The session was a dialogue between Catherine Kniker and Dave Duncan, discussing the strategies to move forward with Sustainability.

They reiterated the three areas where we as a PLM community can improve: material choice and usage, addressing energy emissions, and reducing waste. And it is worth addressing them all, as you can see below – it is not only about carbon reduction.

It was an informative dialogue going through the different aspects of where we, as an engineering/PLM community, can contribute. You can watch their full dialogue here: Tangible Strategies for Improving Product Sustainability.

 

Conclusion

It was encouraging to see that at an event like LiveWorx, you could learn about Sustainability and discuss it with the audience and PTC partners. And as I mentioned before, we need to learn to measure (data-driven/reliable data), and we need to be able to work in a connected infrastructure (digital thread) to allow design, simulation, validation and feedback to go hand in hand. It requires adapting a business strategy, not just a tactical solution. With the PLM Global Green Alliance, we are looking forward to following up on these topics.

NOTE: PTC covered the expenses associated with my participation in this event but did not in any way influence the content of this post – I made my tour fully independent through the conference and got encouraged by all the conversations I had.

 

This year started for me with a discussion related to federated PLM, a topic I highlighted as one of the imminent trends of 2022. It is relevant for PLM consultants and implementers; if your company is struggling with PLM, this topic might be hard to introduce.

Before going into the discussion’s topics and arguments, let’s first describe the historical context.

 

The traditional PLM frame.

Historically, PLM was first framed as a system for engineering to manage their product data – so you could call it PDM first. After that, PLM systems were introduced to provide access to product data, upstream and downstream. The most common usage was the relation with manufacturing, leading to EBOM and MBOM discussions.

The traditional ENOVIA PLM backbone

IT landscape simplification often led to an infrastructure of siloed solutions – PLM, ERP, CRM and later, MES. IT was driving the standardization of systems and defining interfaces between systems. System capabilities were leading, not the flow of information.

As many companies are still in this stage, I would call it PLM 1.0.

PLM 1.0 systems serve mainly as a System of Record for the organization, where disciplines consolidate their data in a given context: the Bills of Information. The Bill of Information is then the place to connect specification documents, i.e., CAD models, drawings and other related files, providing a Digital Thread – see the sketch below the image.

Aras – Bills of Information creating the Digital Thread
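To make the System of Record idea tangible, here is a minimal sketch of a Bill of Information: the part structure is the anchor, and released specification documents are attached to its nodes. All class and field names are illustrative assumptions, not taken from any specific PLM system:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    # A released specification document (CAD model, drawing, spec sheet)
    doc_id: str
    doc_type: str      # e.g. "CAD model" or "drawing"
    revision: str

@dataclass
class PartVersion:
    # One node in the Bill of Information: a part revision in a given context
    part_number: str
    revision: str
    documents: list[Document] = field(default_factory=list)
    children: list["PartVersion"] = field(default_factory=list)

def digital_thread(part: PartVersion, level: int = 0) -> None:
    """Traverse the BOM and print every connected document -
    the 'thread' from the structure to its specifications."""
    print("  " * level + f"{part.part_number} rev {part.revision}")
    for doc in part.documents:
        print("  " * level + f"  -> {doc.doc_type} {doc.doc_id} rev {doc.revision}")
    for child in part.children:
        digital_thread(child, level + 1)

root = PartVersion("ASSY-100", "A",
                   documents=[Document("D-001", "drawing", "B")],
                   children=[PartVersion("PART-200", "A",
                                         documents=[Document("CAD-200", "CAD model", "A")])])
digital_thread(root)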

The actual engineering work is done with specialized tools, MCAD/ECAD, CAE, Simulation, Planning tools and more. Therefore, each person could work in their discipline-specific environment and synchronize their data to the PLM system in a coordinated manner.

However, this interaction is not easy for some of the end-users. For example, the usability of CAD integrations with the PLM system is constantly debated.

Many of my implementation discussions with customers were in this context. For example, suppose your products are relatively simple, or your company is relatively small. In that case, the opinion is that the System of Record approach is overkill.

That’s why many small and medium enterprises do not see the value of a PLM backbone.

This may have been true until recently. However, the threats to this approach are digitization and regulations.

Customers, partners, and regulators all expect more accurate and fast responses on specific issues, preferably instantly. In addition, sustainability regulations might push your company to implement a System of Record.

 

PLM as a business strategy

For the past fifteen years, we have discussed PLM more as a business strategy, implemented with business systems and an infrastructure designed for sharing. I choose these words carefully to avoid overstating the expression: PLM as a business strategy.

The reason for this prudence is that, in reality, I have seen many PLM implementations fail due to the ambiguity of PLM as a system or strategy. Many enterprises have previously selected a preferred PLM Vendor solution as a starting point for their “PLM strategy”.

One of the most neglected best practices.

In reality, this means there was no strategy, but a hope that with this impressive set of product demos, the company would find a way to support its business needs. Instead of people, process and then tools to implement the strategy, most of the time it started with the tools, trying to implement the processes and transform the people. That is not really the definition of business transformation.

In my opinion, this is happening because, at the management level, decisions are made based on financials.

Developing a PLM-related business strategy requires management understanding and involvement at all levels of the organization.

This is often not the case; the middle management has to solve the connection between the strategy and the execution. By design, however, the middle management will not restructure the organization. By design, they will collect the inputs from the end users.

And it is clear what end users want – no disruption in their comfortable way of working.

Halfway conclusion:

Rebranding PLM as a business strategy has not really changed the way companies work. PLM systems remain a System of Record mainly for governance and traceability.

To understand the situation in your company, look at who is responsible for PLM.

  • If IT is responsible, then most likely, PLM is not considered a business strategy but more an infrastructure.
  • If engineering is responsible for PLM, then you are still in the early days of PLM – an engineering toolset to be consulted by others upstream or downstream.

Only when PLM accountability sits at the upper management level might it be a business strategy (assuming the upper management understands the details).

 

Connected is the game changer

Connecting all stakeholders in an engagement has been a game changer in the world. With the introduction of platforms and the smartphone as a connected device, consumers could suddenly benefit from direct responses to desired service requests (Spotify, iTunes, Uber, Amazon, Airbnb, Booking, Netflix, …).

The business change: connecting all stakeholders in real time to deliver results rapidly.

The question was: what would be the game changer in PLM? The image below shows Accenture’s 2014 description of digital PLM and its potential benefits.

 

Is connected PLM a utopia?

In 2015, Marc Halpern from Gartner shared the slide below, which you might have seen many times before. Digital transformation, it seems, really means moving from coordinated to connected technology.

The image below gives an impression of an evolution.

I had been following this concept until I was triggered by a 2017 McKinsey publication: “Toward an integrated technology operating model“.

This was the first notion for me that the future should be hybrid: a combination of traditional PLM (System of Record), complemented with teams that work digitally connected. McKinsey called them pods, which become product-centric (a multidisciplinary team focusing on a product) instead of discipline-centric (marketing/engineering/manufacturing/service).

In 2019 I wrote the post: The PLM migration dilemma supporting the “shocking” conclusion “Don’t think about migration when moving to data-driven, connected ways of working. You need both environments.”

One of the main arguments behind this conclusion was that legacy product data and processes were not designed to ensure data accuracy and quality at such a level that they could become connected data. As a result, converting documents into reliable datasets would be a costly, if not impossible, exercise with no real ROI.

The second argument was that the outside world, customers, regulatory bodies and other non-connected stakeholders still need documents as standardized deliverables.

The conclusion led to the image below.

Systems of Record (left) and Systems of Engagement (right)

 

Splitting PLM?

In 2021 these thoughts became more mature through various publications and players in the PLM domain.

We saw the rise of Systems of Engagement – I discussed OpenBOM, Colab and potentially Configit in the post: A new PLM paradigm. These systems can be characterized as connected solutions across the enterprise and value chain, focusing on a platform experience for the stakeholders.

These are all environments addressing the needs of a specific group of users as efficiently and as friendly as possible.

A System of Engagement will not fit naturally in a traditional PLM backbone, the System of Record.

That year, Erik Herzog (Saab Aerospace) and Yousef Hooshmand (at that time with Daimler) published papers related to “Federated PLM” or “the end of monolithic PLM”. They acknowledged that a company needs to focus on more than a single PLM solution. The presentation from Erik Herzog at the PLM Roadmap/PDT conference was interesting because Erik talked about the Systems of Engagement and the Systems of Record, and he proposed using OSLC as the standard to connect these two types of PLM.
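OSLC is, in essence, linked data over HTTP: every artifact in a System of Record or System of Engagement gets a URI and is described in RDF, so the two types of PLM can reference each other without replicating data. Below is a minimal sketch of what traversing such a link could look like, assuming a hypothetical server URL and using the requests and rdflib libraries; real OSLC implementations add authentication, service discovery and domain vocabularies:

```python
import requests
from rdflib import Graph, URIRef

# Hypothetical URI of a requirement managed in the System of Record
resource_uri = "https://plm.example.com/oslc/requirements/REQ-1234"

# OSLC resources are served as RDF via HTTP content negotiation
response = requests.get(resource_uri, headers={"Accept": "text/turtle"})
response.raise_for_status()

# Parse the RDF description and list the links to other lifecycle artifacts
graph = Graph()
graph.parse(data=response.text, format="turtle")

for subject, predicate, obj in graph:
    if isinstance(obj, URIRef):
        # Each URI-valued property is a traversable link in the digital thread
        print(f"{predicate} -> {obj}")
```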

It was a clear example of an attempt to combine the two kinds of PLM.

And here comes my question: Do we need to split PLM?

When I look at PLM implementations in the field, almost all are implemented as a System of Record, an information backbone provided by a single PLM vendor. The various disciplines deliver their content through interfaces to the backbone (the coordinated approach).

However, there is low usability or support for multidisciplinary collaboration; the PLM backbone is not designed for that.

Thanks to concepts of Model-Based Systems Engineering (MBSE) and Model-Based Definition (MBD), there are now solutions on the market that allow different disciplines to work jointly on connected datasets that can be manipulated using modeling software (1D, 2D, 3D, 4D, …).

These environments, often a mix of software and hardware tools, are the Systems of Engagement and provide speedy results with high quality in the virtual world. Digital Twins run on Systems of Engagement, not on Systems of Record.

Systems of Engagement do not need to come from the same vendor, as they serve different purposes. But how do you explain this to your management, who want simplicity? I can imagine the IT organization has a better understanding of this concept, as Gartner introduced the bimodal approach at the end of 2015.

Their definition:

Mode 1 is optimized for areas that are more well-understood. It focuses on exploiting what is known. This includes renovating the legacy environment so it is fit for a digital world. Mode 2 is exploratory, potentially experimenting to solve new problems. Mode 2 is optimized for areas of uncertainty. Mode 2 often works on initiatives that begin with a hypothesis that is tested and adapted during a process involving short iterations.

No Conclusion – but a question this time:

At the management level, unfortunately, there is most of the time still a “single PLM” mindset, due to a lack of understanding of the business. Clearly, splitting your PLM seems the way forward. IT could be ready for this, but will the business realize this opportunity?

What are your thoughts?

 

In the last weeks, I had several discussions related to sustainability. What can companies do to become sustainable and prove it? Unfortunately, there is so much greenwashing at this moment.

Look at this post: 10 Companies and Corporations Called Out For Greenwashing.

Therefore I thought about which practical steps a company should take to prepare for a sustainable future, as the change will not happen overnight. It reminds me of the path towards a digital, model-based enterprise (my other passion). In my post Why Model-Based definition is important for all, I mentioned that MBD (Model-Based Definition) could be considered the first stepping-stone toward a Model-Based enterprise.

The analogy for Material Compliance came after an Aras seminar I watched a month ago. The webinar How PLM Paves the Way for Sustainability with Insensia (an Aras implementer) demonstrates how material compliance is the first step toward sustainable product development.

Let’s understand why.

The first steps

Companies that currently deliver solutions mostly focus only on economic gains. The projects or products they sell need to be profitable and competitive, which makes sense if you want a future.

And this would not have changed if the awareness of climate impact had not become so apparent.

First, CFCs and hazardous materials led to new regulations. Next, global agreements to fight climate change – the Paris Agreement and more to come – have led and will lead to regulations that will change how products are developed. All companies will have to change their product development and delivery models when this becomes a global mandate.

The required change is likely going to happen. In Europe, the Green Deal is making steady progress. However, what will happen in the US remains a mystery, as even their Supreme Court has become a political entity acting against sustainability (money first).

Still, compliance with regulations will be required if a company wants to operate in a global market.

What is Material Compliance?

In 2002, the European Union published a directive to restrict hazardous substances in materials. The directive, known as RoHS (Restriction of Hazardous Substances), was mainly related to electronic components. In the first directive, six hazardous materials were restricted.

The most infamous are Cadmium (Cd), Lead (Pb), and Mercury (Hg). Since 2006, all products on the EU market must pass RoHS compliance, and in 2011, RoHS compliance was connected to the CE marking of products sold in the European market.

In 2015, four additional chemical substances were added, most of them used for softening PVC but also affecting the immune system. Meanwhile, other countries have introduced similar RoHS regulations; therefore, we can see it as a global restriction. Read more here: The RoHS guide.

Consumers buying RoHS-compliant products can now be assured that none of the threshold values of these substances is exceeded in the product. The challenge for the manufacturer is to go through each of the components of the MBOM to understand if it contains one of the ten restricted substances and, if yes, in which quantity.

Therefore, they need to get that information from each relevant supplier as a RoHS declaration.
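Once the substance data from these declarations is available as data (not as PDFs), the compliance check itself is a simple roll-up over the MBOM. Below is a minimal sketch in Python; the part numbers and declarations are made up, and the check is simplified to one declaration per part, whereas the directive applies its thresholds per homogeneous material. The limits themselves (0.1% by weight, 0.01% for cadmium) are the actual RoHS values.

```python
# RoHS limits, in weight ppm (0.1% = 1000 ppm; 0.01% = 100 ppm for cadmium).
# Simplified: the directive applies these limits per homogeneous material;
# here we check one aggregated declaration per part for illustration.
ROHS_LIMITS_PPM = {
    "Pb": 1000, "Hg": 1000, "Cd": 100, "Cr(VI)": 1000,
    "PBB": 1000, "PBDE": 1000,
    "DEHP": 1000, "BBP": 1000, "DBP": 1000, "DIBP": 1000,
}

# Illustrative MBOM: part number -> declared substance concentrations (ppm)
mbom_declarations = {
    "PCB-001": {"Pb": 20, "Cd": 5},
    "CONN-042": {"Pb": 1500},   # exceeds the lead limit
    "HOUSING-7": {},            # nothing restricted declared
}

def check_rohs(declarations):
    """Return all threshold violations found across the MBOM."""
    violations = []
    for part, substances in declarations.items():
        for substance, ppm in substances.items():
            limit = ROHS_LIMITS_PPM.get(substance)
            if limit is not None and ppm > limit:
                violations.append(
                    f"{part}: {substance} at {ppm} ppm exceeds the {limit} ppm limit"
                )
    return violations

for violation in check_rohs(mbom_declarations):
    print(violation)
```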

Besides RoHS, additional regulations protect the environment and the consumer. For example, REACH (Registration, Evaluation, Authorization and Restriction of Chemicals) compliance deals with the regulations created to improve the environment and protect human health. In addition, REACH addresses the risks associated with chemicals and promotes alternative methods for the hazard assessment of substances.

The compliance process in four steps

Material compliance is most of all the job of engineers. Therefore, around 2005, some of my customers started to add RoHS support to their PLM environment.

 

Step 1

The image below shows the simple implementation – the PDF form from the supplier was linked to the (M)BOM part.

An employee had to manually add the substances into a table and ensure the threshold values were not exceeded. But, of course, there was already a selection of preferred manufacturer parts during the engineering phase. Therefore, RoHS compliance was almost guaranteed when releasing the EBOM.

But this process could be done more cleverly.

 

Step 2

The next step was that manufacturers started to extend their PLM data model with additional attributes for RoHS compliance. Again, this could be done cleverly or in an extremely generic way, adding the attributes to all parts.

Now, when receiving the material declaration, a person just has to add the substance values to the part attributes. Then, through either standard functionality or customization, a compliance report can be generated for the (M)BOM. This already saves some work.

 

Step 3

The next step was to provide direct access to these attributes to the supplier and push the supplier to do the work.

Now the overhead for the manufacturer has been reduced again, because the supplier does the job directly for their customer.

 

Step 4

In step 4, we see a real connected environment, where information is stored only once, referenced by manufacturers, and kept actual by the part suppliers.

Who will host the RoHS databank? From some of my customer projects, I recall IHS as a data provider – it seems they are into this business when you look at their website HERE.
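In such a connected setup, the manufacturer no longer copies substance data into its own attributes but references it at the source. A minimal sketch of the pattern, where the endpoint URL and the response format are purely hypothetical stand-ins for a provider like IHS:

```python
import requests

# Hypothetical shared databank endpoint - illustrative only
DATABANK_URL = "https://compliance-databank.example.com/api/parts"

def get_declaration(manufacturer_part: str) -> dict:
    """Reference the supplier-maintained declaration instead of storing a copy.
    The data stays at the source and is kept current by the part supplier."""
    response = requests.get(f"{DATABANK_URL}/{manufacturer_part}/rohs")
    response.raise_for_status()
    return response.json()  # e.g. {"substances": {"Pb": 20}, "declared": "2024-01-15"}

# The MBOM only stores the reference (the part number), not the substance values
declaration = get_declaration("CONN-042")
print(declaration)
```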

 

Where is your company at this moment?

Having seen the four stepping-stones leading towards efficient RoHS compliance, you see the challenge of moving from a document-driven approach to a data-driven approach.

Now let’s look into the future. Concepts like Life Cycle Assessment (LCA) or a Digital Product Passport (DPP) will require a fully connected approach.

Where is your company at this moment – have you reached RoHS compliance step 3 or 4? It is a first step in learning to work connected and data-driven.

 

Life Cycle Assessment – the ultimate target

A lifecycle assessment, or lifecycle analysis (LCA in both cases), is a methodology to assess the environmental impact of a product (or solution) through its whole lifecycle: from materials sourcing, manufacturing, transportation, usage and service to decommissioning. And by assessing, we mean in a clear, verifiable, and shareable manner, not just guessing.
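In its simplest form, the assessment aggregates the impact of every lifecycle stage into one comparable figure. The sketch below uses made-up numbers for a single impact category; a real LCA follows ISO 14040/14044, uses verified impact factors and covers far more than carbon alone:

```python
# Illustrative cradle-to-grave carbon footprint per lifecycle stage (kg CO2e).
# Real assessments use verified impact factors (ISO 14040/14044) and
# cover more impact categories than carbon alone.
lifecycle_stages = {
    "materials sourcing": 120.0,
    "manufacturing":       45.0,
    "transportation":      15.0,
    "usage":              300.0,   # often dominant for energy-consuming products
    "service":             10.0,
    "decommissioning":      8.0,
}

total = sum(lifecycle_stages.values())
print(f"Total footprint: {total:.0f} kg CO2e")
for stage, kg in sorted(lifecycle_stages.items(), key=lambda item: -item[1]):
    print(f"  {stage:20s} {kg:7.1f} kg CO2e ({kg / total:5.1%})")
```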

Traditional engineering education does not provide these skills, although LCA is not new, as this 10-year-old YouTube movie from Autodesk illustrates:

What is new is the global understanding that we are reaching the limits of what our planet can endure; we must act now. Upcoming international regulations will enforce life cycle analysis reporting for manufacturers and service providers. This will happen gradually.

Meanwhile, we all should work on a circular economy, the major framework for a sustainable planet – click on the image on the left.

In my post, I wrote about these combined topics: SYSTEMS THINKING – a must-have skill in the 21st century.

 

Life Cycle Analysis – Digital Twin – Digitization

The big elephant in the room is that introducing LCA in your company has a lot to do with the digitization of your company. Assessment data stored in documents requires too much human effort to maintain at the right quality, and the costs are not affordable if your competitor is more efficient.

When it comes to the analysis part, a model-based, data-driven infrastructure is the most efficient way to run virtual analyses, using digital twin concepts at each stage of the product lifecycle.

Virtual models for design, manufacturing and operations allow your company to make trade-off studies at low cost before committing to the physical world. Up to 80% of the environmental impact of a product is determined by decisions made in the virtual world.

Once you have your digital twins for each phase of the product lifecycle, you can benchmark your models with data reported from the physical world. All these interactions can be found in the beautiful Boeing diamond below, which I discussed before – read A digital twin for everybody.
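Benchmarking can start very simply: compare what the virtual model predicted with what the physical world reports, and use the gap as a trigger to recalibrate. A sketch with illustrative numbers and a made-up 5% threshold:

```python
# Predicted energy use from the virtual model vs. measured field data (kWh/year)
predicted_kwh = 1200.0
measured_kwh = [1290.0, 1315.0, 1260.0]   # illustrative fleet feedback

average_measured = sum(measured_kwh) / len(measured_kwh)
gap = (average_measured - predicted_kwh) / predicted_kwh

print(f"Model deviates from the field by {gap:.1%}")
# A persistent gap is the trigger to recalibrate the model assumptions
if abs(gap) > 0.05:
    print("Recalibrate the virtual model before the next trade-off study")
```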

 

Conclusion

Efficient and sustainable life cycle assessment and analysis will come from connected information sources. The old document-driven paradigm is too costly and too slow to maintain, in particular with LCA, where the scope is not only a subset of your product but the full product and its full lifecycle. Another stepping-stone towards the near future. Where are you?

 

Stepping-stone 1: From Model-Based Definition to an efficient Model-Based, Data-driven Enterprise

Stepping-stone 2: From RoHS compliance to an efficient and sustainable Model-Based, Data-driven Enterprise

While preparing my presentation for the Dutch Model-Based Definition solutions event, I reflected on my experiences discussing Model-Based Definition, particularly in traditional industries. In the Aerospace & Defense and Automotive industries, Model-Based Definition has become the standard; however, other industries have big challenges adopting this approach. In this post, I want to share my observations and clarify its importance.

 

What is a Model-Based Definition?

The Wikipedia definition of Model-Based Definition is not bad:

Model-based definition (MBD), sometimes called digital product definition (DPD), is the practice of using 3D models (such as solid models, 3D PMI and associated metadata) within 3D CAD software to define (provide specifications for) individual components and product assemblies. The types of information included are geometric dimensioning and tolerancing (GD&T), component level materials, assembly level bills of materials, engineering configurations, design intent, etc.

By contrast, other methodologies have historically required the accompanying use of 2D engineering drawings to provide such details.

When I started to write about Model-Based Definition in 2016, the concept of a connected enterprise was not discussed. At that time, MBD mainly enhanced data sharing between engineering, manufacturing, and suppliers. The 3D PMI is a data package for information exchange between these stakeholders.

The main difference is that the 3D model is the main information carrier, linked to 2D manufacturing views and other relevant data, all combined in this package.

 

MBD – the benefits

There is no need to write a blog post related to the benefits of MBD. With some research, you find enough reasons. The most important benefits of MBD are:

  • the information is both human-readable and machine-readable, allowing the implementation of Smart Manufacturing / Industry 4.0 concepts (see the sketch after this list)
  • the information relies on processes and data and is no longer dependent on human interpretation. This leads to better quality and less late error-fixing in the process.
  • MBD information is a building block for the digital enterprise. If you cannot master this concept, forget the benefits of MBSE and Virtual Twins. These concepts don’t run on documents.
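To illustrate the machine-readable benefit mentioned above: once PMI travels as structured data instead of drawing annotations, downstream software can act on it without human interpretation. The sketch below is a simplified stand-in – real MBD packages carry PMI in standards such as STEP AP242, not in ad-hoc dictionaries:

```python
# Simplified stand-in for machine-readable PMI; real MBD packages
# carry this in standards such as STEP AP242.
pmi_annotations = [
    {"feature": "bore-A",  "type": "diameter", "nominal": 10.0, "tolerance": 0.01},
    {"feature": "face-B",  "type": "flatness", "nominal": 0.0,  "tolerance": 0.05},
    {"feature": "hole-C3", "type": "diameter", "nominal": 6.0,  "tolerance": 0.002},
]

# Because the tolerances are data, planning software can select processes
# automatically - no human interpretation of a 2D drawing required.
for pmi in pmi_annotations:
    process = "grinding" if pmi["tolerance"] < 0.005 else "standard machining"
    print(f'{pmi["feature"]}: {pmi["type"]} tol {pmi["tolerance"]} -> {process}')
```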

To help you discover the benefits of MBD described by others – have a look here:

 

MBD as a stepping stone to the future

When you are able to implement model-based definition practices in your organization and connect with your ecosystem, you learn what it means to work in a connected manner. Even where the scope is limited, you already discover that working connected is not the same as mandating everyone to work with the same systems or tools. Instead, it is about new ways of working (skills & people), combined with exchange standards (which ones to follow).

Where MBD is part of the bigger model-based enterprise, the same principles apply for connecting upstream information (Model-Based Systems Engineering) and downstream information (IoT-based operation and service models).

Oleg Shilovitsky addresses the same need from a data point of view in his recent blog: PLM Strategy For Post COVID Time. He makes an important point about the Digital Thread:

Digital Thread is one of my favorite topics because it is leading directly to the topic of connected data and services in global manufacturing networks.

I agree with that statement, as the digital thread is, like MBD, another stepping-stone to organize information in a connected manner, even beyond the scope of the engineering-manufacturing interaction. However, the Digital Thread is an intermediate step toward a fully data-driven and model-based enterprise.

To master all these new ways of working, it is crucial for the management of manufacturing companies, both OEMs and their suppliers, to initiate learning programs. Not as a Proof of Concept but as a real-life, growing activity.

Why is MBD not yet a common practice?

If you look at the success of MBD in Aerospace & Defense and Automotive, one of the main reasons was the push from the OEMs to align their suppliers. They even dictated CAD systems and versions to enable smooth and efficient collaboration.

In other industries, there were not so many giant OEMs that could dictate their supply chain. Often, the OEM was not even ready for MBD itself. Therefore, the excuse was often: we cannot push our suppliers to work differently, so let’s keep working as best as possible (the old way, with some automation).

Besides the technical changes, MBD also has a business impact. Where the traditional 2D drawing was the contractual and leading information carrier, now the annotated 3D model has to become the contractual agreement. This is much more complex than browsing through (paper) documents; now you need an application to open up the content and select the right view(s) or datasets.

In the interaction between engineering and manufacturing, you could hear statements like:

you can use the 3D Model for your NC programming, but be aware the 2D drawing is leading. We cannot guarantee consistency between them.

In particular, this is a business change affecting the relationship between an OEM and its suppliers. And we know business changes do not happen overnight.

Smaller suppliers might even refuse to work on a Model-Based definition, as it is considered an extra overhead they do not benefit from.

This is particularly the case when working with various OEMs that might each have their own preferred MBD package content, based on their preferred usage. There are standards; however, OEMs often push for their preferred proprietary format.

It is about an orchestrated change.

Implementing MBD in your company, like PLM, is challenging because people need to be aligned and trained on new ways of working. In particular, this creates resistance at the end-user level.

Similar to the introduction of mainstream CAD (AutoCAD in the eighties) and mainstream 3D CAD (Solidworks in the late nineties), it requires new processes, trained people, and matching tools.

This is not always on the agenda of C-level people, who try to avoid technical details (because they don’t understand them – read this great article: Technical Leadership: A Chronic Weakness in Engineering Enterprises).

I am mainly aware of learning materials coming from the US, not so much from European or Asian thought leaders. Feel free to add other relevant resources for the readers in this post’s comments. Have a look and talk with:

Action Engineering with their OSCAR initiative: Bringing MBD Within Reach. I spoke with Jennifer Herron, founder of Action Engineering, a year ago about MBD and OSCAR in my blog post: PLM and Model-Based Definition.

Another interesting company to follow is Capvidia. A good blog post to start with is MBD model-based definition in the 21st century.

The future

What you will discover from these two companies is that they focus on the connected flow of information between companies while anticipating that each stakeholder might have their preferred (traditional) PLM environment. It is about data federation.

The future of a connected enterprise is even more complex. So I was excited to see and download Yousef Hooshmand’s paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”.

Yousef and some of his colleagues report on their PLM modernization project at Mercedes-Benz AG, aiming to transform a monolithic PLM landscape into a federated Domain and Data Mesh.

This paper provides a lot of structured thinking related to the concepts I try to explain to my audience in everyday language. See my thoughts in The road to model-based and connected PLM.

This paper has much more depth and is a must-read and must-discuss writing for those interested – perhaps an opportunity for new startups and a threat to traditional PLM vendors.

Conclusion

Vellum drawings are almost gone now – we have electronic 2D drawings. Model-Based Definition has confirmed the benefits of improving the interaction between engineering, manufacturing & suppliers. Still, many industries are struggling with this approach due to the process & people changes needed. If you are not able or willing to implement a model-based definition approach, be worried about the future. Ecosystems will only run efficiently (and survive) when their information exchange is based on data and models. Start learning now.

p.s. just out of curiosity:
If you are a model-based advocate, support this post with a

 
