Again, a “The weekend after …” post related to my favorite event to which I have contributed since 2014.

Expectations were high this time from my side, in particular because we would have a serious discussion related to connected digital threads and federated PLM.

More about these topics in my post next week, as not all content is yet available for sharing.

The conference was sold out this time, and during the breaks, you had to navigate through the crowd to find your network opportunities. Also, the participation of the main PLM players as sponsors illustrated that everyone wanted to benefit from this opportunity to meet and learn from their industry peers.

Looking back to the conference, there were two noticeable streams.

  • The stream where people share their current PLM experiences; the A&D action groups moderated by CIMdata are traditionally part of this stream. This part I will cover in this post.
  • There were forward-looking presentations related to standards, ontologies, and federated PLM—all with an AI flavor. This part I will cover in my next post(s).

The connection between all these sessions was the Digital Thread. The conference’s theme was: The Digital Thread in a Heterogeneous, Extended Enterprise Reality. Let’s start the review with the highlights from the first stream.

Digital Thread: Why Should We Care?

As usual, Peter Bilello from CIMdata kicked off the conference by setting the scene. Peter started by clarifying the two definitions of the Digital Thread.

  • The first is a communication framework that allows a connected data flow and integrated view of an asset’s data (i.e., its Digital Twin) throughout its lifecycle across traditionally siloed functional perspectives.
    In my terminology, the connected digital thread.
  • The second is a network of connected information sources around the product lifecycle supporting traceability and decision-making.
    In my terminology, this is the coordinated digital thread, the most straightforward digital thread to achieve.

An example of the Coordinated Digital Thread

Peter recommends starting a digital thread at the beginning of product conceptualization, creating an environment where one can analyze the performance of the product portfolio, which product features and capabilities need to be planned, and how they perform in the field.

In addition, when defining the products, connect them with regulatory requirement databases, as these contain must-have requirements. I addressed this topic in my session too: besides the existing regulatory requirements, it is expected that in the upcoming years, due to environmental regulations, these requirements will increase, and it will be necessary to have them integrated with your digital thread.

Digital Threads require data governance and are the basis for the various digital twins. Peter discussed the multiple applications of the digital twin, primarily a relation between a virtual asset and a physical asset, except in the early concept phase.

The digital thread is still in the early phase of implementation at companies. A CIMdata survey showed that companies still focus primarily on implementing traditional PDM capabilities, although as the image above shows, there is a growing interest in short-term digital twin/thread implementations.

 

People, Process & Technology:
The Pillars of Digital Transformation Success

The second keynote was from Christine McMonagle, Director of Digital Engineering Systems at Textron Systems, a services and products supplier for the Aerospace and Defense industry. Christine leads the digital evolution at Textron Systems and presented nicely how a digital transformation should start from the people. Traditionally, this industry has enough budget at the OEM level, and therefore companies will not take a revolutionary approach when it comes to digital transformation.

Having your people at all levels involved and making them understand the need for change is crucial. Change does not happen top-down. You must educate people and understand what is possible and achievable to change – in the right direction. One of her concluding slides highlights the main points.

In the Q&A after Christine’s session, there was an interesting question related to the involvement of Human Resources (HR) in this project. There was a laugh that said it all – as in most companies, HR is not focusing on organizational change; they focus more on operational issues – the Human is considered a Resource.

Turn resistance into support

Between the regular sessions, there were short sessions in which the sponsors – Altium, Contact Software, Dassault Systemes, ESI, inensia, Modular Management, PTC, SAP, Share PLM and Sinequa – could pitch their value offering.

The Share PLM session, shortly after Christine’s presentation, was a nice continuation of the focus on people. I loved the Share PLM image to the left explaining why people do not engage with our dreams.

 

Learn how LEONI is achieving Digital Continuity in the Automotive Industry.

Tobias Bauer, head of Product Data Standardization at LEONI, talked about their FLOW project. FLOW is an acronym for Future Leoni Operating World. LEONI, well-known in the automotive industry, produces cable and network solutions, including cable harnesses.

Recently, LEONI went through a serious financial crisis and the need for restructuring, which always makes a “visionary” PLM project challenging. Tobias mentioned that after disappointing engagements with consultancy firms, they decided on a bottom-up approach to analyze existing processes using BPML. They agreed on a to-be state, fixing bottlenecks and streamlining the flow of information.

Tobias presented a smooth product data flow between their PLM system (PTC Windchill) and ERP (SAP S/4 HANA), clearly stating that the PLM system has become the controlled source of managing product changes.

Their key achievements reported so far were:

  • related to BOM creation and routing (approx. 10x faster – from 2-3 days to ¼ day),
  • better data consistency (fewer manual steps)
  • complete traceability between the systems with PLM as the change management backbone.

The last point I would call the coordinated Digital Thread. The image below shows their current IT landscape in a simplified manner.

This solution might seem obvious to neutral PLM academics or experts, but it is an achievement to do this in an environment with SAP implemented. The eBOM-mBOM discussion is one of the most frequently held discussions – sometimes a battle.

Often, companies use their IT systems first and listen to the vendor’s experts to build integrations instead of starting from the natural business flow of information.

 

Aerospace & Defense Action Groups Outcomes

As usual, several Aerospace & Defense (A&D) action groups reported their progress during this conference. The A&D action groups are facilitated by CIMdata, and per topic, various OEMs and suppliers in the A&D industry study and analyze a particular topic, often inviting software vendors to demonstrate and discuss their capabilities with them.

Their activities and reports can be found on the A&D PLM Action page here. In the remainder of this post, I will briefly share the ones presented. For a real deep dive into the topics, I recommend finding the proceedings per topic on the A&D action page.

 

The Promise and Reality of the Digital Thread

James Roche of CIMdata presented insights from industry research on The Promise and Reality of the Digital Thread. A total of 90 people completed an in-depth survey about the status and implementation of digital thread concepts in their companies. It is clear that the digital thread is still in its early days in this industry, and it is mainly about the coordinated digital thread. The image below reflects the highlights of the survey.

 

A&D Industry Digital Twin and Digital Thread Standards

Robert Rencher from Boeing explained the progress of their Digital Twin/Digital Thread project, where they had investigated the applicable standards to support a Digital Twin/Digital Thread (Phase 4 out of 7 currently planned). The image below shows that various standards may apply depending on business perspectives.

Their current findings are:

  • Digital twin standards overlap, which is most likely a function of standards bodies representing their respective standards as an ongoing development from a historical perspective.
  • The limited availability of mature digital twin/thread standards requires greater attention by standards organizations.
  • The concept of the digital twin continues to evolve. This dynamic will be a challenge to standards bodies.
  • The digital twin and the digital thread are distinct aspects of digital transformation. The corresponding digital twin and digital thread standards will be distinctly different.
  • Coordinating the development of the respective standards between the digital twin/thread is needed.
  • The digital twin’s organization, definition, and enablement depend on data and information provided by the digital thread.

 

Roadmap for Enabling Global Collaboration

Robert Gutwein (Pratt & Whitney Canada) and Agnes Gourillon-Jandot (Safran Aircraft Engines) reported their progress on the Global Collaboration project. Collaboration is challenging as exchange methods can vary, and so do the validation of exchanged information and the governance of information exchange in the context of IP protection.

One of the focal points was to introduce an approach to define standardized supplier agreements that anticipate modern model-based exchanges and collaboration methods.

Robert & Agnes presented the 8-step guideline in terms specific to the aerospace industry, explicitly mentioning that the ISO 44001 standard is generic for all industries. An impression of the eight steps and sub-steps can be found below:

The 8-step approach will be supported by a 3rd-party Collaboration Management System (CMS app), which is not mandatory but recommended for use. When an interaction depends on a specific tool, it cannot become an ISO standard. The purpose of the methodology and app is to assist participants in ensuring the collaboration between stakeholders contains all the necessary steps & people.

 

Model-based OEM/Supplier Collaboration Needs in Aviation Industry

Hartmut Hintze, working at Airbus Operations, presented the latest findings of the MBSE Data Interoperability working group and presented the model-based OEM/Supplier collaboration requirements and standards that need to be supported by the PLM/MBSE solution providers in the future. This collaboration goes beyond sharing CAD models, as you can see from the supplier engagement framework below:

As there are no standards-based tools, their first focus was looking into methodologies for model and behavior exchanges based on use cases. The use cases are then used to verify the state-of-the-art abilities of the various tools. At this moment, there is a focus on SysML V2 as a potential game-changer due to its new API support. As a relative novice on SysML, I cannot explain this topic in simpler words. I recommend that interested experts visit their presentations on the AD PAG publications page here.

 

Conclusions

The theme of the conference was related to the Digital Thread – and as you will discover, it is valid for everyone. Learn to see the difference between the coordinated Digital Thread and the connected Digital Thread. This time, there was a lot of information from the Aerospace and Defense Action Groups (AD PAG), which are a fundamental part of this conference. The A&D industry has always been leading in advanced PLM concepts. However, more advanced concepts will come in my next post, touching on the connected Digital Thread in the context of federated PLM and, let’s not forget, AI.


It might have been silent in our series of PLM and Sustainability interviews, where we, as PLM Green Global Alliance core team members, talk with software vendors, implementers and consultants about their relation to PLM and sustainability. The interviews are still at the stage of exploring what is happening at this moment. More details per vendor or service provider next year.

Our last interview was in April this year, when we spoke with Mark Reisig, Green Energy Practice Director & Executive Consultant at CIMdata. You can find the interview here; at that time, I mentioned that the good news is that sustainability is no longer a software discussion.

As companies are planning or pushed by regulations to implement sustainable strategies, it becomes clear that education and guidance are needed beyond the tools.

This trend is also noticeable in our PLM Green Global Alliance community, which has grown significantly in the past half year. While writing this post, we have 862 members, not all as active as we hoped. Still, there is more good news related to dedicated contributors and more to come in the next PGGA update.

This time, we want to share the interview with Erik Rieger and Rafał Witkowski, both working for Transition Technologies PSC, a global IT solution integrator in the PLM world known for their PTC implementation services.

I met them during the LiveWorx conference in Boston in May – you can read more about the conference in my post: The weekend after LiveWorx 2023. There, we decided to follow up on GreenPLM.

GreenPLM

The label “GreenPLM” is always challenging as it could be considered green-washing. However, in this case, GreenPLM is an additional software offering that can be implemented on top of a PLM system, enabling people to make scientifically informed decisions for a more sustainable, greener product.

For GreenPLM, Rafal’s and Erik’s experiences are based on implementing GreenPLM on top of the PTC Windchill suite. Listen to the 34-minute recording for an educative session.

You can download the slides shown in the recording here.

What I learned

  • It was more a general educative session on the relation between PLM and Sustainability, focusing on the importance of design decisions – the often-quoted figure that around 80 % of a product’s environmental impact is determined during the design phase.
  • Erik considers sustainability not a disruption for designers; they already work within cost, quality and time parameters. Now, sustainability is the fourth dimension to consider.
  • Erik’s opinion is also reflected in the pragmatic approach of GreenPLM as an additional extension of Windchill using PTC Navigate and OSLC standards.
  • GreenPLM is more design-oriented than Mendix-based Sustaira, a sustainability platform we discussed in this series – you can find the recording here.

Want to learn more?

Here are some links related to the topics discussed in our meeting:

Conclusions

With GreenPLM, it is clear that the focus of design for sustainability is changing from a vision (led by software vendors and environmental regulations) towards implementations in the field. Pragmatic and an extension of the current PLM infrastructure. System integrators like Transition Technologies are the required bridge between vision and realization. We are looking for more examples from the field.

Two more weeks to go – don’t miss this opportunity when you are in Europe.
Click on the image to see the full and interesting agenda.

 

Last week, I participated in the biannual NEM network meeting, this time hosted by Vestas in Ringkøbing (Denmark).

NEM (North European Modularization) is a network for industrial companies with a shared passion and drive for modular products and solutions.

NEM’s primary goal is to advance modular strategies by fostering collaboration, motivation, and mutual support among its diverse members.

During this two-day conference, there were approximately 80 attendees from around 15 companies, all with a serious interest and experience in modularity. The conference reminded me of the CIMdata Roadmap/PDT conferences, where most of the time a core group of experts meet to share their experiences and struggles.

The discussions are so different from those at a generic PLM or software vendor conference, where you only hear (marketing) success stories.

 

Modularity

When talking about modularity, many people will have Lego in mind, as with the Lego bricks, you can build all kinds of products without the need for special building blocks. In general, this is the concept of modularity.

With modularity, a company tries to reduce the amount of custom-made designs by dividing a product into modules with strict interfaces. Modularity aims to offer a wider variety of products to the customer – but configure these from a narrower assortment of modules to streamline manufacturing, sourcing and service. Modularity allows managing changes and new functionality within the modules without managing a new product.
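As a simple sketch of this idea (all module names and slots below are invented for illustration), a configurator can enumerate the customer-facing variants that a small, fixed assortment of modules with strict interfaces can produce:

```python
# Hypothetical sketch: configuring product variants from a fixed module assortment.
# Each slot accepts any module implementing its interface, so variants multiply
# without requiring new custom-made designs.
from itertools import product

# Module library: slot -> available modules (all sharing that slot's interface)
MODULES = {
    "motor": ["motor_1kW", "motor_2kW"],
    "housing": ["housing_steel", "housing_alu", "housing_plastic"],
    "control": ["control_basic", "control_smart"],
}

def all_variants(modules):
    """Enumerate every product variant as a combination of one module per slot."""
    slots = sorted(modules)
    return [dict(zip(slots, combo)) for combo in product(*(modules[s] for s in slots))]

variants = all_variants(MODULES)
# 2 motors x 3 housings x 2 controls = 12 customer-facing variants,
# built from an assortment of only 7 engineered modules.
print(len(variants))  # -> 12
```

The key point of the sketch: the number of variants grows multiplicatively while the engineering effort grows only with the number of modules, provided the interfaces between the slots stay stable.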

From ETO (Engineering To Order) to BTO (Build To Order) or even CTO (Configure to Order) is a statement often heard when companies are investing in a new PLM system. The idea is that with the CTO model, you reduce the engineering costs and risks for new orders.

With modularity, you can address more variants and options without investing in additional engineering efforts.

How the PLM system supports modularity is an often-heard question. How do you best manage options and variants? The main issue here is that modularity is often considered an R&D effort – R&D must build the modular architecture. An R&D-only focus is a common mistake in the field, similar to PLM.

Both PLM and Modularity suffer from the framing that they are about R&D and their tools, whereas in reality, PLM and Modularity are strategies concerning all departments in an enterprise, from sales & marketing, engineering, and manufacturing to customer service.

 

PLM and Modularity

In 2021, I discussed the topic of Modularity with Björn Eriksson & Daniel Strandhammar, who had written their easy-to-read book, The Modular Way, during the COVID-19 pandemic. In a blog post, PLM and Modularity, I discussed the touchpoints with PLM with Daniel. A little later, we had a Zoom discussion with Björn and Daniel, together with some of the readers of the book. You can still find the info here: The Modular Way – a follow-up discussion.

What was clear to me at that time is that Sweden, in particular, is a leading country when it comes to Modularity. Companies like Scania and Electrolux are known for their product modularity.

For me, it was great to learn about the Vestas modularization journey. For sure, the Scandinavian region sets the tone. In addition, there are LEGO and IKEA, also famous Scandinavian companies, but with other modularity concepts.

The exciting part of the conference was that all the significant modularity players were present. Hosted by Vestas and with a keynote speech from Leif Östling, a former CEO of Scania, all the ingredients were there for an excellent conference.

 

The NEM network

The conference started with Christian Eskildsen, CEO of the NEM organization, who has a long history of leading modularity at Electrolux. The NEM is not only a facilitator for modularity. They also conduct training, certification sessions, and coaching on various levels, as shown below.

Christian mentioned that there are around 400 followers on the NEM LinkedIn group. I can recommend this LinkedIn group as the group shares their activities here.

At this moment, you can find here the results of Workstream 7 –  The Cost of Complexity.

Peter Greiner, NEM member, presented the details of this result during the conference on day 2. The conclusion of the workstream team was a preliminary estimate suggesting a minimum cost reduction of 2-5% in terms of the Cost Of Goods Sold (COGS) on top of traditional modularization savings. These estimates are based on real-world cases.

Understanding that the benefits relate to the COGS, where the actual material costs are a major contributor, a 2 – 5 % range is significant. There is the intention to dig deeper into this topic.

Besides this workstream, there are other workstreams running or finished. The ones that interest me in the sustainability context are Workstream 1, Modular & Circular, and Workstream 10, Modular PLM (Digital Thread).

The NEM network has an active group of members, making it an exciting network to follow and contribute to, as modularity is part of a sustainable future. More on this statement later.

Vestas

The main part of day one was organized by our host, Vestas. Jens Demtröder, Chief Engineer at Vestas for the Modular Turbine Architecture and NEM board member, first introduced the business scope, complexity, and later the future challenges that Vestas is dealing with.

First, wind energy is the most cost-competitive source for a green energy system when taking the full environmental impact into the equation, as the image below shows.

From the outside, wind turbines all look the same; perhaps a difference between on-shore and off-shore? No way! There is a substantial evolution in the size and control of the wind turbine, and even more importantly, as the image shows, each country has its own regulations to certify a wind turbine. Vestas has to comply with 80+ different local regulations, and for that reason, modularity is vital to manage all the different demands efficiently.

A big challenge for the future will be the transport and installation of wind turbines.

The components become so big that they need to be assembled on-site, requiring new constraints on the structure to be solved.

As the image to the left shows, rotor sizes up to 250 m are expected – and what about the transport of the nacelle itself?

Click on this link to get an impression.

The audience also participated in a (windy) walk through the manufacturing site to get an impression of the processes & components – an impression below.

Processes, organization and governance

Karl Axel Petursson, Senior Specialist in Architecture and Roadmap, gave insights into the processes, organization and governance needed for the modularity approach at Vestas.

The modularization efforts are always a balance between strategy and execution, where often execution wins. The focus on execution is a claim that I recognize when discussing modularity with the companies I am coaching.

Vestas also created an organization structured around the functions it provides, following Conway’s law (a system’s design mirrors the organization’s communication structure), as the image below shows:

With modularity, you will also realize that the modular architecture must rely on stable interfaces between the modules based on clear market needs.

Besides an organizational structure, often more and more a matrix organization, there are also additional roles to set up and maintain a modular approach. As the image below indicates, to integrate all the functions, there are various roles in Vestas, some specialized and some more holistic:

These roles are crucial when implementing and maintaining modularity in your organization. It is not just the job of a clever R&D team.

Relying on just a clever R&D team is a misconception I have often discovered in the field: buying one or more tools that support modularity and then letting brilliant engineers do the work. And this is a challenge. Engineers often do not like to be constrained by modular constraints when designing a new capability or feature.

For this reason, Vestas has established an Organizational Change Management initiative called Modular Minds to make engineers flourish in the organization.

Modular Minds

Madhuri Srinivasan, Systems Engineering specialist, and Hanh Le, Business Transformation leader, both at Vestas, presented their approach to the 2020 must-win battle for Modularisation, aiming with various means, like blogs, podcasts, etc., to educate the organization and create Modular Minds among all Vestas employees.

 

The team is applying the ADKAR model from Prosci to support this change. As you can see from the (clickable) image to the left, ADKAR is the abbreviation of Awareness, Desire, Knowledge, Ability and Reinforcement.

The ADKAR model focuses on driving change at the individual level and achieving organizational results. It is great to see such an approach applied to Modularity, and it would also be valuable in the domain of PLM, as I discussed with Share PLM in my network.

Scania

The 1 ½ hour keynote speech from Leif Östling, supported by Karl-Johan Linghede, was more of an interactive discussion with the audience than a speech. Leif took us through the origins of Scania, their early collaboration learning the Toyota Way – customer first, respect for people and focus on quality – and the initial research and development together with Modular Management, resulting in the MFD methodology.

It led to the understanding that:

  • The #1 cost driver is the number of parts you need to manage,
  • The #2 crucial point is to have standardized interfaces and keep the flexibility inside the modules.

The Scania way

Scania partnered early on with Ericsson to work on the connected vehicle. If you are my age, you will remember that connectivity at that time was not easy. The connected vehicle was the first step of what we would now call a digital twin.

An interesting topic discussed was that Scania has approximately 25 interfaces at Change Level 1. This is a C-level/Executive discussion to approve potential interface changes. This level shows the commitment of the organization to keep modularity operational.

Another benefit mentioned was that the move to electrification of the vehicle was not such a significant change as in many automotive companies. Thanks to the modular structure and the well-defined interfaces, creating an electric truck was not a complete change of the truck design.

The session with Leif and Karl-Johan could have easily taken longer, given the interesting question-and-answer dialogue with the curious audience. It was a great learning moment.

 

Digitization, Sustainability & Modularization

As a PLM person from the PLM Green Global Alliance, I was allowed to give a speech about the winning combination of Digitization, Sustainability and Modularization. You might have seen my PLM and Sustainability blog post recently; now, a zoom-in on the circular economy and modularity is included.

In this conference, I also focused on how Modularity, when implemented based on model-based and data-driven approaches, is a crucial component of the circular economy (image below), enabling a lifecycle analysis per module when defined model-based (Digital Twin).

My entire presentation on SlideShare: Digitization, Sustainability & Modularization.

Conclusion

It was the first time I attended a conference focused purely on modularity, and I realized we are all fighting the same battle. Like PLM, which is a strategy and not an engineering system, modularity faces the same challenge: it is a strategy and not an R&D mission. It would be great to see modularity becoming a part of PLM conferences or Circular Economy events, as there is so much to learn from each other – and we need them all.

 

Are you interested in the future of PLM and the meaning of Digital Threads?

Click on the image to see the agenda and join us for 2 days of discussion & learning.


During May and June, I wrote a guest chapter for the next edition of  John Stark’s book Product Lifecycle Management (Volume 2): The Devil is in the Details.

The book is considered a standard in the academic world when studying aspects of PLM.

Looking into the table of contents through the above link shows that understanding PLM in its full scope is a broad undertaking. I wrote about it recently: PLM is Complex (and we have to accept it?), and Roger Tempest and others are still fighting to get the job of PLM Professional recognized: Associate Yourself With Professional PLM.

To make the scope broader, John invited me to write a chapter about PLM and Sustainability, which is a topical subject in many organizations. As sustainability is my dedicated topic in the PLM Global Green Alliance (PGGA) core team, I was happy to accept this challenge.

This activity is challenging because writing a chapter on a current topic might make it outdated soon. For the same reason, I never wanted to write a PLM book as I wrote in my 2014 post: Did you notice PLM is changing?

The book, with the additional chapter, will be available later this year. I want to share with you in this post the topics I addressed in this chapter. Perhaps relevant for your organization or personal interests. Also, I am looking forward to learning if I missed any topics.

 

Introduction

The chapter starts with defining the context. PLM is considered a strategy supported by a connected IT infrastructure, and for the definition of sustainability, I refer to the relevant SDGs as described on our PGGA theme page: PLM and Sustainability

Next, I discuss two major concepts indissolubly connected with sustainability.

 

The Circular Economy

On a planet with limited resources and a still-growing consumption of raw materials, we need to follow the concepts of the circular economy in our businesses and lives. The circular economy section mainly addresses the hardware side of the butterfly diagram, as this is where PLM practices have the most significant impact.

The circular economy requires collaboration among various stakeholders, including businesses, governments and consumers. It involves rethinking production processes and establishing new consumption patterns. Policies and regulations will push for circular economy patterns, as seen in the following paragraphs.

 

Systems Thinking

A significant change in bringing products to the market will be the need to change how we look at our development processes. Historically, many of these processes were linear and focused only on time to market, cost and quality. Now, we have to look into other dimensions, like environmental impact, usage and impact on the planet. As I wrote in the past: Systems Thinking – a must-have skill in the 21st century?

Systems Thinking is a cognitive approach that emphasizes understanding complex problems by considering interconnections, feedback loops, and emergent properties. It provides a holistic perspective and explores multiple viewpoints.

Systems Thinking guides problem-solving and decision-making and requires you to treat a solution with a mindset of a system interacting with other systems.

 

Regulations

More sustainable products and services will be driven primarily by existing and upcoming regulations. In this section, I refer to the success of the CFC (chlorofluorocarbon) emission reduction, slowly fixing the hole in the ozone layer. Current regulations like WEEE, RoHS and REACH are already relevant for many companies, and compliance with these regulations is a good exercise for the more stringent regulations related to carbon emissions and the upcoming Digital Product Passport.

Making regulatory compliance a part of the concept phase ensures no late changes are needed to become compliant, saving time and costs. In addition, supporting regulatory compliance as much as possible with a data-driven approach reduces the overhead required to prove compliance. Both topics are part of a PLM strategy.

In this context, see Lionel Grealou’s article 5 Brand Value Benefits at the Intersection of Sustainability and Product Compliance. The article has also been shared in our PGGA LinkedIn group.

 

Business

On the business side, the Greenhouse Gas Protocol is explained: how companies will have to report their Scope 1 and Scope 2 emissions and, ultimately, Scope 3 – see the image below for the details.

GHG reporting will support companies, investors and consumers to decide where to prioritize and put their money.
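A minimal sketch of the three reporting categories may help here. The class and all figures below are my own illustration, not taken from the GHG Protocol itself:

```python
# Hypothetical sketch of the GHG Protocol reporting categories.
# Scope 1: direct emissions; Scope 2: purchased energy; Scope 3: value chain.
from dataclasses import dataclass

@dataclass
class EmissionsReport:
    scope1_tco2e: float  # direct emissions (own facilities, fleet)
    scope2_tco2e: float  # purchased electricity, steam, heating, cooling
    scope3_tco2e: float  # upstream/downstream value chain (suppliers, product use)

    def total(self) -> float:
        return self.scope1_tco2e + self.scope2_tco2e + self.scope3_tco2e

# Invented example figures in tonnes of CO2 equivalent
report = EmissionsReport(scope1_tco2e=1_200, scope2_tco2e=3_500, scope3_tco2e=45_000)
print(f"Total: {report.total():,.0f} tCO2e")  # -> Total: 49,700 tCO2e
```

The invented numbers reflect a pattern often reported for manufacturers: Scope 3, the part that depends on product data shared across the value chain, tends to dominate the footprint, which is why it is the hardest and the last to report.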

Ultimately, companies have to be profitable to survive in their business. The ESG framework is relevant in this context, as it allows investors to put their money not only in short-term gains (as expected) but also in Environmental or Social parameters. There are a lot of discussions related to the ESG framework, as you might have read in Vincent de la Mar’s monthly newsletter, Sustainability & ESG Insights, which is also published in our PGGA group – a link below.

Besides ESG guidelines, there is also the drive by governments and consumers to push for a Product as a Service economy. Instead of owning products, consumers would pay for the usage of these products.

The concept is not new when considering lease cars, EV scooters, or streaming services like Spotify and Netflix. At the CIMdata PLM Roadmap/PDT Fall 2021 conference, we heard Ken Webster explaining: In the future, you will own nothing & you will be happy.

Changing the business to Product as a Service is not something done overnight. It requires repairable, upgradeable products. And on the business side, it requires a connected ecosystem of all stakeholders – the manufacturer, the finance company, and the operating entities.

 

Digital Transformation

All the subjects discussed before require real-time reporting and analysis combined with data access to compliance-related databases – more on this in the section on Life Cycle Assessment. As I discussed at several conferences last year, a sustainability initiative starts with data-driven and model-based approaches during the concept phase, but also when manufacturing and operating (connected) products in the field. You can read the entire story here: Sustainability and Data-Driven PLM – the Perfect Storm.

Life Cycle Analysis

Special attention is given in this chapter to Life Cycle Analysis, which seems to be a popular topic among PLM vendors, as they can provide tools to perform a lifecycle assessment. You can read an impression of these tools in a guest blog from Roger L. Franz titled PLM Tools to Design for Sustainability – PLM Green Global Alliance.

However, a Lifecycle Analysis is not that simple. The ISO 14040 framework describes how – with the right goal and scope in mind – you can perform an LCA, where the Product Category Rules (PCRs) enable companies to compare their products with others.

PCRs include the description of the product category, the goal of the LCA, functional units, system boundaries, cut-off criteria, allocation rules, impact categories, information on the use phase, units, calculation procedures, requirements for data quality, and other information on the Life Cycle Inventory phase.
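To make this list less abstract, here is a hypothetical sketch in Python – all field names and values are invented for illustration – of how the listed PCR elements could be captured as structured data, and why a shared PCR matters for comparability:

```python
# Illustrative sketch: Product Category Rules (PCR) as structured data.
# Field names follow the PCR contents listed above; values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ProductCategoryRules:
    product_category: str
    lca_goal: str
    functional_unit: str       # the unit against which products are compared
    system_boundaries: str     # e.g. cradle-to-gate, cradle-to-grave
    cutoff_criteria: str
    impact_categories: list = field(default_factory=list)
    data_quality_requirements: str = ""

pcr = ProductCategoryRules(
    product_category="beverage packaging",
    lca_goal="compare environmental impact per unit of packaged volume",
    functional_unit="1 litre of packaged beverage",
    system_boundaries="cradle-to-grave",
    cutoff_criteria="exclude flows below 1% of total mass/energy",
    impact_categories=["climate change", "water use", "resource depletion"],
)

def comparable(a: ProductCategoryRules, b: ProductCategoryRules) -> bool:
    """Two LCA results are only comparable if they follow the same PCR,
    in particular the same functional unit and system boundaries."""
    return (a.functional_unit == b.functional_unit
            and a.system_boundaries == b.system_boundaries)
```

The sketch shows the essence of the warning above: an LCA tool can compute numbers, but without agreed functional units and boundaries, the numbers of two products cannot be compared.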

So be aware there is more to do than installing a tool.

 

Digital Twin

This section describes the importance of implementing a digital twin for the design phase, allowing companies to develop, test and analyze their products and services virtually first. Trade-off studies on virtual products are much cheaper, and when done in a data-driven, model-based environment, they are also the most efficient. In my terminology, setting up such a collaboration environment might be considered a System of Engagement.

The second crucial digital twin mentioned is the digital twin of a product in operation, where performance can be monitored and usage can be optimized for minimal environmental impact. Suppose a company is able to create a feedback loop between its products in the field and its product innovation platform. In that case, it can benchmark its design models and update the product behavior for better performance.

The manufacturing digital twin is also discussed in the context of environmental impact, as choosing the right processes and resources can significantly affect scope 3 emissions.

The chapter finishes with the story of a fictitious company, WePack, where we can follow the impact and implementation of the topics described in this chapter.

 

Conclusion

As I described in the introduction, the topic of PLM and Sustainability is relatively new and constantly evolving. What do you think? Did I miss any dimensions?

Feel free to contribute to our PLM Global Green Alliance LinkedIn group.

Looking forward to meeting you there.

 

 

During my summer holiday in my “remote” office, I had the chance to digest what I recently read, heard, saw and discussed related to the future of PLM.

Over the past year, I noticed that many companies are discussing or working on their future PLM. It is time to make progress after COVID, particularly in digitization.

And as most companies are avoiding the risk of a “big bang”, they are exploring how they can improve their businesses in an evolutionary mode.

 

PLM is no longer a system

The most significant change I noticed in my discussions is the growing awareness that PLM is no longer covered by a single system.

More and more, PLM is considered a strategy, with which I fully agree. Therefore, implementing a PLM strategy requires holistic thinking and an infrastructure of different types of systems, where possible, digitally connected.

This trend is bad news for the PLM vendors as they continuously work on an end-to-end portfolio where every aspect of the PLM lifecycle is covered by one of their systems. The company’s IT department often supports the PLM vendors, as IT does not like a diverse landscape.

The main question is: every PLM Vendor has a rich portfolio on PowerPoint covering all phases of the product lifecycle.

However, are these capabilities implementable in an economical and user-friendly manner by actual companies, or do PLM players need to change their strategy?

A question I will try to answer in this post.

 

The future of PLM

I have discussed several observed changes related to the effects of digitization in my recent blog posts, referencing others who have studied these topics in their organizations.

Some of the posts to refresh your memory are:

To summarize, these posts make the following points:

The As Is:

  • The traditional PLM systems are examples of a System of Record, not designed to be end-user friendly but designed to have a traceable baseline for manufacturing, service and product compliance.
  • The traditional PLM systems are tuned to a mechanical product introduction and release process in a coordinated manner, with a focus on BOM governance.
  • The legacy information is stored in BOM structures and related specification files.

System of Record (ENOVIA image 2014)

The To Be:

  • We are not talking about a PLM system anymore; a traditional System of Record will be digitally connected to different Systems of Engagement / Domains / Products, which have their own optimized environment for real-time collaboration.
  • The BOM structures remain essential for the hardware part; however, overarching structures are needed to manage software and hardware releases for a product. These structures depend on connected datasets.
  • To support digital twins at the various lifecycle stages (design, manufacturing, operations), product data needs to be based on models and consumed by models.
  • A future PLM infrastructure is hybrid, based on a Single Source of Change (SSoC) and an Authoritative Source of Truth (ASoT) instead of a Single Source of Truth (SSoT).

Various Systems of Engagement

 

Related podcasts

I relistened to two podcasts before writing this post, and I think they are a must-listen.

The Peer Check podcast from CoLab, episode 17 — The State of PLM in 2022 w/ Oleg Shilovitsky. Adam and Oleg have a great discussion about the future of PLM.

Highlights: From System to Platform – the new normal. A Single Source of Truth doesn’t work anymore – it is about value streams. People in big companies fear making wrong PLM decisions, which is seen as a significant risk for your career.

There is no immediate need to change the current status quo.

The Share PLM Podcast – Episode 6: Revolutionizing PLM: Insights from Yousef Hooshmand.  Yousef talked with Helena and me about proven ways to migrate an old PLM landscape to a modern PLM/Business landscape.

Highlights: The term Single Source of Change and the existing concepts of a hybrid PLM infrastructure based on his experiences at Daimler and now at NIO. Yousef stresses the importance of having the vision and the executive support to execute.

The time of “big bangs” is over, and Yousef provided links to relevant content, which you can find here in the comments.

 

In addition, I want to point to the experiences provided by Erik Herzog in the Heliple project using OSLC interfaces as the “glue” to connect (in my terminology) the Systems of Engagement and the Systems of Record.

Conclusion of the Heliple-1 project

If you are interested in these concepts and want to learn and discuss them with your peers, more can be learned during the upcoming CIMdata PLM Roadmap / PDT Europe conference.

In particular, look at the agenda for day two if you are interested in this topic.

 

The future for the PLM vendors

If you look at the messaging of the current PLM Vendors, none of them is talking about this federated concept.

They are more focused with their messaging on the transition from on-premise to the cloud,  providing a SaaS offering with their portfolio.

I was slightly disappointed when I saw this article on Engineering.com provided by Autodesk: 5 PLM Best Practices from the Experiences of Autodesk and Its Customers.

The article is tool-centric, with statements that make sense and could be written by any PLM Vendor. However, Best Practice #1: Central Source of Truth Improves Productivity and Collaboration was the message that struck me. Collaboration comes from connecting people, not from the Single Source of Truth utopia.

I don’t believe PLM Vendors have to be afraid of rapidly losing their installed base of companies using their PLM as a System of Record. There is so much legacy stored in these systems that might still be relevant. The existence of legacy information, often documents, makes a migration or swap to another vendor almost impossible and unaffordable.

The System of Record is incompatible with data-driven PLM capabilities

I would like to see clearer developments from the PLM Vendors, creating a plug-and-play infrastructure for Systems of Engagement. Plug-and-play solutions could be based on a neutral partner collaboration hub like ShareAspace or the Systems of Engagement I discussed recently in my post and interview: The new side of PLM? Systems of Engagement!

Plug-and-play systems of engagement require interface standards, and PLM Vendors will only move in this direction if customers are pushing for that, and this is the chicken-and-egg discussion. And probably, their initiatives are too fragmented at the moment to come to a standard. However, don’t give up; keep building MVPs to learn and share.

Some people believe AI, with the examples we have seen with ChatGPT, will be the future direction without needing interface standards.

I am curious about your thoughts and experiences in that area and am willing to learn.

Talking about learning?

Besides reading posts and listening to podcasts, I also read an excellent book this summer. Martijn Dullaart, often participating in PLM and CM discussions, had decided to write a book based on the various discussions related to part (re-)identification (numbering, revisioning).

The book: The Essential Guide to Part Re-Identification: Unleash the Power of Interchangeability and Traceability (The Future of Configuration Management).

As Martijn starts in the preface:

“I decided to write this book because, in my search for more knowledge on the topics of Part Re-Identification, Interchangeability, and Traceability, I could only find bits and pieces but not a comprehensive work that helps fundamentally understand these topics”.

I believe the book should become standard literature for engineering schools that deal with PLM and CM, for software vendors and implementers, and, last but not least, for companies that want to improve or better clarify their change processes.

Martijn writes in an easily readable style and uses step-by-step examples to discuss the various options. There are even exercises at the end to use in a classroom or for your team to digest the content.

The good news is that the book is not about the past. You might also know Martijn for our joint discussion, The Future of Configuration Management, together with Maxime Gravel and Lisa Fenwick, on the impact of a model-based and data-driven approach to CM.

I plan to come back soon with a more dedicated discussion with Martijn. Meanwhile, start reading the book. Get your free chapter if needed by following the link at the bottom of this article.

I recommend buying the book as a paperback so you can navigate easily between the diagrams and the text.

Conclusion

The trend for federated PLM is becoming more and more visible as companies start implementing these concepts. The end of monolithic PLM is a threat and an opportunity for the existing PLM Vendors. Will they work towards an open plug-and-play future, or will they keep their portfolios closed? What do you think?

Last week I had the opportunity to discuss the topic of Systems of Engagement in the context of the more extensive PLM landscape.

I spoke with Andre Wegner from Authentise and their product Threads, MJ Smith from CoLab and Oleg Shilovitsky from OpenBOM.

I invited all three of them to discuss their background, their target customers, the significance of real-time collaboration outside discipline siloes, how they connect to existing PLM systems (Systems of Record), and finally, whether a company culture plays a role.

Listen to this almost 45-minute discussion here (save the m4a file first) or watch the discussion below on YouTube.

 

What I learned from this conversation

  • Systems of Engagement are bringing value to small enterprises but also as complementary systems to traditional PLM environments in larger companies.
  • Thanks to their SaaS approach, they are easy to install and use to fulfill a need that would take weeks/months to implement in a traditional PLM environment. They can be implemented at a department level or by connecting a value chain of people.
  • Due to their real-time collaboration capabilities, these systems provide fast and significant benefits.
  • Systems of Engagement represent the trend that companies want to move away from monolithic systems and focus on working with the correct data connected to the users. A topic I will explore in a future blog post.

I am curious to learn what you pick up from this conversation – are we missing other trends? Use the comments to this post.

 

Related to the company:
Visit Authentise.com

Related to the product:
Learn more about Collaborative Threads

Related to the reported benefits:
– Surgical robotics R&D team tracks 100% of their decisions and saves 150 hours in the first two weeks… doubling the effective size of their team:

Related to the company:
Visit Colabsoftware.com

Related to the product
Raise the bar for your design conversations

Related to the reported benefits
– How Mainspring used CoLab to achieve a 50% cost reduction redesign in half the time
– How Ford Pro Accelerated Time to Market by 30%

Related to the company:
Visit openbom.com

Related to the product:
Global Collaborative SaaS Platform For Industrial Companies

Related to reported benefits:
– OpenBOM makes the OKOS team 20% more efficient by helping to reduce inventory errors, costs, and streamlining supplier process
– VarTech Systems Optimizes Efficiency by Saving Two Hours of Engineering Time Daily with OpenBOM

 

Conclusion

I believe that Systems of Engagement are important for the digital transformation of a company.

They allow companies to learn what it means to work in a SaaS environment, potentially outside traditional company borders but with a focus on a specific value stream.

Thanks to their rapid deployment times, they help the company to grow its revenue even when the existing business is under threat due to newcomers.

The diagram below says it all. What are your favorite Systems of Engagement?

Hot from the press

Don’t miss the latest episode from the Share PLM podcast with Yousef Hooshmand – the discussion is very much connected to this discussion.

Today I read Rhiannon Gallagher’s LinkedIn post: If Murray Isn’t Happy, No One Is Happy: Value Your Social Nodes. The story reminded me of a complementary blog post I wrote in 2014, although from a slightly different perspective.

After reviewing my post, I discovered that, nine years later, we still face the same challenge: how to involve people in a business transformation.

Companies claim people are their most important assets, but where do they focus their spending and efforts?

Probably more on building the ideal processes and having the best IT solution.

Organisational Change Management is not in their comfort zone. People like Rhiannon Gallagher, and, in my direct network, the team from Share PLM, are focusing on this blind spot. Don’t forget this part of your digital transformation efforts.

And just for fun, the rest of this post is the article from 2014. At that time, I was not yet focusing on digital transformation in the PLM domain. That started at the end of 2014 / beginning of 2015.

PLM and Blockers
(read it with 2014 in mind – where were you?)

In the past month (April 2014), I had several discussions related to the complexity of PLM.

  • Why is PLM conceived as complex?
  • Why is it hard to sell PLM internally into an organization?
  • Or, to phrase it differently: “What makes PLM so difficult for normal human beings. As conceptually it is not so complex”

(2023 addition: PLM is complex (and we have to accept it?) )

 

So what makes it complex? What is behind PLM?

The main concept behind PLM is that people need to share data. It can be around a project, a product, or a plant through the whole lifecycle. In particular, during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature.

You could decide to wait till everything is mature before sharing it with others (the classical sequential manner). However, the chances of doing it right the first time are low. Several iterations between disciplines will be required before the data is approved.

The more a company works sequentially, the higher the cost of changes and the longer the time to market. Due to the rigidness of this sequential approach, it becomes difficult to respond rapidly to changing customer or market demands.

Therefore, in theory (and it is not only a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on not yet approved data.

 

PLM goes further. It is about the sharing of data, and as it originally started in the early phases of the product lifecycle, the concept of PLM was considered something related to engineering. And to be fair, most of the PLM (CAD-related) vendors have a high focus on the early stages of the lifecycle and have strengthened this idea.

However, sharing can go much further, e.g., early involvement of suppliers (still engineering) or downstream support for after-sales/services (the new acronym  SLM – Service Lifecycle Management).

In my recent (2014) blog posts, I discussed the concepts of SLM and the required data model for that.

 

Anticipated sharing

The complexity lies in the word “sharing”. What does sharing mean for an organization where, historically, every person was rewarded for their knowledge instead of being rewarded for sharing and spreading knowledge? Guarding your knowledge was job protection.

Many so-called PLM implementations have failed to reach the sharing target as the implementation focus was on storing data per discipline and not necessarily storing data to become shareable and used by others. This is a huge difference.

(2023 addition: At that time, all PLM systems were Systems of Record)

Some famous (ERP) vendors claim that if you store everything in their system, you have a “single version of the truth”.

Sounds attractive. However, my garbage bin at home is also a place where everything ends up in a single place, but a garbage bin has not been designed for sharing. Another person has no clue or time to analyze what is inside.

Even data stored in the same system can be hidden from others as the way to find data is not anticipated.

 

Data sharing instead of document deliverables

The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose. With some extra effort, you can make the information usable and searchable for others. Typical examples are drawings and document management, where the whole process for a person is focused on delivering a specific document on time. OK for that purpose, but this document becomes legacy for the long term, as you need to know (or remember) what is inside the document.

A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, therefore providing specific views for different roles in the organization. Sharing has become possible, and it can be done online. Nobody needs to consolidate and extract data from documents (Excels?).

(2023 addition: The data-driven PLM infrastructure talking about datasets)
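As a toy illustration of this datasets idea, the sketch below (Python, with invented product properties and role names) shows product information stored as data elements, from which role-specific views are generated instead of consolidated documents:

```python
# Minimal sketch of "data sharing instead of document deliverables":
# product information lives as connected data elements, and each role
# gets its own view generated from the same data.
# All identifiers and properties below are hypothetical examples.
product = {
    "id": "PUMP-100",
    "requirements": {"max_flow_l_min": 40, "max_noise_db": 55},
    "functional":   {"rated_flow_l_min": 38},
    "physical":     {"mass_kg": 4.2, "material": "aluminium"},
    "logistical":   {"supplier": "ACME", "lead_time_weeks": 6},
}

def view(prod, role):
    """Return only the datasets relevant for a given role - no manual
    consolidation or extraction from documents is needed."""
    role_map = {
        "engineer":  ["requirements", "functional", "physical"],
        "purchaser": ["physical", "logistical"],
    }
    return {key: prod[key] for key in role_map[role]}

print(view(product, "purchaser"))
```

The same data elements serve the engineer and the purchaser; each view is a report generated on demand, which is exactly what a document-based legacy cannot offer.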

This does not fit older generations and departmental-managed business units that are rewarded only for their individual efficiency.

Here is an extract of a LinkedIn discussion from 2014, where the two extremes are visible. Unfortunately (or perhaps good), LinkedIn does not keep everything online. There is already so much “dark data” on the internet.

Joe stating:

“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”

And at the other side, Kais stated:

“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”

Readers from my blog will understand I am very much aligned with Kais, and PLM guys have a hard time convincing Joe of the benefits of PLM (I did not try).

 

Making the change happen

Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas of what would be possible. However, the outside world is hard to convince, and here the challenge is organizational change management. Who will support you, and who will work against you?

BLOCKERS: I read an interesting article in IndustryWeek from John Dyer with the title: What Motivates Blockers to Resist Change?

John describes the various types of blockers, and when reading the article combined with my PLM twisted brain, I understood again that this is one of the reasons why PLM is perceived as complex – you need to change, and there are blockers:

Blocker (noun): Someone who purposefully opposes any change (improvement) to a process for personal reasons

“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.”

The problem with blockers

The combination of business change and the existence of blockers is one of the biggest risks for companies to go through a business transformation. By the way, this is not only related to PLM; it is related to any required change in business.

Some examples:

A company I worked with was eager to study its path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase, they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers of change – so no PLM. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is no guarantee for the future.

A second example is Nokia. Nokia was famous for the way it transformed its business in the past. How come they did not see smartphones and touch screens coming? Apparently, based on several articles published recently, it was Nokia´s internal culture and its feeling of superiority in dominating the market that made it impossible to switch. The technology was known, and the concepts were there; however, the (middle) management was full of blockers.

Two examples where blockers had a huge impact on the company.

Conclusion:

Staying in business and remaining competitive is crucial for companies. In particular, the changes that currently happen require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.


In the past few weeks, together with Share PLM, we recorded and prepared a few podcasts to be published soon. As you might have noticed, for Season 2, our target is to discuss the human side of PLM and PLM best practices and less the technology side.  Meaning:

  • How to align and motivate people around a PLM initiative?
  • What are the best practices when running a PLM initiative?
  • What are the crucial skills you need to have as a PLM lead?

And as there are always many success stories to learn from on the internet, we also challenged our guests to share the moments where they gained experience.

As the famous quote says:

Experience is what you get when you don’t get what you expect!

We recently published our episode with Antonio Casaschi from Assa Abloy, a Swedish company you might never have noticed, although their products and services are a part of your daily life.

It was a discussion close to my heart. We discussed the various aspects of PLM. What makes a person a PLM professional? And if you have no time to listen to these 35 minutes, read and scan the recording transcript on the transcription tab.

At 0:24:00, Antonio mentioned the concept of the Proof of Concept, as he had good experiences with them in the past. The remark triggered me to share some observations that a Proof of Concept (POC) is an old-fashioned way to drive change within organizations. Not discussed in this podcast, but based on my experience, companies have been using Proofs of Concept to buy time, as they were afraid to make a decision.

 

A POC to buy time?

 Company A

When working with a well-known company in 2014, I learned they were planning approximately ten POCs per year to explore new ways of working or new technologies. As the POCs followed an annual schedule, the evaluation at the end of the year was often very discouraging.

Most of the time, the conclusion was: “Interesting, we should explore this further” or “What are the next POCs for the upcoming year?”

There was no commitment to follow-up; it was more of a learning exercise not connected to any follow-up.

Company B

During one of the PDT events, a company presented a two-year POC with the three leading PLM vendors, exploring supplier collaboration. I understood the PLM vendors had invested much time and resources to support this POC, expecting a big deal. However, the team mentioned it was an interesting exercise, and they learned a lot about supplier collaboration.

And nothing happened afterward ………

In 2019

At the 2019 Product Innovation Conference in London, when discussing Digital Transformation within the PLM domain, I shared in my conclusion that the POC was mainly a waste of time, as it does not push you to transform; it is an option to buy time without commitment.

My main reason for not pushing a POC is that it is more of a limited feasibility study.

  • First, a POC often pushes people and processes into the technical capabilities of the systems used. A focus starting from technology is the opposite of what I have been advocating for a long time: first, focus on the value stream – people and processes – and then study which tools and technologies support these demands.
  • Second, the POC approach often blocks innovation as the incumbent system providers will claim the desired capabilities will come (soon) within their systems—a safe bet.

 

The Minimum Viable Product approach (MVP)

With the awareness that we need to work differently and benefit from digital capabilities also came the term Minimum Viable Product or MVP.

The abbreviation MVP is not to be confused with the minimum valuable products or most valuable players.

There are two significant differences with the POC approach:

  • You admit the solution does not exist anywhere – so it cannot be purchased or copied.
  • You commit to the fact that this new approach is the right direction to take and agree that the lack of a perfect-fit solution does not block you from starting for real.

These two differences highlight the main challenges of digital transformation in the PLM domain. Digital Transformation is a learning process – it takes time for organizations to acquire and master the needed skills. And secondly, it cannot be a big bang, and I have often referred to the 2017 article from McKinsey: Toward an integrated technology operating model. Image below.

We will soon hear more about digital transformation within the PLM domain during the next episode of our SharePLM podcast. We spoke with Yousef Hooshmand, currently working for NIO, a Chinese multinational automobile manufacturer specializing in designing and developing electric vehicles, as their PLM data lead.

You might have discovered Yousef earlier when he published his paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”. I highly recommend reading the paper if you are interested in a potential future PLM infrastructure. I wrote about this whitepaper in 2022: A new PLM paradigm, discussing the upcoming Systems of Engagement on top of a System of Record infrastructure.

To align our terminology with Yousef’s wording, his domains align with the Systems of Engagement definition.

As we discovered and discussed with Yousef, technology is not the blocking issue to start. You must understand the target infrastructure well and where each domain’s activities fit. Yousef mentions that there is enough literature about this topic, and I can refer to the SAAB conference paper: Genesis -an Architectural Pattern for Federated PLM.

For a less academic impression, read my blog post, The week after PLM Roadmap / PDT Europe 2022, where I share the highlights of Erik Herzog’s presentation: Heterogeneous and Federated PLM – is it feasible?

There is much to learn and discover which standards will be relevant, as both Yousef and Erik mention the importance of standards.

The podcast with Yousef (soon to be found HERE) was not so much about organizational change management and people.

However, Yousef mentioned the most crucial success factor for the transformation project he supported at Daimler: C-level support, with trust in and understanding of the approach, knowing it will take many years – an unavoidable journey if you want to remain competitive.

 

And with the journey aspect comes the importance of the Minimal Viable Product. You are starting a journey with an end goal in mind (the top of the mountain), and step by step (from base camp to base camp), people will be better supported in their day-to-day activities thanks to digitization.

A POC will not get you through the journey; at best, a small POC helps you understand what it takes to cross a particular barrier.

 

Conclusion

The concept of POCs is outdated in a fast-changing environment where technology is not necessarily the blocking issue. Developing practices and new architectures, and using the best-fit standards, is the future. Embrace the Minimal Viable Product approach. Are you?

 

Two weeks ago, I shared my post: Modern PLM is (too) complex on LinkedIn, and apparently, it was a topic that touched many readers. Almost a hundred likes, fifty comments and six shares. Not the usual thing you would expect from a PLM blog post.

In addition, the article led to offline discussions with peers, giving me an even better understanding of what people think. Here is a summary of the various talks.

 

What is PLM?

Ever since the inception of Product Lifecycle Management, software vendors have battled over the various PLM definitions.

Initially, PLM was considered an engineering tool for product development, with an extensive potential set of capabilities supported by PowerPoint. Most companies actually implemented a collaborative PDM system at that time and named it PLM.

Was PLM really understood? Look at the infamous Autodesk CEO Carl Bass’s anti-PLM rap from 2007. Next, in 2012, Autodesk introduced its PLM solution called Autodesk PLM 360 as one of the first cloud solutions.

Only with growing connectivity and enterprise information sharing did the definition of PLM start to change.

PLM became a product information backbone serving downstream deployment with product data – the traditional Teamcenter, Windchill and ENOVIA implementations are typical examples of this phase.

With a digitization effort taking place in the non-PLM domain, connecting product development, design and delivery data to a company’s digital business became necessary. You could say, and this is the CIMdata definition:

PLM is a strategic business approach that applies a consistent set of business solutions that support the collaborative creation, management, dissemination, and use of product definition information. PLM supports the extended enterprise (customers, design and supply partners, etc.)

I agree with this definition; perhaps 80% of our PLM community does. But how many times have we been trapped again in the same thinking: PLM is a system?

The most recent example is the post from Oleg Shilovitsky last week where he claims: Discover why OpenBOM reigns supreme in the world of PLM! 

Nothing wrong with that – software vendors will always tweak definitions, as they need marketing to make a profit – but PLM is not a system.

My main point is that PLM is a “vague” community label with many interpretations. Software vendors have the most significant marketing budgets to push their particular definitions. However, various practitioners in the field also have their own interpretations.

And maybe Martin Haket’s comment to the post says it all (partly quote):

I’m a bit late to this discussion, but in my opinion, the complexity is mainly due to the fact that the ownership of the processes and data models underlying PLM are not properly organized. ‘Everybody’ in the company is allowed to mix in the discussion and have their opinion; legacy drives departments to undesirable requirements leading to complex implementations.

My intermediate conclusion: Our legacy and lack of a single definition of PLM make it complex.

 

The PLM professional

On LinkedIn, there are approximately 14,000 PLM consultants in my first and second levels of connections. This number indicates that the label “PLM Consultant” has a certain recognition.

During my “PLM is complex” discussion, I noticed Roger Tempest’s Professional PLM White paper and started the dialogue with him.

Roger Tempest is one of the co-founders of the PLM Interest Group. He has been trying to create a baseline for a foundational PLM certification with several others. We discussed the challenges of getting the PLM Professional recognized as an essential business role. Can we certify the PLM professional the same way as a certified Configuration Manager or certified Project Manager?

I shared my thoughts with Roger, claiming that our discipline is too vague and diverse and that finding a common baseline is hard.

Therefore, we are curious about your opinion too. Please tell us in the comments to this post what you think about recognizing the PLM professional and what skills should be the minimum. What are the basics of a PLM professional?

In addition, I participated in some of the SharePLM podcast recordings with PLM experts from the field (follow us here). I raised the PLM professional question either during the podcast or around the recording – in the preparation or the after-party. Again, there was no single, unique answer.

So much is part of PLM: people (culture, skills), processes & data, tools & infrastructures (architectures, standards), combined with execution (waterfall/agile?).

My intermediate conclusion: The broadness of PLM makes it complex to have a common foundation.

 

More about complexity

PEOPLE: Let’s zoom in on the aspects of complexity, starting with the People, Processes, Data and Tools discussion. The first dimension mentioned is “the people”. Organizations usually claim: “the most important assets in our organization are the people”.

However, people are usually the last dimension considered in business changes. Companies start with the tools, try to build the optimal processes and finally push the people into that framework by training, incentives or just force.

The reason for the last approach is that dealing with people is complex. People have their beliefs, their legacy and their motivation. And if people do not feel connected to the business (change), they will become an obstacle to change – look at the example below from my 2014 PI Apparel presentation:

To support the importance of people, I am excited to work with Share PLM and the Season 2 podcast series.

In these episodes, we talk with successful PLM experts about their lessons learned during PLM implementation. You will discover it is a learning process, and connecting to people in different cultures is essential. As it is a learning process, you will find it takes time and human skills to master this complexity.

Often human skills are called “soft skills”, but actually, they are “vital skills”!

 

PROCESSES: Regarding the processes part, this is another challenging topic. Often we try to simplify processes to make them workable (sounds like a good idea). With many seasoned PLM practitioners coming from the mechanical product development world, it is not a surprise that many proposed PLM processes are BOM-centric – building on PDM and ERP capabilities.

In my post: The rise and fall of the BOM? I started with this quote from Jan Bosch:

An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.

Today’s organizational and product complexity does not allow us to keep processes simple and remain competitive. In that context, have a look at Erik Herzog’s comment on PLM complexity:

I believe a contributing factor to making PLM complex lies in our tendency to make too many simplifications. Do we understand a simple thing such as configuration change management in incremental development? At least in my organization, there is room for improvement.

In the comment, Erik also provided a link to his conference paper: Introducing the 4-Box Development Model, describing the potential interaction between Systems Engineering and Configuration Management. The topic might be too complex for your current company; however, it illustrates that you cannot generalize and simplify PLM overall.

In addition to Erik’s comments, I want to mention again that we can change our business processes thanks to a modern, connected, data-driven infrastructure. From coordinated to connected working with a mix of Systems of Engagement (new) and Systems of Record (traditional). There are no solid best practices yet, but the real PLM geeks are becoming visible.

TOOLS & DATA: When discussing the future: From Coordinated to Connected, there has always been a discussion about the legacy.

Should we migrate the legacy data and systems and replace them with new tools and data models? Or are there other options? The interaction of tools and data is often the domain of Enterprise Solution Architects. The Solution Architect’s role becomes increasingly important in a modern, data-driven company, and several are pretty active in PLM – if you know how to find them, because they are not in the mainstream of PLM.

This week we made a SharePLM podcast recording with Yousef Hooshmand. I wrote about his paper “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” last year. Yousef describes the complex process – at that time working at Daimler – of slowly replacing an old legacy infrastructure with a modern, user/role-centric, data-driven infrastructure.

Watch out for this recording to be published soon as Yousef shares various provoking experiences. Not to provoke our community but to create the awareness that a transformation is possible when you have the right long-term vision, strategy and C-level support.

 

Fighting complexity

And then there are people trying to fight complexity by describing their best practices. There was the launch of Martijn Dullaart’s book: The essential guide to Part Re-Identification. Martijn mentioned that he took the time to write his book based on all the interactions in our PLM and CM communities, instead of writing a series of blog posts, which you can still find on his MDUX site. I plan to read this book this summer and hopefully come back to discuss it with Martijn and others.

Note: We have CM people involved in many of the PLM discussions. I think they are fighting similar complexity to others in the PLM domain. However, they have the benefit that their role, Configuration Manager, is recognized and supported by a commercial certification organization (the Institute of Process Excellence – IpX).

While completing this post, I read this article from Oleg Shilovitsky: PLM User Groups and Communities. At first glance, you might think that PLM User Groups and Communities might be the solution to address the complexity.

And I think they do; most PLM vendors have orchestrated User Groups and Communities. Depending on your tool vendor, you will find like-minded people supported by vendor experts. Are they reducing the complexity? Probably not, as they sit at the end of the People, Processes, Data and Tools discussion. You are already working within a specific boundary.

Based on my experience as a core PLM Global Green Alliance member, I think vendor-neutral PLM communities are hardly viable. There is very little interaction in this community of currently 686 members, although the topics are very relevant. Yes, people want to consume and learn, but making time available to share is, unfortunately, impossible when not financially motivated. Sharing opinions, yes, but working on topics: we are too busy.

 

Conclusion

The term PLM seems adequate to identify a group with a common interest (and skills?). Due to its broad scope and many aspects, it is impossible to create a standard job description for the PLM professional, and we must learn to live with that – see my arguments above.

What do you think?

Are you ready to discuss the complexity of PLM with your peers?

In the past two weeks, I had several discussions with peers in the PLM domain about their experiences.

Some of them I met after a long time again face-to-face at the LiveWorx 2023 event. See my review of the event here: The Weekend after LiveWorx 2023.

And there were several interactions on LinkedIn, leading to a more extended discussion thread (an example of a digital thread ?) or a Zoom discussion (a so-called 2D conversation).

To complete the story, I also participated in two PLM podcasts from Share PLM, where we interviewed Johan Mikkelä (currently working at FLSmidth) and, in the second episode, Issam Darraj (presently working at ABB) about their PLM experiences. Less a discussion, more a dialogue, trying to grasp the non-documented aspects of PLM. We are looking forward to your feedback on these podcasts too.

All these discussions led to a reconfirmation that if you are a PLM practitioner, you need a broad skillset to address the business needs, translate them into people and process activities relevant to the industry and ultimately implement the proper collection of tools.

As a sneak preview of the podcast sessions: we asked both Johan and Issam about the importance of the tools. I will not disclose their answers here; you have to listen.

Let’s look at some of the discussions.

NOTE: Just before pushing the Publish button, Oleg Shilovitsky published this blog article PLM Project Failures and Unstoppable PLM Playbook. I will comment on his points at the end of this post. It is all part of the extensive discussion.

 

PLM, LinkedIn and complexity

The most popular discussions on LinkedIn are often related to the various types of Bills of Materials (eBOM, mBOM, sBOM), Part numbering schemes (intelligent or not), version and revision management and the famous FFF discussions.

This post: PLM and Configuration Management Best Practices: Working with Revisions, from Andreas Lindenthal, was a recent example that triggered others to react.

I had some offline discussions on this topic last week, and I noticed Frédéric Zeller wrote his post with the title PLM, LinkedIn and complexity, starting his post with (quote):

I am stunned by the average level of posts on the PLM on LinkedIn.

I’m sorry, but in 2023:

  • Part Number management (significant, non-significant) should no longer be a problem.
  • Revision management should no longer be a question.
  • Configuration management theory should no longer be a question.
  • Notions of EBOMs, MBOMs … should no longer be a question.

So why are there still problems on these topics?

You can see from the 40+ comments that this statement created a lot of reactions, including mine. Apparently, these topics touch many people worldwide, and there is no simple, single answer to any of them. And there are so many other topics relevant to PLM.

Talking later with Frédéric for an hour in a Zoom session, we discussed the importance of the right PLM data model.

I also wrote a series about the (traditional) PLM data model: The importance of a (PLM) data model.

Frédéric is more of a PLM architect; we even discussed the wording related to the EBOM and the MBOM – a topic I feel comfortable discussing after many years of experience, having seen the attempts that failed and the dreams people had. And this was only one aspect of PLM.
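For readers new to the EBOM/MBOM wording, here is a minimal, hypothetical sketch of why the distinction matters – the part numbers and classes are invented for illustration and do not represent any vendor’s data model. The same part masters appear in two different structures: the engineering view (the functional design) and the manufacturing view (how the product is actually built, including process materials the designer never modeled).

```python
# Illustrative sketch (not a vendor data model): one part master, two BOM views.

class Part:
    """A part master: a single definition shared by all BOM views."""
    def __init__(self, number, description):
        self.number = number
        self.description = description


class BOMLine:
    """One usage of a part in a structure, with a quantity."""
    def __init__(self, part, quantity):
        self.part = part
        self.quantity = quantity


# Engineering BOM: the functional design view of a (fictitious) bike light.
housing = Part("P-200", "Housing")
led = Part("P-201", "LED unit")
ebom = [BOMLine(housing, 1), BOMLine(led, 1)]

# Manufacturing BOM: how it is actually assembled in the plant -
# the same parts, plus process materials the design never contained.
glue = Part("C-900", "Adhesive")
box = Part("C-901", "Packaging box")
mbom = ebom + [BOMLine(glue, 1), BOMLine(box, 1)]

print([line.part.number for line in mbom])
```

The two views share part masters but are owned and maintained by different roles (engineering versus manufacturing engineering), which is exactly where most of the eBOM/mBOM discussions on LinkedIn start.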

You also find the discussion related to a PLM certification in the same thread. How would you certify a person as a PLM expert?

There are so many dimensions to PLM. Even more importantly, the PLM of 10-15 years ago (more of a system discussion) is no longer the PLM of today (a strategy and an infrastructure).

This is a crucial difference. Learning to use a PLM tool and implement it is not the same as building a PLM strategy for your company. It is Tools, Process, People versus Process, People, Tools and Data.

 

Time for Methodology workshops?

I recently discussed with several peers what we could do to assist people looking for best-practice discussions and lessons learned. There is a need, but how do we organize such sessions, as we cannot expect this to be voluntary work?

In the past, I suggested that MarketKey, the organizer of the PI DX events, extend its theme workshops. For example, instead of a 45-minute focus group with a short introduction to a theme (e.g., eBOM-mBOM, PLM-ERP interfaces), make these sessions last at least half a day and be independent of the PLM vendors.

Apparently, it did not fit in the PI DX programming; half a day would potentially stretch the duration of the conference, and more and more, we see two days of meetings as the maximum. Anything longer becomes difficult to justify, even if the content might have high value for the participants.

I observed a similar situation last year in combination with the PLM Roadmap / PDT Europe conference in Gothenburg. Here we had a half-day workshop before the conference, led by Erik Herzog (SAAB Aeronautics) and Judith Crockford (Eurostep), to discuss concepts related to federated PLM – read more in this post: The week after PLM Roadmap/PDT Europe 2022.

It reminded me of an MDM workshop before the 2015 event, led by Marc Halpern from Gartner. Unfortunately, the federated PLM discussion remained pretty much a Swedish initiative, and the follow-up did not reach a wider audience.

And then there are the Aerospace and Defense PLM action groups, moderated by CIMdata. It is great that they publish their findings (look here), although the best lessons are learned during the workshops themselves.

However, I also believe the A&D industry cannot be compared to a mid-market machinery manufacturing company. Therefore, it is helpful for a smaller audience only.

And here, I inserted a paragraph dedicated to Oleg’s recent post, PLM Project Failures and Unstoppable PLM Playbook – starting with a quote:

How to learn to implement PLM? I wrote about it in my earlier article – PLM playbook: how to learn about PLM? While I’m still happy to share my knowledge and experience, I think there is a bigger need in helping manufacturing companies and, especially PLM professionals, with the methodology of how to achieve the right goal when implementing PLM. Which made me think about the Unstoppable PLM playbook ©.

I found a similar passion for helping companies to adopt PLM while talking to Helena Gutierrez. Over many conversations during the last few months, we talked about how to help manufacturing companies with PLM adoption. The unstoppable PLM playbook is still a work in progress, but we want to start talking about it to get your feedback and start the conversation. 

It is an excellent confirmation that there is a need for education and that the PLM-related education available on the Internet is not good enough.

As a former physics teacher, I do not believe in the Unstoppable PLM Playbook, even if it is a branded name. Many books are written by specific authors, giving their perspectives based on their (academic) knowledge.

Are they useful? I believe only in the context of a classroom discussion where their applicability can be debated.

Therefore, my question to vendor-neutral global players like CIMdata, Eurostep, Prostep, SharePLM, TCS and others: are you willing to pick up this request? Or are there other entities that I missed? Please leave your thoughts in the comments. I will be happy to assist in organizing these sessions.

There are many more future topics to discuss and document too.

  • What about the potential split of a PLM infrastructure between Systems of Record & Systems of Engagement?
  • What about the Digital Thread, a more and more accepted theme in discussions, but what is the standard definition?
  • Is it traceability, as some vendors promote it, or is it the continuity of data, directly usable in various contexts – the DevOps approach?
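To illustrate the traceability reading of the Digital Thread, here is a minimal, hypothetical sketch – every identifier and link type is invented. The thread can be modeled as typed links between records living in different systems, and traceability then simply means being able to walk those links, for example from a requirement to the parts, change orders and tests that realize it.

```python
# Minimal sketch of a digital thread as a link graph (identifiers invented).
# Traceability = being able to walk the links across system boundaries.
links = [
    ("REQ-12", "implemented_by", "PART-7"),   # requirement -> design
    ("PART-7", "changed_by", "ECO-3"),        # design -> change order
    ("ECO-3", "verified_by", "TEST-44"),      # change -> verification
]


def trace(start, links):
    """Follow links forward from one record and collect everything reachable."""
    reached, frontier = [], [start]
    while frontier:
        node = frontier.pop()
        for src, _rel, dst in links:
            if src == node:
                reached.append(dst)
                frontier.append(dst)
    return reached


print(trace("REQ-12", links))
```

The continuity-of-data reading goes one step further: instead of only linking records, the same data element would be directly usable in each context without conversion – which is a much harder infrastructure question than maintaining links.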

 

Who likes to discuss methodology?

When asking myself this question, I see the analogy with standards. So let’s look at the various players in the PLM domain – sorry for the immense generalization.

Strategic consultants: standards are essential, but spare me the details.

Vendors: standards are limiting the unique capabilities of my products.

Implementers: two types – those who understand and use standards, as they see the long-term benefits, and those who avoid standards, as they introduce complexity.

Companies: they love standards if they can be implemented seamlessly.

Universities: they love to explore standards and help to set them, even if they are not scalable.

Just replace standards with methodology, and you see the analogy.

 

We like to discuss the methodology.

As I mentioned in the introduction, I started to work with Share PLM on a series of podcasts in which we interview PLM experts in the field who have experience with the people, process, tools and data sides. Through these interviews, you will realize PLM is complex and has become even more complicated now that we consider PLM a strategy instead of a tool.

We hope these podcasts might be a starting point for further discussion – either through direct interactions or through contributions to the podcast. If you have PLM experts in your network who can explain the complexity of PLM from various angles based on their experience, please let us know – it is time to share.

 

Conclusion

Switching between all these discussions, I noticed that PLM has become complex – too complex for a single person to master. With an aging traditional PLM workforce (like me), it is time to consolidate the best practices of the past and discuss the best practices for the future. There are no simple answers, as every industry is different. Help us energize the PLM community – your thoughts/contributions?

