You are currently browsing the category archive for the ‘Digital Twin’ category.

We are happy to close the year with the first round of the PLM Global Green Alliances (PGGA) series: PLM and Sustainability.

We interviewed PLM-related software vendors in this series, discussing their sustainability mission and offering.

We talked with SAP, Autodesk, Dassault Systèmes, Sustaira and Aras and now with PTC. It was an exciting discussion, looking back at their Lifecycle Analysis (LCA) history and ending with a cliffhanger about what’s coming next year.

PTC

The discussion was with Dave Duncan, VP Sustainability at PTC, focusing on industrial sustainability as well as PTC's internal footprint-reduction programs. He was joined by James Norman, who globally leads PTC's Community of Practice for PLM and Design-for-Sustainability.

It is interesting to listen to the introduction of Dave and James and their history with sustainability long before it became a buzzword, and then to notice how long it takes until digital thread and digital twin are mentioned. Enjoy the 38 minutes of interaction below.


Slides shown during the interview combined with additional company information can be found HERE.

 

What we have learned

  • It was interesting to learn that just before the financial crisis in 2008, PTC invested (together with James Norman) in lifecycle analysis. Unfortunately, the focus on restoring the economy silenced this activity until, as Dave Duncan says, a little more than six months ago, when sustainability moved into almost every company's top three agenda items.
  • Regulation and financial reporting are the current drivers for companies to act on sustainability.
  • The digital thread, combined with the notion of relying on data quality, is a transformational aspect.
  • Another transformational aspect is connecting sustainability as an integrated part of product development instead of a separate marketing discipline.
  • Early next year, we will learn more about the realization of the PTC Digital Twin.

Want to learn more

Here are some links to the topics discussed in our meeting:

 

Conclusions

It was great to conclude this year with PTC. I hope this series, "The PLM Global Green Alliance meets …", has given readers a good first impression of where PLM-related vendors are heading regarding their support for a sustainable future.

We touched base with the leaders and experts in their organizations. We discussed the need for data-driven infrastructures and the relation to the circular economy and compliance.

Next year we plan to follow up with them, now looking more into the customer experiences, tools, and methodology used.


In the last few weeks, I thought I had writer's block, as I usually write about PLM-related topics close to my engagements.
Where are the always popular discussions related to EBOM or MBOM? Where is the Form-Fit-Function discussion or the traditional “meaningful numbers” discussions?

These topics always create a lot of interaction and discussion, as many of us have mature opinions.

However, last month I spent most of the time discussing the connection between digital PLM strategies and sustainability. The Russian invasion of Ukraine, leading to high energy prices, combined with several climate disasters this year, has made people aware that 2022 is not a year as usual. It is a year full of events that force us to rethink our current ways of living.

The notion of urgency

Sustainability for the planet and its people has all the focus currently. COP27 gives you the impression that governments are really serious. Are they? Read this post from Kimberley R. Miner, Climate Scientist at NASA, Polar Explorer & Professor.

She doubts whether we really grasp the urgency needed to address climate change. Or are we just playing at being on stage? I agree with her doubts.

So what to do with my favorite EBOM-MBOM discussions?

Last week I attended an event organized by Dassault Systèmes in the Netherlands for their Dutch/Belgian customers.

The title of the event was: Sustainable innovation for a digital future. I expected a techy event. Click on the image to see the details.

I asked my grandson, who had just started his Aerospace Engineering studies in Delft (NL), learning to work with CAD and PLM tools, to join me. He replied:

“Too many software demos”

It turned out that my grandson was wrong. The keynote speech from Ruud Veltenaar made most of the audience feel uncomfortable. He really pointed to the fact that we are aware of climate change and our impact on the planet, but in a way, we are paralyzed. Nothing new, but confronting and unexpected when going to a customer event.

Ruud’s message: Accept that we are at the end of an existing world order, and we should prepare for a new world order with the right moral leadership. It starts within yourself. Reflect on who you really are, where you are in your life path, and finally, what you want.

It sounds simple, and I can see it helps to step aside and reflect on these points.

Otherwise, you might feel we are in a rat race as shown below (recommended to watch).

The keynote was the foundation for a day of group and panel discussions on sustainability, learning from their customers' sustainability plans and experiences.

It showed that Dassault Systèmes, with its purpose defined in 2012 (click on the link to see its history), Harmonizing Products, Nature and Life, is ahead of the curve (at least they were for me).

The event was energizing, and my grandson was wrong:
“No software – next time?”

 

The impact of legacies – data, processes & people

For those who haven’t read my previous post, The week after PLM Roadmap / PDT Europe 2022, I wrote about the importance of Heterogeneous and federated PLM, one of the discussions related to data-driven PLM.

Looking back, I have been writing about data-driven PLM since 2014, and few companies have made progress here. Understandable, first of all, due to legacy data, which is not in the right format or quality to support data-driven processes.

However, also here, legacy processes and legacy people are blocking the change. There is no blame here; it is difficult to change. You might have a visionary management team, but then it comes down to the execution of the strategy. The organizational structure and the existing people skills are creating more resistance than progress.

For that reason, I wrote this post in 2015: PLM and Global Warming, where I compared the progress we made within our PLM community with the lack of progress we are making in solving global warming. We know the problem, but we are unable to act due to the lack of feeling the urgency.

This blog post triggered Rich McFall to start the PLM Global Green Alliance together with me in 2018.

 

In my PLM Roadmap / PDT Europe session, Sustainability and Data-driven PLM – the perfect storm, I raised awareness that we need to speed up. According to scientists, we have 10, perhaps 15, years to implement radical changes before we reach irreversible tipping points.

 

Why PLM and Sustainability?

Sustainability starts with the business strategy. How does your company want to contribute to a more sustainable future? The strategy to follow with probably the most impact is the concept of a circular economy – image below and more info here.

The idea behind the circular economy is to minimize the need for new finite materials (the right side) and to use only renewables for energy delivery. Implementing these principles clearly requires a more holistic design of products and services. Each loop should be analyzed and considered when delivering solutions to the market.

Therefore, a logical outcome of the circular economy would be transforming from selling products to the market towards a product-as-a-service model. In this case, the product manufacturer becomes responsible for the full product lifecycle and its environmental impact.

And here comes the importance of PLM. You can measure and tune your environmental impact during production in your ERP or MES environment. However, 80 % of the environmental impact is defined during the design phase, the domain of PLM. All these analyses together are called Life Cycle Analysis or Life Cycle Assessment (LCA), a practice that starts the moment you begin to think about a product or solution – a specialized systems thinking approach.

So how to define and select the right options for future products?

 

Virtual products / Digital Twins

This is where sustainability is pushing for digitization of the product lifecycle. Building and analyzing products in the virtual world is much cheaper than working with physical prototypes.

A model-based approach allows companies to deal efficiently with trade-off studies for each solution.

In addition, the choice and the behavior of materials also have an impact. These material properties will come from various databases, some based on hazardous substances, others on environmental parameters. Connecting these databases to the virtual model is crucial to remain efficient.

Imagine you need to manually collect and process these properties whenever studying an alternative. The manual process will be too costly (fewer trade-offs and not finding the optimum) and too slow (time-to-market impact).
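To make the contrast concrete, below is a minimal sketch of what a connected material database enables: comparing the embodied footprint of two design alternatives automatically. The database, material keys, masses and emission factors are all illustrative placeholders, not values from any specific vendor or data source.

```python
# Hypothetical sketch: automated material lookup for a trade-off study,
# instead of manually collecting properties for every alternative.

# Illustrative emission factors (kg CO2e per kg of material) - placeholder values only.
MATERIAL_DB = {
    "aluminium_6061": 8.6,
    "steel_s355": 1.9,
    "abs_plastic": 3.4,
}

def embodied_co2e(material_usage):
    """Sum the embodied CO2e of a design alternative.

    material_usage: list of (material_key, mass_kg) tuples taken from the virtual model.
    """
    return sum(MATERIAL_DB[material] * mass_kg for material, mass_kg in material_usage)

# Two design alternatives for the same bracket (masses are illustrative).
alternative_a = [("aluminium_6061", 1.2)]
alternative_b = [("steel_s355", 2.8)]

for name, usage in [("A (aluminium)", alternative_a), ("B (steel)", alternative_b)]:
    print(f"Alternative {name}: {embodied_co2e(usage):.1f} kg CO2e embodied")
```

Connected to a real material database, this kind of evaluation can run for every trade-off automatically, which is exactly what a manual, document-based process cannot afford.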

That’s why I am greatly interested in all the developments related to a federated PLM infrastructure. A monolithic system cannot be the solution for such a model-based environment. In my terminology, here we need an architecture with systems of engagement combined with system(s) of record.

I will publish more on this topic in the future.

In the previous paragraphs, I wrote about the virtual product environment, which some companies call the virtual twin. However, besides the virtual twin, we also need several digital twins. These digital models allow us to monitor and optimize the production process, which can lead to design changes.

Also, monitoring the product in operation using a digital twin allows us to optimize the performance and execution of the solutions in the field.

The feedback from these digital twins will then help the company to improve the design and calibrate their simulation models. It should be a closed loop. You can find a more recent discussion related to the above image here.

 

Our mission

At this moment, sustainability is at the top of my personal agenda, and I hope for many of you. However, besides the choices we can make in our personal lives, there is also an area where we, as PLM interested parties, should contribute: The digitization of the product lifecycle as an enabler for a sustainable business.

Without mature concepts for a connected enterprise, implementing sustainable products and business processes will be a wish, not a strategy. So add digitization to your skillset and use it in the context of sustainability.

Conclusion

It might look like this PLM blog has become an environmental blog. This might be right, as the environmental impact of products and solutions is directly related to product lifecycle management. However, do not worry. In the upcoming time, I will focus on the aspects and experiences of a connected enterprise. I will leave the easier discussions (EBOM/MBOM/FFF/Smart Numbers) from a coordinated enterprise as they are. There is work to do shortly. Your thoughts?


I hope you all remained curious after last week’s report from day 1 of the PLM Roadmap / PDT Europe 2022 conference in Gothenburg. The networking dinner after day 1 and the Share PLM after-party allowed us to discuss and compare our businesses. Now the highlights of day 2

 

The Power of Curiosity

We started with a keynote speech from Stefaan van Hooydonk, Founder of the Global Curiosity Institute. It was a well-received opener of the day and an interesting theme concerning PLM.

According to Stefaan, in the previous century, curiosity had a negative connotation. "Curiosity killed the cat" is one of those expressions confirming that mindset. It was all about conformity to the majority and the company, and curiosity was non-conformant.

I would say we have the same mindset with traditional PLM: we all have to work the same way with the same processes.

In the 21st century, modern enterprises stimulate curiosity as we understand that throughout history, curiosity has been the engine of individual, organizational, and societal progress. And in particular, in modern, unpredictable times, curiosity becomes important, for the world, the others around us and ourselves.

As Stefaan describes in his book, the Curiosity Manifesto, organizations and individuals can develop curiosity. Stefaan pushed us to reflect on our personal curiosity behavior.

  • Are we really interested in the person or the topic we do not know or do not like?
  • Are we avoiding curious steps out of fear? Fear of failing, of judgment?

After Stefaan’s curiosity storm, you could see that the audience was inspired to apply it to themselves and their PLM mission(s).

I hope the latter – as here there is a lot to discover.


 

Digital Transformation – Time to roll up your sleeves

In his presentation, Torbjörn Holm, co-founder of Eurostep, addressed one of the bigger elephants in the modern enterprise: how to deal with data?

Thanks to digitization, companies are gathering and storing data, and there seem to be no limits. However, data centers compete with households for electricity from the grid.

Torbjörn also introduced the term "Dark data" – the dirty secret of the ICT sector. We store too much data; some research mentions that only 12 % of the data stored is critical, and the rest clogs up file servers. Storing unstructured and unused data generates millions of tons of greenhouse gases yearly.

It is time for a data cleanup day, and inspired by Torbjörn’s story, I have already started to clean up my cloud storage. However, I did not touch my backup hard disks as they do not use energy when switched off.

Further, Torbjörn elaborated that companies need to have end-to-end data policies. Which data is required? And in the case of contracted work or suppliers, data is crucial.

Ultimately companies that want to benefit from a virtual twin of their asset in operation need to have processes in place to acquire the correct data and maintain the valid data. Digital twins do not run on documents; as mentioned in some of my blog posts, they need accurate data.

Torbjörn once more reminded us that the PLCS standard was designed for exactly that.


 

Heterogeneous and federated PLM – is it feasible?

One of the sessions that had most of my attention upfront was the presentation from Erik Herzog, Technical Fellow at Saab Aeronautics, and Jad El-Khoury, Researcher at KTH/Royal Institute of Technology.

Their presentation was closely related to the pre-conference workshop organized by Erik and Eurostep. More about this topic in the future.

Saab, Eurostep and KTH conducted a research project named Helipe to analyze and test a federated PLM architecture. The concept was strongly driven by engineering. The idea is shown in the images below.

First are the four main modular engineering environments; in the image, we see mechanical, electrical, software and engineering environments. The target is to keep these environments as standard as possible towards the outside world so that, later, an environment could be swapped for a better one. Inside an environment, automation should provide optimal performance for the users.

In my terminology, these environments serve as systems of engagement.

The second dimension of this architecture is the traceability layer(s) – the requirements management layer, the configuration item structures, change control and realization structures.

These traceability structures look much like what we have been doing with traditional PLM, CM and ERP systems. In my terminology, they are the systems of record, not meant to directly serve end-users but to provide traceability, baselines for configuration, compliance and more.

The team chose the OSLC standard to realize these capabilities, one of the main reasons being that OSLC is an existing open standard based on linked data, not on replicating data. In this way, a federated environment would be created with designated connections between datasets.

Jad El-Khoury demonstrated how to link an existing requirement in Siemens Polarion to a defect in IBM ELM and then create a new requirement in Polarion and link this requirement to the same defect. I never get excited by technical demos; more important is to learn the effort needed to build such an integration and its stability over time. Click on the image for the details.
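For readers less familiar with OSLC, the minimal sketch below illustrates the linked-data principle behind such an integration: a requirement in one tool points to a defect in another tool by its URI, without replicating the defect's data. It uses the Python rdflib library; the resource URLs are invented placeholders, and the link property is only an example from the OSLC RM vocabulary, not necessarily the one used in the Polarion/ELM demo.

```python
# Minimal sketch of the linked-data idea behind OSLC: create a link (an RDF triple)
# from a requirement in tool A to a defect in tool B, identified by their URIs.
# The URLs below are placeholders; no defect data is copied into the requirements tool.
from rdflib import Graph, Namespace, URIRef

OSLC_RM = Namespace("http://open-services.net/ns/rm#")

requirement = URIRef("https://requirements.example.com/req/1234")  # managed in tool A
defect = URIRef("https://changemgmt.example.com/defect/5678")      # managed in tool B

g = Graph()
g.bind("oslc_rm", OSLC_RM)
# The whole "integration" is this one triple: requirement --validatedBy--> defect.
g.add((requirement, OSLC_RM.validatedBy, defect))

print(g.serialize(format="turtle"))
```

In a real OSLC setup, such links are created and retrieved through the tools' REST services and delegated dialogs; the point here is only that federation relies on references between datasets rather than on copies.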

The conclusions from the team, shown below, give the right indicators; the last two points seem feasible.

Still, we need more benchmarking in other environments to learn.

I remain curious about this approach as I believe it is heading toward what is necessary for the future, the mix of systems of record and systems of engagement connected through a digital web.

The bold part of the last sentence may be used by marketers.


 

Sustainability and Data-driven PLM – the perfect storm 

For those familiar with my blog (virtualdutchman.com) and my contribution to the PLM Global Green Alliance, it will be no surprise that I am currently combining new ways of working for the PLM domain (digitization) with an even more hot topic, sustainability.

More hot is perhaps a cynical remark.

In my presentation, I explained that a model-based, data-driven enterprise will be able to use digital twins during the design phase, the manufacturing process planning and twins of products in operation. Each twin has a different purpose.

The virtual product during the design phase does not have a real physical twin yet, so some might say it is not a twin at this stage. The virtual product/twin allows companies to perform trade-offs, verification and validation relatively fast and inexpensively. The power of analyzing this virtual twin will enable companies to design products not only in the best price/performance range but, equally important, with the lowest environmental impact during manufacturing and usage in the field.

The virtual world of digital twins – (c) 2018 Boeing – diamond

As the Boeing diamond nicely shows, there is a whole virtual world for digital twins. The manufacturing digital twin allows companies to analyze their manufacturing process and virtually determine the most effective one, preferably with the lowest environmental impact.

For digital twins from a product in the field, we can analyze its behavior and optimize performance, hopefully with environmental performance indicators in mind.

For a sustainable future, it is clear that we need to implement concepts of the circular economy as the earth does not have enough resources and renewables to support our current consumption behavior and ways of living.

Note: not for everybody on the globe,  a quote from the European Environment Agency below:

Europe consumes more resources than most other regions. An average European citizen uses approximately four times more resources than one in Africa and three times more than one in Asia, but half of that of a citizen of the USA, Canada, or Australia

To reduce consumption, one of the recommendations is to switch the business model from owning products to products as a service. In the case of products as a service, the manufacturer becomes the owner of the full product lifecycle. Therefore, the manufacturer will have business reasons to make the products repairable, upgradeable, recyclable and energy-efficient, preferably using renewables. If not, the product might become too expensive: fossil energy will become more costly as carbon taxes increase, and virgin materials might become too expensive as well.

It is a business change; however, sustainability will push organizations to change faster than we are used to. For example, we learned this week that the peaking energy prices and Russia's current war in Ukraine have led to strong investments in renewables.

As a result, many countries no longer want to depend on Russian energy. The peak of carbon emissions for the world is now expected in 2025.
(Although we had a very bad year so far)

Therefore, my presentation concluded that we should use sustainability as an additional driver for our digital transformation in the PLM domain. The planet cannot wait until we slowly change our traditional working methods.

Therefore, the need for digital twins to support sustainability, combined with systems thinking, is the perfect storm to speed up our digitization projects.

You can find my presentation, as usual, on SlideShare here and a "spoken" version on our PGGA YouTube channel here.


 

Digitalization for the Development and Industrialization of Innovative and Sustainable Solutions

This session, given by Ola Isaksson, Professor, Product Development & Systems Engineering Design Research Group Leader at Chalmers University, was a great continuation of my sustainability theme. Ola went deeper into the aspects of sustainable products and sustainable business models.

The DSIP project (Digital Sustainability Implementation Package – image above) aims to help companies understand all aspects of sustainable development. Ola mentioned that the evolution of today's products is insufficient to ensure a sustainable outcome. Currently, neither products nor product development practices are adequate, as we do not understand all the aspects.

For example, Ola used the electrification process, looking at the lithium raw material needed for batteries. If we take the Nissan Leaf as the point of measure, we would have used up all lithium resources within 50 years.

Therefore, other business models are also required, where product ownership is transferred to the manufacturer. This is one of the 9Rs (or 10) shown in the image, moving from a linear economy towards a circular economy.

Also, as I mentioned in my session, Ola referred to the upcoming regulations forcing manufacturers to change their business model or product design. All these aspects are discussed in the DSIP project, and I look forward to learning about the impact this project has on educating and supporting companies in their sustainability journey.

Click on the image to discover the scope


 

A day 2 summary

We had Bernd Feldvoss, Value Stream Leader PLM Interoperability Standards at Airbus, reporting on the progress of the A&D action group focusing on Collaboration. At this stage, the project team has developed an open-service Collaboration Management System (CMS) web application, providing navigation through the eight-step guidelines and offering the potential to improve OEM-supplier collaboration consistency and efficiency within the A&D community.

We had Henrik Lindblad, Group Leader PLM & Process Support at the European Spallation Source, which is building and will soon operate the world's most powerful neutron source, enabling scientific breakthroughs in research related to materials, energy, health and the environment. Besides enabling scientific breakthroughs, this project is also an example of building a virtual twin of the facility from the start, providing a multidisciplinary collaboration space.


 

Conclusion

I left the conference with a lot of positive energy. The Curiosity session from Stefaan van Hooydonk energized us all, but as important for our PLM domain, I saw the trend towards more federated PLM environments, more discussions related to sustainability, and people in 3D again. So far, my takeaways this time.  Enough to explore till the next event.

With great pleasure, I am writing this post, part of a tradition that started for me in 2014: posts starting with "The weekend after …", describing what happened during a PDT conference. Later, the event merged with CIMdata's PLM Roadmap, becoming THE PLM event for discussions beyond marketing.

For many of us, this conference was the first since COVID-19 in 2020 held in 3D (in person) instead of 2D (digital). With approximately 160 participants, this conference showed that we want to meet and network in person, and the enthusiasm and interaction were great.

The conference’s theme, Digital Transformation and PLM – a call for PLM Professionals to redefine and re-position the benefits and value of PLM, was quite open.

There are many areas where digitization affects the way to implement a modern PLM Strategy.

Now some of my highlights from day one. I needed to filter to remain around a maximum of 1500 words. All the other sessions, including the sponsor vignettes, were informative and increased the value of this conference.


Digital Skills Transformation – Often Forgotten Critical Element of Digital Transformation

Day 1 started traditionally with the keynote from Peter Bilello, CIMdata's president and CEO. In recent conferences, Peter has focused on explaining CIMdata's critical dozen (image below). If you are unfamiliar with them, there is a webinar on November 10 where you can learn more about them.

All twelve are equally important; it is not a sequence of priorities. This time Peter spent more time on Organizational Change Management (OCM), number 12 of the critical dozen – or, as stated, the Digital Transformation's Achilles heel. Although we always mention that people are important, in our implementation projects they often seem to be the topic that gets the least focus.

We all agree on the statement: People, Process, Tools & Data. Often the reality is that we start with the tools, try to build the processes and push the people into these processes. Is it a coincidence that even CIMdata puts Digital Skills Transformation as number 12? An unconscious bias?

This time, the people’s focus got full attention. Peter explained the need for a digital skills transformation framework to educate, guide and support people during a transformation. The concluding slide below says it all.


Transformation Journey and PLM & PDM Modernization to the Digital Future

The second keynote of the day was from Josef Schiöler, Head of Core Platform Area PLM/PDM from the Volvo Group. Josef and his team have a huge challenge as they are working on a foundation for the future of the Volvo Group.

The challenge is that it must provide the foundation for new business processes across the various group members, as the image below shows:


As Josef said, it is really the heart of the heart, crucial for the future. Peter Bilello referred to this project as open-heart surgery while the person is still active, as the current business must go on too.

The picture below gives an impression of the size of the operation.

And like any big transformation project also, the Volvo Group has many questions to explore as there is no existing blueprint to use.

To give you an impression:

  • How to manage complex documentation with existing and new technology and solution co-existing?
    (My take: the hybrid approach)
  • How to realize benefits and user adoption with user experience principles in mind?
    (My take: Understand the difference between a system of engagement and a system of record)
  • How to avoid seeing modernization as a pure IT initiative and secure that end-user value creation is visible while still keeping a focus on finalizing the technology transformation?
    (My take: think hybrid and focus first on the new systems of engagement that can grow)
  • How to efficiently partner with software vendors to ensure vendor solutions fit well in the overall PLM/PDM enterprise landscape without heavy customization?
    (My take: push for standards and collaboration with other similar companies – they can influence a vendor)

Note: My takes are just a starting point of the conversation. There is a discussion in the PLM domain, which I described in my blog post: A new PLM paradigm.

 

The day before the conference, we had a ½ day workshop initiated by SAAB and Eurostep where we discussed the various angles of the so-called Federated PLM.

I will return to that topic soon after some consolidation with the key members of that workshop.


Steering future Engineering Processes with System Lifecycle Management

Patrick Schäfer's presentation was different from what the title would suggest. Patrick is the IT Architect Engineering IT at ThyssenKrupp Presta AG. The company provides steering systems for the automotive industry, which is transforming towards autonomous driving, e-mobility, car-to-car connectivity, and stricter safety and environmental requirements.

The steering system becomes a system dependent on hardware and software. And as current users of Agile PLM, the old Eigner PLM software, you can feel Martin Eigner's spirit in the project.

I briefly discussed Martin’s latest book on System Lifecycle Management in my blog post, The road to model-based and connected PLM (part 5).

Martin has always been fighting for a new term for modern PLM, and you can see how conservative we are – sometimes for good reasons.

Still, ThyssenKrupp Presta has the vision to implement a new environment to support systems instead of hardware products. In addition, they had to work fast to upgrade their almost obsolete PLM environment to a new, supported environment.

The wise path they chose was first focusing on a traditional upgrade, meaning making sure their PLM legacy data became part of a modern (Teamcenter) PLM backbone. Meanwhile, they started exploring the connection between requirements management for products and software, as shown below.

From my perspective, I would characterize this implementation as the coordinated approach, creating a future option for the connected approach once the organization and future processes are more mature and known.

A good example of a pragmatic approach.


Digital Transformation in the Domain of Products and Plants at Siemens Energy

Per Soderberg, Head of Digital PLM at Siemens Energy, talked about their digital transformation project that started 6 – 7 years ago. Knowing the world of gas- and steam turbines, it is a domain where a lot of design and manufacturing information is managed in drawings.

The ultimate vision from Siemens Energy is to create an Industrial Metaverse for its solutions as the benefits are significant.

Is this target too ambitious, like GE's 2014 Industrial Transformation with Predix? Time will tell. I am sure you will soon hear more from Siemens Energy; therefore, I will keep it short. An interesting and ambitious program to follow.


Accelerating Digitalization at Stora Enso

Stora Enso is a Finnish company, a leading global provider of renewable solutions in packaging, biomaterials, wooden construction and paper. Their Director of Innovation Services, Kaisa Suutari, shared Stora Enso's digital transformation program that started six years ago with a 10 million/year budget (some people started dreaming too). Great to have a budget, but then where to start?

In a very systematic manner, using an ideas funnel and always starting from the business need, they spend the budget along two paths, shown in the image below.

Their interesting approach was in the upper path, which Kaisa focused on. Instead of starting with an analysis of how the problem could be addressed, they start by doing and then analyze the outcome and improve.

I am a great fan of this approach as it significantly reduces the time to maturity. After all, how much time is often wasted in conducting the perfect analysis?

Their Digi Fund process is a fast process to quickly go from idea to concept, to POC and to pilot, the left side of the funnel. After a successful pilot, an implementation process starts small and scales up.

There were so many positive takeaways from this session. Start with an MVP (Minimal Viable Product) to create value from the start. Next, celebrate failure when it happens, as this is the moment you learn. Finally, continue to create measurable value delivered by people – the picture below says it all.

It was the second time I was impressed by Stora Enso's innovative approach. During PI PLMx 2020 London, Samuli Savo, Chief Digital Officer at Stora Enso, gave us insights into their innovation process. At that time, the focus was a little more on open innovation with startups. See my post: The weekend after PI PLMx London 2020. An interesting approach for other businesses to make their digital transformation business-driven and fun for the people.


 A day-one summary

There was Kyle Hall, who talked about MoSSEC and the importance of this standard in a connected enterprise. MoSSEC (Modelling and Simulation information in a collaborative Systems Engineering Context) is the published ISO standard (ISO 10303-243) for improving the decision-making process for complex products. Standards are a regular topic for this conference, more about MoSSEC here.

There was Robert Rencher, Sr. Systems Engineer, Associate Technical Fellow at Boeing, talking about the progress that the A&D action group is making related to Digital Thread and Digital Twins. Sometimes they raise more questions than answers as they try to make sense of the marketing definitions and what they mean for their businesses. You can find their latest report here.

There was Samrat Chatterjee, Business Process Manager PLM at the ABB Process Automation division. Their businesses are already quite data-driven; however, by embedding PLM into the organization’s fabric, they aim to improve effectiveness, manage a broad portfolio, and be more modular and efficient.

The day closed with a CEO Spotlight, hosted by Peter Bilello. This time the CEOs were not coming from the big PLM vendors but from complementary companies with their unique value in the PLM domain. Henrik Reif Andersen, co-founder of Configit; Dr. Mattias Johansson, CEO of Eurostep; Helena Gutierrez, co-founder of Share PLM; Javier Garcia, CEO of The Reuse Company; and Karl Wachtel, CEO of XPLM, discussed their various perspectives on the PLM domain.

 

Conclusion

Already so much to say; sorry, I reached the 1500-word target; you should have been there. Combined with the networking dinner after day one, it was a great start to the conference. Are you curious about day 2? Stay tuned, and your curiosity will be rewarded.

 

Thanks to Ewa Hutmacher, Sumanth Madala and Ashish Kulkarni, who shared their pictures of the event on LinkedIn. Clicking on their names will lead you to the relevant posts.

 

As human beings, we believe in the truth. We claim the truth. During my holiday in Greece, the question was, did the Greek Prime Minister tell the truth about the internal spy scandal?

In general, we can say politicians never speak the whole truth, and some countries are trying to make sure there is only one single source of truth – their truth. The concept of a Single Source Of Truth (SSOT) is difficult to maintain in politics.

On social media, Twitter and Facebook, people claim their own truth. Unfortunately, without any scientific background, people claim to know better than professionals, cherry-picking messages and statistics or even claiming non-existing facts.

This is nicely described by the Dunning-Kruger effect. Unfortunately, this trend will not disappear.

If you want to learn more about the impact of social media, read this long article from The Atlantic:  Why the Past 10 Years of American Life Have Been Uniquely Stupid. Although the article is about the US, the content is valid for all countries where social media are still allowed.

The PLM and CM domain is the only place where people still rely on the truth defined by professionals. Manufacturing companies depend on reliable information to design, validate, manufacture and support their products. Compliance and safe products require an accurate and stable product definition based on approved information. Therefore, the concept of SSOT is crucial along the product lifecycle.

The importance may vary depending on the product type – think of the difference in complexity between an airplane and a plastic toy. It is all about the risk and impact of a failure caused by the product.

During my holiday, the SSOT discussion was sparked on LinkedIn by Adam Keating, and the article starts with:

The “Single Source of Truth (SSOT)” wasn’t built for you. It was built for software vendors to get rich. Not a single company in the world has a proper SSOT.

A bit provocative, as there is nothing wrong with software vendors being profitable. Profitability guarantees the long-term support of the software solution. Remember the PLM consolidation around 2006, when SmarTeam, MatrixOne (Dassault), Agile and Eigner & Partner (Oracle) were acquired, disappeared or switched to maintenance mode.

Therefore it makes sense to have a profitable business model or perhaps a real open source business model.

Still, the rest of the discussion was interesting, particularly in the LinkedIn comments. Adam mentioned the Authoritative Source of Truth (ASOT) as the new future. And although this concept becomes more and more visible in the PLM domain, I believe we need both. So, let’s have a look at these concepts.

 

Truth 1.0 – SSOT

Historically, manufacturing companies stored the truth in documents, first paper-based, later in electronic file formats and databases.

The truth consists of drawings, part lists, specifications, and other types of information.

Moreover, the information is labeled with revisions and versions to identify the information.

By keeping track of the related information through documents or part lists with significant numbers, a person in the company could find the correct corresponding information at any stage of the lifecycle.

Later, by storing all the information in a central (PLM) system, the impression might be created that this system is the Single Source Of Truth. The system Adam Keating agitated against in his LinkedIn post.

Although for many companies, the ERP system has been the SSOT (and still is). All relevant engineering information was copied into the ERP system as attached files. Documents are the authoritative, legal pieces of information that a company shares with suppliers, authorities, or customers. They can reside in PLM but also in ERP. Therefore, you need an infrastructure to manage the "truth."

Note: The Truth 1.0 story is very much a hardware story.

Even for hardware, ensuring a consistent single version of the truth for each product remains difficult. In theory, its design specification should match the manufacturing definition. The reality, however, shows that this is often not the case. Issues discovered during the manufacturing process are fixed in the plant – redlining the drawing – and are not always processed by engineering.

As a result, Engineering and Manufacturing might have a different version of what they consider the truth.

The challenge for a service engineer in the field is often to discover the real truth. So the “truth” might not always be in the expected place – no guaranteed Single Source Of Truth.

Configuration Management is a discipline connected to PLM to ensure that the truth is managed so that as-specified, as-manufactured, and as-delivered information has been labeled and documented unambiguously. In other words, you could say Configuration Management(CM) is aiming for the Single Source Of Truth for a product.

If you want to read more about the relation between PLM and CM  – read this post: PLM and Configuration Management (CM), where I speak with Martijn Dullaart about the association between PLM and CM.

Martijn has his blog mdux.net and is the Lead Architect for Enterprise Configuration Management at our Dutch pride, ASML. Martijn is also Chairperson of the I4.0 Committee of the IPX Congress.

Summarizing: the Single Source Of Truth 1.0 concept is document-based and should rely on CM practices, which require skilled people and the right methodology. In addition, some industries require Truth 1.0.

Others take the risk of working without solid CM practices; the PLM system might create the impression of an SSOT, but it will not be the case, even for hardware only.

 Truth 2.0 – ASOT

Products have become more complex, mainly due to the combination of electronics and software. Their different lifecycles and the speed of change are hard to maintain using the traditional PLM approach of SSOT.

It will be impossible to maintain an SSOT, particularly if it is based on documents.

As CM is the discipline to ensure data consistency, it is important to look into the future of CM. At the end of last year, I discussed this topic with three CM thought leaders: Martijn Dullaart, Maxime Gravel and Lisa Fenwick discussed with me what they believe the change will be. Read and listen here: The future of Configuration Management.


From the discussion, it became clear that managing all the details is impossible; still, you need an overreaching baseline to identify the severity and impact of a change along the product lifecycle.

New methodologies can be developed for this, as reliable data can be used in algorithms to analyze a change impact. This brings us to the digital thread. According to the CIMdata definition used in the A&D digital twin phase 2 position paper:

The digital thread provides the ability for a business to have an Authoritative Source of Truth (ASOT), which is information available and connected in a core set of the enterprise systems across the lifecycle and supplier networks

The definition implies that, in the end, a decision is made on data from the most reliable, connected source. There might be different data in other locations; however, that information is less reliable. Updating or fixing it does not make sense, as the effort and cost of fixing will be too high and bring no benefit.

Obviously, we need reliable data to implement the various types of digital twins.

As I am intrigued by the power of the brain – its strengths and weaknesses – the concept of ASOT can also be found in our brains. Daniel Kahneman's book, Thinking, Fast and Slow, talks about the two systems/modes our brain uses. The fast one (System 1 – low energy usage) could be the imaginary SSOT, whereas the slow one (System 2 – high energy required) is the ASOT. The brain needs both, and I believe this is the same in our PLM domain.

A new PLM Paradigm

In this context, there is a vivid discussion about the System of Record and Systems of Engagement. I wrote about it in June (post: A new PLM paradigm); other authors name it differently, but all express a similar concept. Have a look at these recent articles and statements from:

Author – Link to content

  • Authentise – The challenge of cross-discipline collaboration …….
  • Beyond PLM – When is the right time to change your PLM system + discussion
  • Colab – The Single Source Of Truth wasn’t built for you …….
  • Fraunhofer Institute – Killing the PLM Monolith – the Emergence of cloud-native System Lifecycle Management (SysLM)
  • SAAB Group – Don’t mix the tenses. Managing the Present and the Future in an MBSE context
  • Yousef Hooshmand – From a Monolithic PLM Landscape to a Federated Domain and Data Mesh

If you want to learn more about these concepts and discuss them with some of the experts in this domain, come to the upcoming PLM Roadmap / PDT Europe conference on 18-19 October in Gothenburg, Sweden. Have a look at the final agenda here.

Register before September 12 to benefit from a 15 % Early Bird discount, which you can spend on the dinner after day 1. I look forward to discussing the SSOT/ASOT topics there.


Conclusion

The Single Source Of Truth (SSOT) and the Authoritative Source of Truth (ASOT) are terms that illustrate the traditional PLM paradigm is changing thanks to digitization and connected stakeholders. The change is in the air. Now, the experience has to come. So be part of the change and discuss with us.

 

In the last few weeks, I have had several discussions related to sustainability. What can companies do to become sustainable and prove it? Unfortunately, there is so much greenwashing at this moment.

Look at this post: 10 Companies and Corporations Called Out For Greenwashing.

Therefore I thought about which practical steps a company should take to prepare for a sustainable future, as the change will not happen overnight. It reminds me of the path towards a digital, model-based enterprise (my other passion). In my post Why Model-Based definition is important for all, I mentioned that MBD (Model-Based Definition) could be considered the first stepping-stone toward a Model-Based enterprise.

The analogy for material compliance came after an Aras seminar I watched a month ago. The webinar How PLM Paves the Way for Sustainability, with Insensia (an Aras implementer), demonstrates how material compliance is the first step toward sustainable product development.

Let’s understand why

The first steps

Companies that currently deliver solutions mostly only focus on economic gains. The projects or products they sell need to be profitable and competitive, which makes sense if you want a future.

And this would not have changed if the awareness of climate impact had not become apparent.

First, CFCs and hazardous materials led to new regulations. Next, global agreements to fight climate change – the Paris Agreement and more to come – have led and will lead to regulations that change how products are developed. All companies will have to change their product development and delivery models when it becomes a global mandate.

A required change is likely going to happen. In Europe, the Green Deal is making stable progress. However, what will happen in the US remains a mystery, as even their Supreme Court has become a political entity acting against sustainability (money first).

Still, compliance with regulations will be required if a company wants to operate in a global market.

What is Material Compliance?

In 2002, the European Union published a directive to restrict hazardous substances in materials. The directive, known as RoHS (Restriction of Hazardous Substances), was mainly related to electronic components. In the first directive, six hazardous materials were restricted.

The most infamous are cadmium (Cd), lead (Pb), and mercury (Hg). Since 2006, all applicable products on the EU market must pass RoHS compliance, and in 2011, RoHS compliance was connected to the CE marking of products sold in the European market.

In 2015, four additional chemical substances were added, mostly used for softening PVC but also affecting the immune system. Meanwhile, other countries have introduced similar RoHS regulations; therefore, we can see it as a global restriction. Read more here: The RoHS guide.

Consumers buying RoHS-compliant products can now be assured that none of the threshold values of the substances is exceeded in the product. The challenge for the manufacturer is to go through each component of the MBOM to understand if it contains one of the ten restricted substances and, if so, in which quantity.

Therefore, they need to get that information from each relevant supplier in the form of a RoHS declaration.

Besides RoHS, additional regulations protect the environment and the consumer. For example, REACH (Registration, Evaluation, Authorization and Restriction of Chemicals) compliance deals with the regulations created to improve the environment and protect human health. In addition, REACH addresses the risks associated with chemicals and promotes alternative methods for the hazard assessment of substances.

The compliance process in four steps

Material compliance is most of all the job of engineers. Therefore around 2005, some of my customers started to add RoHS support to their PLM environment.

 

Step 1

The image below shows the simple implementation – the PDF form from the supplier was linked to the (M)BOM part.

An employee had to manually add the substances into a table and ensure the threshold values were not exceeded. But, of course, there was already a selection of preferred manufacturer parts during the engineering phase. Therefore, RoHS compliance was almost guaranteed when releasing the EBOM.

But this process could be done more cleverly.

 

Step 2

So the next step was that manufacturers started to extend their PLM data model with additional attributes for RoHS compliance. Again, this could be done cleverly or extremely generically, adding the attributes to all parts.

So now, when receiving the material declaration, a person just has to add the substance values to the part attributes. Then, through either standard functionality or customization, a compliance report can be generated for the (M)BOM, as sketched below. This already saves some work.
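As a minimal sketch of what such a generated compliance report does behind the scenes: it rolls up the declared substance concentrations stored as part attributes and compares them against the RoHS threshold values (0.1 % by weight for most substances, 0.01 % for cadmium). The part numbers, attribute layout and declared values below are illustrative only, and only a subset of the ten restricted substances is listed.

```python
# Hypothetical sketch: generating a RoHS compliance report over an (M)BOM whose
# parts carry declared substance concentrations (ppm by weight) as attributes.

# RoHS limits in ppm: 0.1 % = 1000 ppm for most substances, 0.01 % = 100 ppm for cadmium.
# (Only a subset of the restricted substances is listed here.)
ROHS_LIMITS_PPM = {"Pb": 1000, "Hg": 1000, "Cd": 100, "Cr6+": 1000, "PBB": 1000, "PBDE": 1000}

# Illustrative MBOM: part number -> substance concentrations from supplier declarations.
mbom = {
    "PCB-100": {"Pb": 20, "Cd": 5},
    "CONN-200": {"Pb": 1500},   # exceeds the lead threshold
    "HOUSING-300": {},          # no restricted substances declared
}

def compliance_report(bom):
    """Return a list of (part, substance, declared_ppm, limit_ppm) violations."""
    violations = []
    for part, declared in bom.items():
        for substance, ppm in declared.items():
            limit = ROHS_LIMITS_PPM.get(substance)
            if limit is not None and ppm > limit:
                violations.append((part, substance, ppm, limit))
    return violations

for part, substance, ppm, limit in compliance_report(mbom):
    print(f"{part}: {substance} = {ppm} ppm exceeds the limit of {limit} ppm")
```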

 

Step 3

The next step was to provide direct access to these attributes to the supplier and push the supplier to do the work.

Now the overhead for the manufacturer has been reduced again, because the supplier does the job for their customer.

 

Step 4

In step 4, we see a real connected environment, where information is stored only once, referenced by manufacturers, and kept up to date by the part suppliers.

Who will host the RoHS databank? From some of my customer projects, I recall IHS as a data provider – it seems they are into this business when you look at their website HERE.

 

Where is your company at this moment?

Having seen the four stepping-stones leading towards efficient RoHS compliance, you see the challenge of moving from a document-driven approach to a data-driven approach.

Now let’s look into the future. Concepts like Life Cycle Assessment (LCA) or a Digital Product Passport (DPP) will require a fully connected approach.

Where is your company at this moment – have you reached RoHS compliance step 3 or 4? A first step to learn and work connected and data-driven.

 

Life Cycle Assessment – the ultimate target

A lifecycle assessment, or lifecycle analysis (two meanings of LCA again), is a methodology to assess the environmental impact of a product (or solution) through its whole lifecycle: from materials sourcing, manufacturing and transportation to usage, service, and decommissioning. And by assessing, we mean in a clear, verifiable, and shareable manner, not just guessing.
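To illustrate what "through its whole lifecycle" means in practice, here is a minimal cradle-to-grave roll-up over the stages listed above. All figures are invented placeholders; a real LCA follows standards such as ISO 14040/14044 and relies on verified inventory data.

```python
# Minimal cradle-to-grave roll-up: the total impact is the sum over all lifecycle
# stages. All figures below are illustrative placeholders, not measured data.
stage_impact_kg_co2e = {
    "materials_sourcing": 120.0,
    "manufacturing": 80.0,
    "transportation": 15.0,
    "usage": 400.0,          # often the dominant stage for energy-using products
    "service": 10.0,
    "decommissioning": 25.0,
}

total = sum(stage_impact_kg_co2e.values())
for stage, impact in stage_impact_kg_co2e.items():
    print(f"{stage:20s} {impact:8.1f} kg CO2e ({impact / total:5.1%})")
print(f"{'total':20s} {total:8.1f} kg CO2e")
```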

Traditional engineering education does not teach these skills, although LCA is not new, as this 10-year-old YouTube movie from Autodesk illustrates:

What is new is the global understanding that we are reaching the limits of what our planet can endure; we must act now. Upcoming international regulations will enforce life cycle analysis reporting for manufacturers and service providers. This will happen gradually.

Meanwhile, we all should work on a circular economy, the major framework for a sustainable planet – click on the image on the left.

In my post, I wrote about these combined topics: SYSTEMS THINKING – a must-have skill in the 21st century.

 

Life Cycle Analysis – Digital Twin – Digitization

The big elephant in the room is that introducing LCA in your company has a lot to do with the digitization of your company. Assessment data in documents requires too much human effort to maintain at the right quality. The costs are not affordable if your competitor is more efficient.

When it comes to the analysis part, a model-based, data-driven infrastructure is the most efficient way to run virtual analyses, using digital twin concepts at each stage of the product lifecycle.

Virtual models for design, manufacturing and operations allow your company to make trade-off studies at low cost before committing to the physical world. 80 % of the environmental impact of a product is determined by decisions made in the virtual world.

Once you have your digital twins for each phase of the product lifecycle, you can benchmark your models with data reported from the physical world. All these interactions can be found in the beautiful Boeing diamond below, which I discussed before – Read A digital twin for everybody.

 

Conclusion

Efficient and sustainable life cycle assessment and analysis will come from connected information sources. The old document-driven paradigm is too costly and too slow to maintain, in particular when the scope is not just a subset of your product but, as with LCA, your full product and its full lifecycle. Another stepping stone towards the near future. Where are you?

 

Stepping-stone 1:            From Model-Based Definition to an efficient Model-Based, Data-driven Enterprise

Stepping-stone 2:            From RoHS compliance to an efficient and sustainable Model-Based, Data-driven Enterprise

Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed to the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities were meant to make engineering teams happy by giving them more access to their data. But unfortunately, that did not work, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is somehow a slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the different PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxime Gravel on the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. And as a result of the data-driven approach, a digital PLM environment could be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM)

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management until service and, if relevant, decommissioning. The bigger challenge is connecting entire ecosystems to the same infrastructure, in particular suppliers and partners, but also customers. This is the new platform paradigm.

Some years ago, people stated IoT is the new PLM  (IoT is the new PLM – PTC 2017). Or MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don't believe Digital Transformation points to a domain, more to an ongoing process that most companies have to go through. And because it is so commonly used, it becomes too vague for the specifics of our domain. I liked Monica Schnitger's LinkedIn post: Digital Transformation? Let's talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a “digital transformed” company would look like. What would impact their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

I just finished recording a podcast with Nina Dar while writing this post. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a future data-driven environment becomes a System of Engagement. In this environment, you can simulate assumptions and verify and qualify trade-offs in real time. And not only product behavior; you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are connected to the business in real time. PLM will still have its system of record needs; however, the real value will come from real-time collaboration.

The traditional PLM consultant should transform into a business consultant, understanding technology. Historically this was the opposite, creating friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single-digit, whereas new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post, the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

Interesting to read is this article from Jan Bosch that I read this morning: Resistance to Change. Read the article as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in System Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think: should we leave the comfortable term PLM behind us for a new frame?

In February, the PLM Global Green Alliance published our first interview discussing the relationship between PLM and Sustainability with the main vendors. We talked with Darren West from SAP.

You can find the interview here: PLM and Sustainability: talking with SAP. We spoke with Darren about SAP’s Responsible Design and Production module, which allows companies to understand their environmental and economic impact by calculating fees and taxes, and to implement measures to reduce regulatory costs. The high reliance on accurate data was one of the topics in our discussion.

In March, we interviewed Zoé Bezpalko and Jon den Hartog from Autodesk. Besides Autodesk’s impressive sustainability program, we discussed Autodesk’s BIM technology helping the construction industry become greener, and their Generative Design solution supporting the designer in making better decisions on material usage or reuse.

The discussion ended with Life Cycle Assessment tools that support the engineer in making sustainable decisions.

In my last blog post, the Innovation Dilemma, I explored the challenges of a Life Cycle Assessment. As it appears, it is not just about installing a tool. The concepts of a data-driven PLM infrastructure and digital twins are strong prerequisites for the transformation, combined with the Inner Development Goals (IDGs).

The IDGs describe the human attitudes needed alongside the Sustainable Development Goals.

Therefore, we were happy to talk last week with Florence Verzelen, Executive Vice President Industry, Marketing & Sustainability, and Xavier Adam, Worldwide Sustainability Senior Manager, both from Dassault Systemes. We discussed Dassault Systemes’ business sustainability goals and product offerings based on the 3DEXPERIENCE platform.

Have a look at the discussion below:


The slides shown in the recording can be found HERE.

 

What I learned

For many years, Dassault Systemes’ purpose has been to help their customers imagine sustainable innovations capable of harmonizing product, nature, and life. This statement is now slowly bubbling up in other companies too. Dassault Systemes has set a clear and interesting target for itself for 2025: in that year, two-thirds of their sales should come from solutions that make their customers more sustainable.

Their Eco-design solution is one of the first offerings to support this objective. Their Life Cycle Assessment solution can govern your (virtual) product design on multiple criteria, not only greenhouse gas emissions. It will be interesting to follow up on this topic to see how companies make the change internally by relying on data and virtual twins of a product or a manufacturing process.

Want to learn more?

Conclusion

80% of the environmental impact of a product is decided during the design phase. A Life Cycle Assessment solution combined with a virtual product model, the virtual design twin, allows you to decide on trade-offs in the virtual space before committing to the physical solution. Creating a data-driven, closed loop between design, engineering, manufacturing and operations, based on accurate data, is the envisioned infrastructure for a sustainable future.

Yes, it is not a typo. Clayton Christensen’s famous book, written in 1997, discussed the Innovator’s Dilemma: when new technologies cause great firms to fail. This was the challenge two decades ago. Existing prominent companies could quickly become obsolete as they were bypassed by new technologies.

The examples are well known. To mention a few: DEC (Digital Equipment Corporation), Kodak, and Nokia.

Why the innovation dilemma?

This decade, the challenge has become different. All companies are forced to become more sustainable in the next ten years, either pushed by global regulations or by their customers’ demands. Besides the priority of reducing greenhouse gas emissions, there is also the need to transform our society from a linear, continuous-growth economy into a circular doughnut economy.

The circular economy makes the creation, the usage and the reuse of our products more complex as the challenge is to reduce the need for raw materials and avoid landfills.

The circular economy concept – the regular product lifecycle in the middle

The doughnut economy makes the values of an economy more complex, as it is not only about money and growth; human and environmental factors must also be considered.

Doughnut Economics: Trying to stay within the green boundaries

To manage this complexity, I wrote SYSTEMS THINKING – a must-have skill in the 21st century, focusing on the logical part of the brain. In my follow-up post, Systems Thinking: a second thought, I looked at the human challenge. Our brain is not rational and wants to think fast to solve direct threats. Therefore, we have to overcome our old brains to make progress.

An interesting and thought-provoking point was shared by Nina Dar in this discussion, together with the video below. The 17 Sustainable Development Goals (SDGs) describe what needs to be done. However, we also need the Inner Development Goals (IDGs) and the human side to connect. Watch the movie:

Our society needs to change and innovate; however, we cannot. The Innovation Dilemma.

The future is data-driven and digital.

What is clear to me is that companies developing products and services have only one way to move forward: becoming data-driven and digital.

Why data-driven and digital?

Let’s look at something companies might already practice: REACH (Registration, Evaluation, Authorization and Restriction of Chemicals). This European regulation, introduced in 2007, aims to protect human health and the environment by communicating information on chemicals up and down the supply chain. This ensures that manufacturers, importers, and their customers are aware of information relating to the health and safety of the products supplied.

The regulation still suffers in its execution, as most of the reporting and evaluation of chemicals is done manually. Suppliers report their chemicals in documents, and companies summarize the chemicals in their own reports. Then, finally, authorities have to go through these reports.

Even though the scope of REACH is limited, the manual effort for end-to-end reporting is relatively high. In addition, skilled workers are needed to do the job because reporting is done in a document-based manner.
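As a hedged illustration of what data-driven reporting could look like, the sketch below checks structured supplier declarations against a communication threshold instead of someone reading through documents. The components, substances, masses and threshold value are invented for illustration only.

```python
# Hedged sketch: evaluating supplier substance declarations as structured data.
# All values are illustrative; they are not real declarations or legal limits.
declarations = [
    # (component, substance CAS number, substance mass in g, component mass in g)
    ("housing",   "7439-92-1", 0.8,  250.0),   # lead, illustrative
    ("gasket",    "117-81-7",  0.4,  120.0),   # DEHP, illustrative
    ("connector", "7439-92-1", 0.01, 300.0),
]

THRESHOLD = 0.001  # illustrative 0.1 % w/w communication trigger

for component, cas, substance_g, component_g in declarations:
    concentration = substance_g / component_g
    if concentration > THRESHOLD:
        print(f"{component}: CAS {cas} at {concentration:.2%} -> communicate downstream")
    else:
        print(f"{component}: CAS {cas} at {concentration:.2%} -> below threshold")
```

Once the declarations exist as data, this evaluation runs in milliseconds for thousands of parts; the same exercise on documents requires skilled people and weeks of work.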

Life Cycle Assessments (LCA)

While you might think REACH is relatively simple, the real new challenge for companies is the need to perform Life Cycle Assessments for their products. The Wikipedia definition of LCA says:

Life cycle assessment or LCA (also known as life cycle analysis) is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product’s manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave)

This will be a shift in the way companies need to define products. Much more thinking and analysis are required in the early design phases. Before committing to a physical solution, engineers and manufacturing engineers need to simulate and calculate the impact of their design decisions in the virtual world.

This is where the digital twin of the design and the digital twin of the manufacturing process becomes relevant. And remember: Digital Twins do not run on documents – you need connected data and various types of models to calculate and estimate the environmental impact.
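A simple, made-up calculation shows why connected data matters: once the impact factors per lifecycle stage are available as data, comparing design variants cradle-to-grave becomes a trivial summation that the design twin can run on every change. All numbers below are invented for illustration.

```python
# Illustrative cradle-to-grave comparison of two design variants (kg CO2e).
# All emission figures are made up; a real LCA would pull them from connected
# material, manufacturing and usage data.
stages_steel = {
    "raw material": 4.2,
    "manufacturing": 1.1,
    "distribution": 0.3,
    "use (10 yr)": 0.0,
    "end of life": -0.9,    # credit for recycling
}
stages_aluminium = {
    "raw material": 8.6,
    "manufacturing": 0.7,
    "distribution": 0.2,
    "use (10 yr)": -2.5,    # lighter product, lower use-phase energy
    "end of life": -1.8,
}

for name, stages in (("steel", stages_steel), ("aluminium", stages_aluminium)):
    total = sum(stages.values())
    print(f"{name}: {total:.1f} kg CO2e cradle-to-grave")
```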

LCA done in a document-based manner will make your company too slow and expensive.

I described this needed transformation in my series from last year: The road to model-based and connected PLM – nine posts exploring the technology and concept of a model-based, data-driven PLM infrastructure.

Digital Product Passport (DPP)

The European Commission has published an action plan for the circular economy, one of the most important building blocks of the European Green Deal. One of the defined measures is the gradual introduction of a Digital Product Passport (DPP). As the quality of an LCA depends on trustworthy information about products and materials, the DPP aims to ensure that circular-economy metrics become reliable.

This will be a long journey. If you want to catch a glimpse of the complexity, read this Medium article: The digital product passport and its technical implementation related to the DPP for batteries.
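To give an idea of the direction, here is a hypothetical sketch of a battery passport as machine-readable data. The field names are my assumptions for illustration; the actual DPP schemas per product group will be defined by the upcoming EU delegated acts.

```python
# Hypothetical sketch of a Digital Product Passport record as structured data.
# Field names and values are illustrative assumptions, not the official schema.
import json

battery_passport = {
    "product_id": "BAT-2024-000123",
    "manufacturer": "Example Cells GmbH",
    "chemistry": "LFP",
    "carbon_footprint_kg_co2e": 42.0,
    "recycled_content": {"lithium": 0.06, "cobalt": 0.0, "nickel": 0.0},
    "state_of_health": 1.0,
    "dismantling_instructions_url": "https://example.com/dpp/BAT-2024-000123",
}

print(json.dumps(battery_passport, indent=2))
```

The point is not the exact fields but the fact that this information is data, not a PDF, so an LCA or a recycler can consume it directly.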

The innovation dilemma

Suppose you agree with my conclusion that companies need to change their current product or service development to a data-driven and model-based manner. In that case, the question comes up: where to start?

Becoming data-driven and model-based, of course, is not the business driver. However, this change is needed to be able to perform Life Cycle Assessments and comply with current and future regulations while remaining competitive.

A document-driven approach is a dead-end.

Now let’s look at the real dilemmas by comparing a startup (clean sheet / no legacy) and an existing enterprise (experience with the past/legacy). Is there a winning approach?

The Startup

Having lived in Israel – the nation where almost everyone is a startup – and having worked with startups over the past ten years, I am always inspired by the energy of the people in these companies. Most of the time, they have a unique value proposition, and they want to be visible in the market as soon as possible.

This approach is the opposite of systems thinking. It is often a very linear process to deliver this value proposition without exploring the side effects of such an approach.

For example, take the new “green” transportation hype. Many cities have been flooded with “green” scooters and electric bikes to promote transportation as a service. The idea behind this concept is that citizens no longer need to own polluting motorbikes or cars, and transportation means will be shared. Therefore, the city will be cleaner and greener.

However, these “green” vehicles are often designed in the traditional linear way. Is there a repair plan or a plan to recycle the batteries? Is there a plan to reuse the materials? Most of the time, not. Please, if you have examples contradicting my observations, let me know. I like to hear good news.

When startup companies start to scale, they need experts to help them grow the company. Often these experts are seasoned people, perhaps close to retirement. They will share their experience and what they know best from the past:  traditional linear thinking.

As a result, even though startup companies can start with a clean sheet, their focus on delivering the product or service blocks further thinking. Instead, the seasoned experts will drive the company towards ways of working they know from the past.

Out of curiosity: Do you know or work in a startup that has started with a data-driven and model-based vision from scratch?  Please add the name of this company in the comments, and let’s learn how they did it.

The Existing company

Working in an established company is like being on board a big tanker. Changing its direction takes a clear eye on the target and the navigation skills to get there. Unfortunately, most of the time these changes take years, as it is impossible to switch the PLM infrastructure and the people’s skills within a short time.

From the bimodal approach in 2015 to the hybrid approach for companies, inspired by this 2017 McKinsey article: Toward an integrated technology operating model, I discovered that this is probably the best approach to ensure a change will happen. In this approach – see the image – the organization keeps running on its document-driven PLM infrastructure. This type of infrastructure becomes the system of record, nothing different from what PLM currently is in most companies.

In parallel, you have to start with small groups of people who independently focus on a new product or a new service. Using the model-based approach, they work in a data-driven manner, completely independently from the big enterprise. Their environment can be considered the future system of engagement.

The data-driven approach allows all disciplines to work in a connected, real-time manner. Mastering the new ways of working is usually the task of younger employees who are digital natives. These teams can be complemented by experienced workers who act as coaches. However, the coaches will not work in the new environment; they bring business knowledge to the team.

People cannot work in two modes, but organizations can. As you can see from the McKinsey chart, the digital teams will get bigger and more important for the core business over time. In parallel, when their data usage grows, more and more data integration will occur between the two operation modes. Therefore, the old PLM infrastructure can remain a System of Record and serve as a support backbone for the new systems of engagement.
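The sketch below captures the hybrid idea in a few lines: the system of engagement starts from a released baseline held in the system of record and then continues to work connected, while the record system stays untouched. Both classes and their methods are hypothetical, meant only to illustrate the separation of the two operating modes.

```python
# Minimal sketch of the hybrid (bimodal) idea. All classes and calls are
# hypothetical illustrations, not an existing PLM API.
class SystemOfRecord:
    """Stands in for the existing document-driven PLM backbone."""

    def released_baseline(self, product: str) -> dict:
        # In reality this would be a released document package / EBOM revision.
        return {"product": product, "revision": "B", "status": "Released"}


class SystemOfEngagement:
    """Stands in for the new data-driven, connected environment."""

    def __init__(self, record: SystemOfRecord):
        self.record = record
        self.working_data: dict = {}

    def start_from_baseline(self, product: str) -> None:
        # Pull the released definition once, then iterate in a connected way.
        self.working_data = dict(self.record.released_baseline(product))
        self.working_data["status"] = "In Work (connected)"


record = SystemOfRecord()
engagement = SystemOfEngagement(record)
engagement.start_from_baseline("PUMP-100")
print(engagement.working_data)
```

Over time, more and more of these read (and later write) integrations appear, which is exactly the growing data integration between the two modes mentioned above.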

The Innovation Dilemma conclusion

The upcoming ten years will push organizations to innovate their ways of working to become sustainable and competitive. As discussed before, they must learn to work in a data-driven, connected manner. Both startups and existing enterprises have challenges – they need to overcome the “thinking fast and acting slow” mindset. Do you see the change in your company?

 

Note: Before publishing this post, I read this interesting and complementary post from Jan Bosch: Boost your digitalization: instrumentation.

It is in the air – grab it.

 

After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy, and positive 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk and leaving only the things relevant for work.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator’s Dilemma

A must-read book, written by Clayton Christensen, explaining how new technologies can overthrow established big companies within a very short period. The term Disruptive Innovation comes up here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recently, where big established brands are gone or diminished in a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), market leader in minicomputers, not having seen the threat of the PC. Or, later, Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging), or, as a double example, NOKIA (from paper to market leader in mobile phones, killed by the smartphone).

The book always inspired me to be alert for new technologies, however simple they might look, as simplicity is the answer in the end. I wrote about it in 2012: The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies are now, most of the time, integrated by the major vendors, whose businesses have not really been disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote again about this book in The Innovator’s dilemma and Generation change – image above. At that time, I understood disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

The future ways of working address the new skills needed for the future. You need to become a digital native, as COVID-19 pushed many organizations to do. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains why the human race became so dominant because we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.

The myth (read: simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.

Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a soundproof business case at the management level, the myth might still be decisive in justifying the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, look at the 2015 TED talk above to grasp the concept and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example in Europe – with no success.

The Model-Based Enterprise is becoming more and more the apparent future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year for my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay on their customers’ lowest common denominator: the 2D drawing. For me, Model-Based Definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, the book illustrates the complexity of PLM. It is about people and culture, and about many different processes that are often disconnected. Everyone focuses on their particular discipline as the center of importance. If you believe PLM is all about the best technology only, read this book and learn how many other aspects are also relevant.

I wrote about the book in 2020 in Products2019 – a must-read if you are new to PLM, in case you want more details. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the time when mechatronics became important, and next the time when systems (hardware and software) became important.

We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked in their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …..

There are more books on my desk that have helped me understand the past or shape the future. As this is a blog post, I will not discuss more books this time, as I am reaching my 1500 words.

Still books worthwhile to read – click on their images to learn more:

I discussed this book two times last year: an introduction in PLM and Modularity, and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned this book in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional plm. You could see it as an introduction to System Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will further focus on PLM education. So stay tuned and keep on learning.
