You are currently browsing the category archive for the ‘Configuration Management’ category.

Those who have read my blog posts over the years will have seen the image to the left.

The people, processes and tools slogan points to the best practice of implementing (PLM and CM) systems.

Theoretically, a PLM implementation will move smoothly if the company first agrees on the desired processes and people involved before a system implementation using the right tools.

Too often, companies start from their historical landscape (the tools – starting with a vendor selection) and then try to figure out the optimal usage of their systems. The best example of this approach is the interaction between PDM(PLM) and ERP.

 

PDM and ERP

Historically ERP was the first enterprise system that most companies implemented. For product development, there was the PDM system, an engineering tool, and for execution, there was the ERP system. Since ERP focuses on the company’s execution, the system became the management’s favorite.

The ERP system and its information were needed to run and control the company. Unfortunately, this approach introduced the idea that the ERP system should also be the source of the part information, as it was often the first enterprise system for a company. The PDM system was often considered an engineering tool only. And when we talk about a PLM system, who really implements PLM as an enterprise system, or is it still an engineering tool?

This is an example of Tools, Processes, and People – A BAD PRACTICE.

Imagine an engineer who wants to introduce a new part needed for a product to be delivered. In many companies at the beginning of this century, even before starting the exercise, the engineer had to request a part number from the ERP system. This is implementation complexity #1.

Next, the engineer starts developing versions of the part based on the requirements. Ultimately, the engineer might conclude that this part will never be implemented. The reserved part number in ERP has been wasted – what to do?

It sounds weird, but this was a reality in discussions on this topic until ten years ago.

Next, as the ERP system could only deal with 7 digits, what about part number reuse? Reused part numbers carry a considerable risk of leading to errors. With the introduction of PLM systems, there was the opportunity to bridge the gap between engineering and manufacturing. Now it is clear to most companies that the engineer should create the initial part number.

Only when the conceptual part is approved to be used for the realization of the product is an exchange with the ERP system needed. Whether we use the same part number or not does not matter, as long as we can map both identifiers between these environments and have traceability.
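To make this concrete, below is a minimal sketch in Python (all class and identifier names are hypothetical, not any vendor's API) of how a concept part created in PLM could later be mapped to an ERP identifier while keeping traceability in both directions.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class PartMapping:
    """Keeps the link between a PLM (engineering) part and its ERP counterpart."""
    plm_id: str                   # identifier created by the engineer in PLM
    erp_id: Optional[str] = None  # assigned only when the part is released for realization

class PartRegistry:
    """Hypothetical registry providing traceability between PLM and ERP identifiers."""
    def __init__(self) -> None:
        self._by_plm: Dict[str, PartMapping] = {}
        self._by_erp: Dict[str, str] = {}  # erp_id -> plm_id

    def create_concept_part(self, plm_id: str) -> PartMapping:
        # Concept parts live only in PLM; no ERP number is wasted if they are never released.
        mapping = PartMapping(plm_id)
        self._by_plm[plm_id] = mapping
        return mapping

    def release_to_erp(self, plm_id: str, erp_id: str) -> None:
        # Only approved parts get an ERP identifier; both directions stay traceable.
        mapping = self._by_plm[plm_id]
        mapping.erp_id = erp_id
        self._by_erp[erp_id] = plm_id

    def plm_id_for(self, erp_id: str) -> str:
        return self._by_erp[erp_id]

# Usage: the engineer creates the part; ERP only sees it after approval.
registry = PartRegistry()
registry.create_concept_part("PLM-000123")
registry.release_to_erp("PLM-000123", "700045")
assert registry.plm_id_for("700045") == "PLM-000123"
```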

It took almost 10 years from PDM to PLM until companies agreed on this approach, and I am curious about your company’s status.

Meanwhile, in the PLM world, we have evolved on this topic. The part and the BOM are no longer simple entities. Instead, we often differentiate between EBOM and MBOM, and the parts in those BOMs are not necessarily the same.

In this context, I like Prof. Dr. Jörg W. Fischer‘s framing:
EBOM is the specification, and MBOM is the realization.
(Unfortunately, he writes mostly in German.)

An interesting discussion initiated by Jörg last week was again about the interaction between PLM and ERP. The article is an excellent example of how mainstream enterprises potentially think. PLM = Siemens, ERP = SAP – an illustration of the "tools first" mindset before the ideal process is defined.

There was nothing wrong with that in the early days, as connectivity between different systems was difficult and expensive. Therefore, people with 20 years of experience might still rely on their systems infrastructure instead of on data flows.

But enough about the bad practice – let's move on to people, processes, (data), and tools.

People, Processes, Data and Tools?

I got inspired by this topic, seeing this post two weeks ago from Juha Korpela, claiming:

Okay, so maybe a hot take, maybe not, but: the old “People, Process, Technology” trinity is one of the most harmful thinking patterns you can have. It leaves out a key element: Data.

His full post was quite focused on data, and I liked the "wrapping post" from Dr. Nicolas Figay here, putting things more in perspective from his point of view. The reply made me think about how this discussion fits into the PLM digital transformation discussion. How would it work in the two major themes I use to explain digital transformation in the PLM landscape?

For incidental readers of my blog, these are the two major themes I am using:

  1. From Coordinated to Connected, based on the famous diagram from Marc Halpern (image below). The coordinated approach based on documents (files) requires a particular timing (processes) and context (Bills of Information) – it is the traditional and current PLM approach for most companies. On the other hand, the Connected approach is based on connected datasets (here, we talk about data – not files). These connected datasets are available in different contexts, in real-time, to be used by all kinds of applications, particularly modeling applications. Read about it in the series: The road to model-based and connected PLM.
    .
  2. The need to split PLM thinking into System(s) of Record and Systems of Engagement (example below). The idea behind this split is driven by the observation that companies need various Systems of Record for configuration management, change management, compliance and realization. These activities sound like traditional PLM targets and could still be done in these systems. New in the discussion is the System of Engagement, which focuses on a specific value stream in a digitally connected manner. Here, data is essential. I discussed the coexistence of these two approaches in my post Time to Split PLM. A LinkedIn post with many discussions and reshares illustrates that the topic is hot, and I am happy to discuss "split PLM architectures" with all of you. A simple sketch contrasting both approaches follows after this list.
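As mentioned above, here is a simple, hypothetical sketch contrasting the two approaches: in the coordinated world, information travels as frozen, versioned documents, while in the connected world applications query linked datasets in real time.

```python
from dataclasses import dataclass
from typing import Dict, List

# Coordinated approach: information travels as versioned documents (files).
@dataclass
class Document:
    number: str
    revision: str
    content: dict  # a frozen snapshot, valid at the moment of release

# Connected approach: information lives as datasets linked to each other.
@dataclass
class Dataset:
    key: str
    properties: dict
    links: List[str]  # keys of related datasets (requirements, parts, tests, ...)

class ConnectedRepository:
    """Hypothetical store where applications read the same datasets in real time."""
    def __init__(self) -> None:
        self._data: Dict[str, Dataset] = {}

    def publish(self, dataset: Dataset) -> None:
        self._data[dataset.key] = dataset

    def context(self, key: str) -> List[Dataset]:
        # Any application can build its own context by following the links.
        root = self._data[key]
        return [root] + [self._data[k] for k in root.links if k in self._data]

repo = ConnectedRepository()
repo.publish(Dataset("REQ-1", {"text": "max weight 2 kg"}, []))
repo.publish(Dataset("PART-9", {"mass_kg": 1.8}, ["REQ-1"]))
print([d.key for d in repo.context("PART-9")])  # ['PART-9', 'REQ-1']
```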

These two concepts discuss the processes and the tools, but what about the people? Here I concluded that, to complete the story, we have to imagine three kinds of people. And this is not new: we have the creators of data, the controllers of data and the consumers of data. Let's zoom in on their specifics.

 

A new representation?

I am looking for a new simplification of the people, processes, and tools trinity, combined with data. I got inspired by the work Don Farr did at Boeing, where he worked on a new visual representation for the model-based enterprise. You might have seen the image on the left before – click on it to see it in detail.

I wrote about this new representation for the first time in my post: The weekend after CIMdata Roadmap / PDT Europe 2018

Related to Configuration Management, Martijn Dullaart and Martin Haket have also worked on a diagram with their peers to depict the scope of CM and Impact Analysis. The image leads to the post with my favorite quote: Communication is merely an exchange of information, but connections tell the story.

Below I share my first attempt to combine the people, processes and tools trinity with the concepts of document and data, and system(s) of record and system(s) of engagement, trying to build the story. See if you recognize the aspects of the discussion above, and feel free to suggest enhancements.

I look forward to your suggestions, such as the understanding that we have to split PLM thinking, as it impacts how we look at implementations.

Conclusion

Digital transformation in the PLM domain is forcing us to think differently. There will still be processes based on people collecting, interpreting and combining information. However, there will also be a new domain of connected data interpreted by models and algorithms, not necessarily depending on processes.

Therefore we need to work on new representations that can be used to tell this combined story. What do you think? How can we improve?

 

This week there was an interesting discussion on LinkedIn initiated by Alex Bruskin from Senticore Technologies. I have known Alex for over 20 years, starting from the SmarTeam days and later through encounters in the PLM space. Alex is a real techie on the outside but also a person with a very creative mind to connect technology to business.

You can see his LinkedIn featured posts here to get an impression.

 

Where is PLM @ Startups?

This time Alex shared an observation from an event organized by the Pittsburgh Robotics Network, where he spoke with several startups.

His point, and I quote Alex:

Then, I spoke to a number of presenters there, explaining Senticore capabilities and listening to their situation around engineering/ manufacturing.

– many startups offered an add-on to other platforms => an autonomous module for UAV/helicopter/Vehicle. Some offered robotic components or entire robots (robot-dog).

– all startups use #solidworks , and none use #catia or #nx

– none of them have a PLM system nor an MES. I am 90% certain none of them have ERP, either. They all are apparently using #excel for all these purposes.

– only a handful of them are considering getting a PLM system in the near future.

Read the full post here and the comments below to get a broader insight into the topic.

 

The PLM Doctor knows it all.

The point reminded me of an episode I did together with Helena Gutierrez from Share PLM last year. She asked the same question to the PLM Doctor.

Do you think PLM is only for big corporations or can startups also benefit from it?

You can see the conversation here:

 

Meanwhile, the PLM Doctor is unemployed due to the lack of incoming questions.

When looking at startups, I see two paths. One is the traditional path based on historical mechanical PLM, and the second is a (potential) approach based on understanding the future complexity of the startup's offering.

 

There are two paths – path #1

The first evolutionary path, which you might have seen a few times before in my blog posts, is the one depicted by Marc Halpern from Gartner in 2015. At that time, we started discussing Product Innovation Platforms and the new generation of PLM. You can see Marc's slide below, which is still valid for most situations.

In the slide above, you see the startup company on the left side.

Often the main purpose of a startup company is to be visible on the market with its concept as fast as possible. Startups are often driven by a small group of multifunctional people developing a solution. In this approach, there is no room for reflection on people and processes, as they are considered overhead.

Only when you target your solution at a strongly regulated environment, e.g., medical devices or aerospace, do you need to focus on the processes too.

Therefore it is logical that most startup companies focus on the tools to develop their solution. A logical path, as what could you do without tools? Next, the choice of the tools will be, most of the time, driven by the team’s experience and available skills in the market.

Again, statistics show it is unlikely that advanced tools like NX or CATIA will be chosen for the design part; more likely mid-market products like SolidWorks or the Autodesk products. And for data management and reporting, the logical tools are the office tools: Excel, Word and Visio.

And don’t forget PowerPoint to sell the solution.

The role of investors here is often also to question investments that are not clearly understood or relevant at that time.

How a startup scales up very much depends on the choices it makes for a Repeatable business. This is the moment a company starts to create its legacy. Processes and best practices need to be established, which is why you often see seasoned people joining the company. These people have proven their skills in the past, and most likely, they are willing to repeat this.

And here comes the risk – experienced people come with a much better holistic overview of the product lifecycle aspects. They know what critical steps are needed to move the company to an Integrated business. These experiences are crucial; however, they should not become the new single standard.

Implementing the past is not a guarantee for success in a digital and connected future.

Implementing their past experiences would focus too much on creating a System of Record (PLM 1.0), which is crucial for configuration management, change management and compliance. However, it would also create a productivity dip for those developing the product or solution.

This is the same dilemma that very small and medium enterprises face. They function reasonably well in a Repeatable business. How much should they invest in an Integrated or Collaborating business approach?

Following the evolution path described by Marc Halpern always brings you to the point where technology changes from Coordinated to Connected. This is a challenging and immature topic, which I have discussed in my blog posts and during conferences.

See: The Challenges of a connected ecosystem for PLM or this full series of posts:  The road to model-based and connected PLM.

 

There are two paths – path #2

Another path that startups could follow is a more forward-looking path, understanding that you need a coordinated and connected approach in the long term. For the fastest execution, you would like to work in a multidisciplinary mode in real time, exactly the characteristic of a startup.

However, in path #2, the startup should have a longer-term vision. Instead of choosing the obvious tools, they should focus on their company’s most important value streams. They have the opportunity to select integrated domains that are based on a connected, often model-based approach. Some examples of these integrated domains:

  • An MBSE environment focusing on real-time interaction related to product architecture and solution components (RFLP)
  • A connected product design environment, where in real-time a virtual product can be created, analyzed, and optimized – connected software might be relevant here.
  • A connected product realization environment where product engineering and suppliers work together in real time.

All three examples are typical Systems of Engagement. The big difference with individual tools is that they already focus on multidisciplinary collaboration on a data-driven, model-based approach.

In addition, having these systems in place allows the startup company to invest separately in a System(s) of Record environment when scaling up. This could be a traditional PLM system combined with a Configuration Management system or an Asset Management system.

System of Record choices, of course, depend on the industry's needs and the usage of the product in the field. We should not consider one system that serves all; it is an infrastructure.

In the image below, you see the concept of this approach described by Erik Herzog from SAAB Aeronautics during the recent PLM Roadmap / PDT Europe conference. You can read more details of this approach in this post: The Week after PLM Roadmap PDT Europe.

Note: SAAB is not a startup; therefore, they must deal with their legacy. They are now working on sustainable business concepts for the future: heterogeneous and federated PLM.

My opinion: The heterogeneous and federated approach is the ultimate target for any enterprise. I already mentioned the importance of connected environments regarding digital twins and sustainability. Material properties, process environmental impacts and product behavior coming from the field will only be handled efficiently if dealt with in a connected and federated manner.

 

Conclusion

The challenge for startups is that they often start without the knowledge and experience that multidisciplinary collaboration within a value stream is crucial for a connected future. This is a topic that I would like to explore further with startups and peers in my ecosystem. What do you think? What are your questions? Join the conversation.

 

 

As human beings, we believe in the truth. We claim the truth. During my holiday in Greece, the question was, did the Greek Prime Minister tell the truth about the internal spy scandal?

In general, we can say, politicians never speak the real truth, and some countries are trying to make sure there is only one single source of truth – their truth. The concept of a Single Source Of Truth (SSOT) is difficult to maintain in politics.

On social media, Twitter and Facebook, people are claiming their truth. Unfortunately, without any scientific background, people claim to know better than professionals by cherry-picking messages and statistics or even citing non-existing facts.

This is nicely described by the Dunning-Kruger effect. Unfortunately, this trend will not disappear.

If you want to learn more about the impact of social media, read this long article from The Atlantic:  Why the Past 10 Years of American Life Have Been Uniquely Stupid. Although the article is about the US, the content is valid for all countries where social media are still allowed.

The PLM and CM domain is the only place where people still rely on the truth defined by professionals. Manufacturing companies depend on reliable information to design, validate, manufacture and support their products. Compliance and safe products require an accurate and stable product definition based on approved information. Therefore, the concept of SSOT is crucial along the product lifecycle.

The importance may vary depending on the product type, for example, the difference in complexity between an airplane and a plastic toy. It is all about the risk and impact of a failure caused by the product.

During my holiday, the SSOT discussion was sparked on LinkedIn by Adam Keating, and the article starts with:

The “Single Source of Truth (SSOT)” wasn’t built for you. It was built for software vendors to get rich. Not a single company in the world has a proper SSOT.

A bit provocative, as there is nothing wrong with software vendors being profitable. Profitability guarantees the long-term support of the software solution. Remember the PLM consolidation around 2006, when SmarTeam and MatrixOne (Dassault Systèmes) and Agile and Eigner & Partner (Oracle) were acquired, disappeared or switched to maintenance mode.

Therefore it makes sense to have a profitable business model or perhaps a real open source business model.

Still, the rest of the discussion was interesting, particularly the LinkedIn comments. Adam mentioned the Authoritative Source of Truth (ASOT) as the new future. And although this concept is becoming more and more visible in the PLM domain, I believe we need both. So, let's have a look at these concepts.

 

Truth 1.0 – SSOT

Historically, manufacturing companies stored the truth in documents, first paper-based, later in electronic file formats and databases.

The truth consists of drawings, part lists, specifications, and other types of information.

Moreover, the information is labeled with revisions and versions to identify the information.

By keeping track of the related information through documents or part lists with significant numbers, a person in the company could find the correct corresponding information at any stage of the lifecycle.

Later, by storing all the information in a central (PLM) system, the impression might be created that this system is the Single Source Of Truth. The system Adam Keating agitated against in his LinkedIn post.

Yet for many companies, the ERP system has been the SSOT (and still is): all relevant engineering information was copied into the ERP system as attached files. Documents are the authoritative, legal pieces of information that a company shares with suppliers, authorities, or customers. They can reside in PLM but also in ERP. Therefore, you need an infrastructure to manage the "truth."

Note: The Truth 1.0 story is very much a hardware story.

Even for hardware, ensuring a consistent single version of the truth for each product remains difficult. In theory, its design specifications should match the manufacturing definition. The reality, however, shows that often this is not the case. Issues discovered during the manufacturing process are fixed in the plant by redlining the drawing, which is not always processed back by engineering.

As a result, Engineering and Manufacturing might have a different version of what they consider the truth.

The challenge for a service engineer in the field is often to discover the real truth. So the “truth” might not always be in the expected place – no guaranteed Single Source Of Truth.

Configuration Management is a discipline connected to PLM to ensure that the truth is managed so that as-specified, as-manufactured, and as-delivered information is labeled and documented unambiguously. In other words, you could say Configuration Management (CM) is aiming for the Single Source Of Truth for a product.
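A much-simplified illustration of that ambition, with hypothetical names: every lifecycle stage gets a labeled baseline, and CM compares them to detect deviations such as an unprocessed redline.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Baseline:
    """A labeled, unambiguous snapshot of the product definition at a lifecycle stage."""
    stage: str                 # "as-specified", "as-manufactured", "as-delivered"
    items: Dict[str, str]      # part number -> revision

def deviations(reference: Baseline, actual: Baseline) -> List[str]:
    """Report where the actual configuration no longer matches the reference."""
    issues = []
    for part, rev in reference.items.items():
        found = actual.items.get(part)
        if found is None:
            issues.append(f"{part}: missing in {actual.stage}")
        elif found != rev:
            issues.append(f"{part}: {reference.stage} rev {rev} vs {actual.stage} rev {found}")
    return issues

as_specified = Baseline("as-specified", {"P-100": "B", "P-200": "A"})
as_built = Baseline("as-manufactured", {"P-100": "C", "P-200": "A"})
print(deviations(as_specified, as_built))
# ['P-100: as-specified rev B vs as-manufactured rev C'] -> a redline never processed by engineering
```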

If you want to read more about the relation between PLM and CM  – read this post: PLM and Configuration Management (CM), where I speak with Martijn Dullaart about the association between PLM and CM.

Martijn has his blog mdux.net and is the Lead Architect for Enterprise Configuration Management at our Dutch pride ASML. Martijn is also chairperson of the Industry 4.0 committee of the IPX Congress.

Summarizing: The Single Source Of Truth 1.0 concept is document-based and should rely on CM practices, which require skilled people and the right methodology. In addition, some industries require Truth 1.0.

Others take the risk of working without solid CM practices, and the PLM system might create the impression of being the SSOT; this will not be the case, even for hardware only.

Truth 2.0 – ASOT

Products have become more complex, mainly due to the combination of electronics and software. Their different lifecycles and the speed of change are hard to maintain using the traditional PLM approach of SSOT.

It will be impossible to maintain an SSOT, particularly if it is based on documents.

As CM is the discipline to ensure data consistency, it is important to look into the future of CM. At the end of last year, I discussed this topic with three CM thought leaders: Martijn Dullaart, Maxime Gravel and Lisa Fenwick, who shared with me what they believe will change. Read and listen here: The future of Configuration Management.


From the discussion, it became clear that managing all the details is impossible; still, you need an overarching baseline to identify the severity and impact of a change along the product lifecycle.

New methodologies can be developed for this, as reliable data can be used in algorithms to analyze a change impact. This brings us to the digital thread. According to the CIMdata definition used in the A&D digital twin phase 2 position paper:

The digital thread provides the ability for a business to have an Authoritative Source of Truth(ASOT), which is information available and connected in a core set of the enterprise systems across the lifecycle and supplier networks

The definition implies that, in the end, a decision is made on data from the most reliable, connected source. There might be different data in other locations. However, this information is less reliable. Updating or fixing this information does not make sense as the effort and cost of fixing will be too expensive and give no benefit.

Obviously, we need reliable data to implement the various types of digital twins.

As I am intrigued by the power of the brain – its strengths and weaknesses – the concept of ASOT can also be found in our brains. Daniel Kahneman's book Thinking, Fast and Slow talks about the two systems/modes our brain uses. The Fast one (System 1 – low energy usage) could be the imaginary SSOT, whereas the Slow one (System 2 – high energy required) is the ASOT. The brain needs both, and I believe this is the same in our PLM domain.

A new PLM Paradigm

In this context, there is a vivid discussion about the System of Record and Systems of Engagement. I wrote about it in June (post: A new PLM paradigm); other authors name it differently, but all express a similar concept. Have a look at these recent articles and statements from:

  • Authentise: The challenge of cross-discipline collaboration …….
  • Beyond PLM: When is the right time to change your PLM system + discussion
  • Colab: The Single Source Of Truth wasn't built for you …….
  • Fraunhofer Institute: Killing the PLM Monolith – the Emergence of cloud-native System Lifecycle Management (SysLM)
  • SAAB Group: Don't mix the tenses. Managing the Present and the Future in an MBSE context
  • Yousef Hooshmand: From a Monolithic PLM Landscape to a Federated Domain and Data Mesh

If you want to learn more about these concepts and discuss them with some of the experts in this domain, come to the upcoming PLM Roadmap / PDT Europe conference on 18-19 October in Gothenburg, Sweden. Have a look at the final agenda here.

Register before September 12 to benefit from a 15% Early Bird discount, which you can spend on the dinner after day 1. I look forward to discussing the SSOT/ASOT topics there.


Conclusion

The Single Source Of Truth (SSOT) and the Authoritative Source of Truth (ASOT) are terms that illustrate that the traditional PLM paradigm is changing thanks to digitization and connected stakeholders. The change is in the air. Now, the experience has to come. So be part of the change and discuss with us.

 

Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed to the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities made engineering teams happy by giving them more access to their data. But unfortunately, that did not work, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is somehow a slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the various PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.
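If the digital thread is essentially traceability, a bare-bones sketch could look like the following: a graph of typed links between artifacts that you can walk to answer impact and where-used questions. The names are hypothetical and not tied to any vendor's data model.

```python
from collections import defaultdict
from typing import Dict, List, Set, Tuple

class DigitalThread:
    """Minimal traceability graph between lifecycle artifacts."""
    def __init__(self) -> None:
        self._links: Dict[str, List[Tuple[str, str]]] = defaultdict(list)

    def link(self, source: str, relation: str, target: str) -> None:
        self._links[source].append((relation, target))

    def trace(self, artifact: str) -> Set[str]:
        """Follow all outgoing links, e.g. from a requirement down to the as-built record."""
        seen, stack = set(), [artifact]
        while stack:
            current = stack.pop()
            for _, target in self._links.get(current, []):
                if target not in seen:
                    seen.add(target)
                    stack.append(target)
        return seen

thread = DigitalThread()
thread.link("REQ-12", "satisfied-by", "EBOM-PART-7")
thread.link("EBOM-PART-7", "realized-by", "MBOM-PART-7A")
thread.link("MBOM-PART-7A", "built-as", "SERIAL-0001")
print(thread.trace("REQ-12"))  # {'EBOM-PART-7', 'MBOM-PART-7A', 'SERIAL-0001'}
```

Note that the relation type is part of each link; this is where the often-forgotten "quality of the relations" would have to be captured.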

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxime Gravel about the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. And as a result of the data-driven approach, a digital PLM environment could be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM)

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management to service and, if relevant, decommissioning. The bigger challenge is connecting ecosystems to the same infrastructure, in particular suppliers & partners but also customers. This is the new platform paradigm.

Some years ago, people stated IoT is the new PLM  (IoT is the new PLM – PTC 2017). Or MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don't believe Digital Transformation points to a domain; it is more an ongoing process that most companies have to go through. And because it is so commonly used, it becomes too vague for the specifics of our domain. I liked Monica Schnitger's LinkedIn post: Digital Transformation? Let's talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a “digital transformed” company would look like. What would impact their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

I just finished recording a podcast with Nina Dar while writing this post. She is even more active than I am in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a new future data-driven environment becomes like a System of Engagement. You can simulate assumptions and verify and qualify trade-offs in real-time in this environment. And not only product behavior, but you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are in real-time connected to the business. Still, PLM will have its system of record needs; however, the real value will come from the real-time collaboration.

The traditional PLM consultant should transform into a business consultant, understanding technology. Historically this was the opposite, creating friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single-digit, whereas new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post: the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

An interesting read is this article from Jan Bosch that I read this morning: Resistance to Change. Read the article as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in System Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think: should we leave the comfortable term PLM behind us for a new frame?

After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy, and positive outcome for 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk and only leaving the relevant things for work on the desk.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator's Dilemma

A must-read book by Clayton Christensen, explaining how new technologies can overthrow big established companies within a very short period. The term Disruptive Innovation comes from here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recently, where big established brands are gone or diminished in a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), the market leader in minicomputers, not having seen the threat of the PC. Later examples are Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging) or, as a double example, NOKIA (from paper to market leader in mobile phones, killed by the smartphone).

The book always inspired me to be alert for new technologies, however simple they might look, as simplicity is the answer in the end. I wrote about it in 2012 in The Innovator's Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies are now, most of the time, integrated by the major vendors, whose businesses are not really disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote about this book again in The Innovator's Dilemma and Generation change (image above). At that time, I understood that disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

The future ways of working require new skills. You need to become a digital native, as COVID-19 pushed many organizations to do. But being a digital native alone does not bring success; we need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains that the human race became so dominant because we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand's image.

The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: "Fiction is far more powerful because reality is too complex".

Too often, I have seen well-analyzed PLM projects that were "killed" by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive in justifying the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, look at the 2015 TED talk above to grasp the concept and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example, in Europe, but with no success.

The Model-Based Enterprise is becoming more and more the apparent future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year in my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay on their customers' lowest common denominator: the 2D drawing. For me, a model-based definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, the book illustrates the complexity of PLM. It is about people and culture, and about many different, often disconnected processes. Everyone puts their particular discipline in the center of importance. If you believe PLM is all about the best technology only, read this book and learn how many other aspects are also relevant.

I wrote about the book in 2020 in Products2019 – a must-read if you are new to PLM, in case you want to read more details. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the era when mechatronics became important, and next the era when systems (hardware and software) became important.

We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked in their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …..

There are more books on my desk that have helped me understand the past or shape the future. As this is a blog post, I will not discuss more books this time, as I am reaching my 1500 words.

Still books worthwhile to read – click on their images to learn more:

I discussed this book twice last year: an introduction in PLM and Modularity and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned this book in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional PLM. You could see it as an introduction to System Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will focus further on PLM education. So stay tuned and keep on learning.

As promised in my early November post – The road to model-based and connected PLM (part 9 – CM), I come back with more thoughts and ideas related to the future of configuration management. Moving from document-driven ways of working to a data-driven and model-based approach fundamentally changes how you can communicate and work efficiently.

Let's be clear: configuration management is first of all about risk management, ensuring your company's business remains sustainable, efficient, and profitable.

By providing the appropriate change processes and guidance, configuration management avoids costly mistakes and iterations during all phases of a product's lifecycle and guarantees the quality of the product and its information to ensure safety.

Companies that have not implemented CM practices probably have not observed these issues. Or they have not realized that the root cause of these issues is a lack of CM.

Similar to what is said in smaller companies related to PLM, CM is often seen as an overhead, as employees believe they thoroughly understand their products. In addition, CM is seen as a hurdle to innovation because of the standardization of practices. So yes, they think it is normal that there are sometimes problems. That’s life.

I already wrote about this topic in 2010 in PLM, CM and ALM – not sexy 😦, where ALM means Asset Lifecycle Management, my focus at that time.

Hear it from the experts

To shape the discussion related to the future of Configuration Management, I had a vivid discussion with three thought leaders in this field: Lisa Fenwick, Martijn Dullaart and Maxime Gravel. A short introduction of the three of them:

Lisa Fenwick, VP Product Development at CMstat, a leading company in Configuration Management and Data Management software solutions and consulting services for aviation, aerospace & defense, marine, and other high-tech industries. She has over 25 years of experience with CM and Deliverables Management, including both government and commercial environments.

Ms. Fenwick has achieved CMPIC SME, CMPIC CM Assessor, and CMII-C certifications. Her experience includes implementing CM software products, CM-related consulting and training, and participation in the SAE and IEEE standards development groups.

Martijn Dullaart is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress. Martijn has his own blog mdux.net, and you might have seen him recently during the PLM Roadmap & PDT Fall conference in November – his thoughts about the CM future can be found on his blog here.

Maxime Gravel, Manager Model-Based Engineering at Moog Inc., a worldwide designer, manufacturer, and integrator of advanced motion control products. Max has been the director of the model-based enterprise at the Institute for Process Excellence (IPX) and Head of Configuration and Change Management at Gulfstream Aerospace, which certified the first aircraft in a 3D Model-Based Environment.

What we discussed:

We had an almost one-hour discussion related to the following points:

  • The need for Enterprise Configuration Management – why and how
  • The needed change from document-driven to model-based – the impact on methodology and tools
  • The "neural network" of data – connecting CM to all other business domains, a similar view as seen from the PLM domain.

I kept from our discussion the importance of planning – as seen in the CMstat image on the left.

Planning means deciding which data you need to manage and how you will manage it. How often do you do this in your company's projects?

Next, all participants stressed the importance of education and training on this topic – get educated. Configuration Management is not a topic that is taught at schools. Early next year, I will come back to education, as its benefits are often underestimated. Not everything can be learned by "googling."

Conclusion

The journey towards a model-based and data-driven future is not a quick one to be realized by new technologies. However, it is interesting to learn that the future of connected data (the "neural network") allows organizations to implement both CM and PLM in a similar manner, using graph databases and automation. When executed at the enterprise level, the result will be that CM and PLM become natural practices instead of separate, siloed, system-related disciplines.

Most of the methodology is there; making the implementation smooth and embedded in organizations will be the topic to learn. Join us in discussing and learning!

 

When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.

From coordinated to connected sounds like a business change;

however, it all depends on technology. And here I have to thank Marc Halpern (Gartner's Research VP, Engineering and Design Technologies) again, who came up with the brilliant scheme below:

So now it is time to address the last point from my starting post:

Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics in software changes.

Configuration management at this moment

PLM and CM are often considered overlapping. My March 2019 post PLM and Configuration Management – a happy marriage? shares some thoughts related to this point.

Does having PLM or PDM installed mean you have implemented CM? There is confusion because revision management is often considered the same as configuration management. Read my March 2020 post What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, graph-database-based, flexible PLM solution.

To hear it from a CM-side,  I discussed it with Martijn Dullaart in my February 2021 post: PLM and Configuration Management. We also zoomed in on CM2 in this post as a methodology.

Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress.

As mentioned before in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference starting this upcoming week.

In this post, I want to talk about the CM future. For understanding the current situation, you can find a broad explanation here on Wikipedia. Have a look at CM in the context of the product lifecycle, ensuring that the product As-Specified and As-Designed information matches the As-Built and As-Operated product information.

A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages. CM originated from the Aerospace and Defense industry for that reason. However, companies in other industries might have implemented CM practices too. Either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.

Historically, configuration management addressed the needs of "slow-moving" products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring consistency of all referenced datasets was often a manual process.

On purpose, I wrote "referenced datasets," as the information was not connected in a single environment most of the time. The identifier of a dataset (an item or a document) was the primary information carrier used for mentally connecting other artifacts to keep consistency.

The Institute of Process Excellence (IPX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.

As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.

As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.

Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat’s EPOCH CM tool is an example of such software. In addition, on their website, you can find excellent articles explaining the history and their future thoughts related to CM.

The future will undoubtedly be a connected, model-based, software-driven environment. Therefore, configuration management processes will naturally have to change. (An impressive buzzword sentence; still, I hope you get the message.)

From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.

Configuration Management – the future

The transition to a data-driven and model-based infrastructure has raised the following questions:

  • How to deal with the granularity of data? Each dataset needs to be validated. For example, in the document-based approach, a document (a collection of datasets) is validated as a whole. How can this be done efficiently at dataset level?
  • The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now, we need a mix of hardware and software CM practices.
  • Due to the increased complexity of products (or systems) and the rapid changes caused by software versions, how do we guarantee the As-Operated product still matches the As-Designed / As-Certified definitions?

I don’t have answers to these questions. I only share observations and trends I see in my actual world.

Granularity of data

The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.

The image on the left, borrowed from Erik Herzog's presentation at the PLM Roadmap & PDT Fall conference in 2020, is a good illustration of the challenge.

At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.

You can find the agenda and Erik’s presentation here on day 2.

OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of the technical infrastructure.

Still, it is about people and processes first. Therefore, I am curious to learn from my readers who believe and experiment with such a federated infrastructure.
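Without claiming anything about the actual OSLC specification, the core idea of such a federated backbone can be sketched as link records pointing to artifacts that live in different systems, so consistency questions can be asked across system boundaries (all names and URIs below are made up).

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ArtifactRef:
    """Reference to a dataset owned by another system (requirements tool, PLM, MES, ...)."""
    system: str   # e.g. "requirements-tool", "plm", "mes"
    uri: str      # stable identifier exposed by that system (assumed to exist)

@dataclass
class TraceLink:
    relation: str        # e.g. "implements", "verifies"
    source: ArtifactRef
    target: ArtifactRef

def links_crossing(links: List[TraceLink], system_a: str, system_b: str) -> List[TraceLink]:
    """Select the links that bridge two systems - the places where consistency must be checked."""
    return [l for l in links
            if {l.source.system, l.target.system} == {system_a, system_b}]

links = [
    TraceLink("implements",
              ArtifactRef("requirements-tool", "https://req.example/REQ-12"),
              ArtifactRef("plm", "https://plm.example/parts/7")),
]
print(len(links_crossing(links, "requirements-tool", "plm")))  # 1
```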

More software

Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this theory, you treat software code as a part, with a part number and revision. In this way, you might believe configuration management practices do not have to change. However, there are some fundamental reasons why we should decouple hardware and software.

First, for the same hardware solution, there might be a whole collection of valid software versions, just like on your computer. How many valid software versions, even of the same application, can you run on this hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.

A computer, of course, is designed for running all kinds of software versions. However, modern products in the field, like cars, machines, electrical devices, all will have a similar type of software-driven flexibility.

For that reason, I believe that companies that deliver software-driven products should design a mechanism to check whether the combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it might be crucial to ensure invalid combinations cannot exist. Click on the image to learn more.
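A minimal sketch of such a validity mechanism, with hypothetical names: instead of forcing software into the BOM, maintain a compatibility rule set that can be evaluated for any hardware/software combination, at delivery and again in the field.

```python
from dataclasses import dataclass
from typing import Dict, Set

@dataclass
class CompatibilityMatrix:
    """Which software releases are approved for which hardware revision."""
    approved: Dict[str, Set[str]]  # hardware revision -> set of valid software versions

    def is_valid(self, hardware_rev: str, software_version: str) -> bool:
        return software_version in self.approved.get(hardware_rev, set())

matrix = CompatibilityMatrix(approved={
    "HW-A": {"1.2.0", "1.3.1"},
    "HW-B": {"1.3.1", "2.0.0"},
})

# Checked at delivery and again whenever the device reports its installed version.
print(matrix.is_valid("HW-A", "1.3.1"))  # True
print(matrix.is_valid("HW-A", "2.0.0"))  # False -> block the update or raise a deviation
```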

Solutions like Configit or pure::variants might lead to a solution. In February 2021, in PLM and Configuration Lifecycle Management, I discussed the unique features of their solution with Henrik Hulgaard, the CTO of Configit.

I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.

Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.

Increased complexity – the digital twin?

With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.

Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.

No need for on-site inspection or test-and-fix upgrades in the physical world. Needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on demand), we can validate it.

I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.

An eye-opener and an advocate for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM and working in a model-based environment. See you there?

Conclusion

Finally, we have reached the methodology part of this series, particularly the part related to configuration management and traceability in a very granular, digital environment.

After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat).  What would you ask them?

In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise, concluding that aiming for a single connected environment is mission impossible. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value chain-oriented (OEM, Supplier, Marketplace, Supply Chain hub).

In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:

Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.

 

System of Record and System of Engagement

In the image below, a slide from 2016, I show a simplified view of the difference between the current, coordinated approach and the future, connected approach. This picture might create the wrong impression that there are two different worlds: either you are document-driven, or you are data-driven.

In the follow-up of this presentation, I explained that companies need both environments in the future. The most efficient way of working for operations will be infrastructure on the right side, the platform-based approach using connected information.

For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers, disconnected regulatory bodies and probably crucial for configuration management.

The System of Record will probably remain a capability in every platform or cross-section of platform information. The Systems of Engagement will be the configured, real-time environment for anyone involved in active company processes, not only ERP or MES, but all execution.

Introducing SysLM and SLM

This summer, I received a copy of Martin Eigner's System Lifecycle Management book, which I am currently reading in my spare moments. I have always enjoyed Martin's presentations, and in many ways, we share similar ideas. Through his profession, Martin has spent more time on the academic aspects of product and system lifecycle than I have. On the other hand, I have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and I will certainly come back to it when I have finished.

A first impression: A great and interesting book for all. Martin and I share the same history of data management. Read all about this in his second chapter: Forty Years of Product Data Management

From PDM via PLM to SysLM is a chapter that everyone should read if you haven't lived it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin's book will be highly complementary, given the content he describes.

There is one point on which I am looking forward to feedback from the readers of this blog.

Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management(SysLM)?

In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company’s offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, would we use the term Solution Lifecycle Management as not only hardware and software might be part of the value proposition?

Martin consistently uses the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that neither abbreviation is unique or explicit enough. SysLM creates confusion with SysML (for dyslexic people or fast readers). SLM already has so many less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.

For the moment, I will use the abbreviation SLM, leaving it open whether it stands for System Lifecycle Management or Solution Lifecycle Management.

 

How to implement both approaches?

In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.

A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.

However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.

Software and services implementations are associated with a data-driven, model-based approach.

The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.

Depending on your company's value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling disconnected hardware, there is probably no need to change your internal PLM processes that much.

However, when you are moving to a connected business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.

A McKinsey concept that I have promoted several times illustrates a potential path. Note that the article was not written with a PLM mindset but with a business mindset.

What about Configuration Management?

The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry to guarantee appropriate performance and to reduce risk and the cost of fixing issues.

The challenge, however, is that configuration management processes are not designed to manage systems or solutions where dynamic updates can be applied, whether or not initiated by the customer.

This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).

For that reason, I am curious to learn more from Martijn Dullaart's presentation at the upcoming PLM Roadmap/PDT conference. The title of his session: The next disruption please …

In his abstract for this session, Martijn writes:

From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.

One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes extending the validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes for its hardware.

Next, the systems or solutions in the field will report (or validate) their configuration against validation rules. A topic that requires a long discussion and more than this blog post, potentially a full conference.
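To illustrate this idea, here is a minimal sketch, with hypothetical field names and rules, of a system in the field reporting its configuration and having it checked against validation rules rather than against an enumerated list of approved configurations:

```python
# Sketch of the idea: a system in the field reports its as-running configuration,
# which is checked against validation rules instead of against a full list of every
# approved configuration. Rule format, identifiers and values are hypothetical.

reported_config = {"hardware": "PUMP-100 rev4", "software": (3, 2), "service_pack": "SP1"}

rules = [
    lambda c: not c["hardware"].endswith("rev4") or c["software"] >= (3, 0),  # rev4 requires software 3.0+
    lambda c: c["service_pack"] in {"SP1", "SP2"},                            # only supported service packs
]

violations = [i for i, rule in enumerate(rules) if not rule(reported_config)]
print("Configuration valid" if not violations else f"Rule(s) violated: {violations}")
```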

Therefore, I am looking forward to participating in the CIMdata/PDT Fall conference and picking up the discussion towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.

 

Conclusion

A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company's value proposition.

If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.

Please, share your thoughts in the comments.

 

My previous post introducing the concept of connected platforms created some positive feedback and some interesting questions. For example, the question from Maxime Gravel:

Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?

is one of the questions I am trying to understand too. You can see my short comment in the comments here. However, while discussing with other experts in the CM domain, we should paint the path forward, because if we cannot solve this type of question, the value of connected platforms will be disputable.

It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 to 20 years before we are familiar with the concepts.

It took a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster, although the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative industries at this time. Other industries or startups might lead us into the future faster.

Although I prefer discussing methodology, I need to clarify some more technical points before moving into that area. My apologies for writing it in such a simple manner; this information should be accessible to the majority of readers.

What does data-driven mean?

I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, where each dataset contains a single aspect of information in a standardized manner, so it becomes accessible to outside tools.

A document is not a dataset, as it often includes a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status.

An identifier, to be able to create connections to other datasets: traceability or, in modern words, a digital thread.
A classification, as the classification will determine the type of information the dataset contains and potentially a set of mandatory attributes.
A status, to understand if the dataset is stable or still in work.
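As a minimal sketch of these three ingredients, the Python structure below shows what such a dataset could look like. The class name, attributes and values are hypothetical and purely illustrative:

```python
from dataclasses import dataclass, field

# Minimal sketch of a dataset in a data-driven environment: every dataset carries
# an identifier (for traceability), a classification (to determine its meaning and
# mandatory attributes) and a status. Names and values are illustrative only.

@dataclass
class Dataset:
    identifier: str                  # unique ID, used to connect datasets (digital thread)
    classification: str              # e.g. "part", "requirement", "test case"
    status: str = "in work"          # e.g. "in work", "released", "obsolete"
    attributes: dict = field(default_factory=dict)

part = Dataset("PRT-000123", "part", "released", {"description": "bracket", "material": "AlMg3"})
```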

Examples of a data-driven approach – the item

The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.

Part classification can be a topic on its own, and every industry has its taxonomy.

Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.

In a data-driven manner, a part can occur in several Bill of Materials – an example of a single definition consumed in other places.

When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.

When the part would change in a document-driven environment, the effort is much higher.

First, all documents in which this part occurs need to be identified. Then the impact of the change needs to be managed through document versions, which will lead to other related changes if you want to keep the information correct.
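A minimal sketch of this where-used capability, assuming hypothetical identifiers and a simplified in-memory list of BOM relations:

```python
# Sketch of why impact analysis is easy in a data-driven environment: the part occurs
# once as a dataset, and its usage in several BOMs is stored as relations that can be
# queried directly ("where used"). All identifiers below are hypothetical.

bom_lines = [
    {"parent": "ASSY-100", "child": "PRT-000123", "qty": 2},
    {"parent": "ASSY-200", "child": "PRT-000123", "qty": 1},
    {"parent": "ASSY-200", "child": "PRT-000456", "qty": 4},
]

def where_used(part_id: str) -> list[str]:
    """Return all assemblies that consume the given part."""
    return [line["parent"] for line in bom_lines if line["child"] == part_id]

print(where_used("PRT-000123"))  # ['ASSY-100', 'ASSY-200'] - both BOMs are impacted by a change
```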

Examples of a data-driven approach – the requirement

Another example illustrating the benefits of a data-driven approach is implementing requirements management, where requirements become individual datasets.  Often a product specification can contain hundreds of requirements, addressing the needs of different stakeholders.

In addition, several combinations of requirements need to be handled by different disciplines: mechanical, electrical, software, quality and legal, for example.

As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.

The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).

Or requirements could be linked to test criteria and test cases, without the need to manage documents and make sure you always work with the last updated document.

As you will see, requirements also need an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
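The sketch below, with hypothetical requirement IDs and test cases, illustrates how requirements managed as datasets can be grouped for a supplier and linked to test cases:

```python
# Sketch of requirements managed as individual datasets: each requirement has an
# identifier, a classification (used here for grouping per discipline) and a status,
# and can be linked to test cases without freezing a specification document.
# All content is illustrative.

requirements = [
    {"id": "REQ-001", "classification": "mechanical", "status": "released", "text": "Housing shall be IP67"},
    {"id": "REQ-002", "classification": "software",   "status": "in work",  "text": "Boot time below 5 s"},
    {"id": "REQ-003", "classification": "mechanical", "status": "dropped",  "text": "Weight below 1.2 kg"},
]

links_to_tests = {"REQ-001": ["TC-11"], "REQ-002": ["TC-20", "TC-21"]}

# Group released mechanical requirements, e.g. to push them to a supplier as a specification.
supplier_spec = [r for r in requirements if r["classification"] == "mechanical" and r["status"] == "released"]
print([r["id"] for r in supplier_spec])   # ['REQ-001']
print(links_to_tests.get("REQ-002", []))  # test cases verifying REQ-002
```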

Data-driven and Models – the 3D CAD model

3D PDF Model

When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.

Although we use a 3D representation at each product lifecycle stage, most companies do not have a digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.

I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.

If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.

Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing have been discussed in the blog post PLM and Model-Based Definition

It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.

If this is not the case, the projected game-changers will not occur as they become too costly.

Data-driven and mathematical models

To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. For example, if you look at climate models, weather models or COVID infection models, you will see that they all lead to discussions from so-called experts who believe a model should be 100% correct and that any exception shows the model is wrong.

It is not that the model is wrong; the expectations are false.

For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.

For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.

There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.

The sharable parameters should then again be datasets in a data-driven environment. Using standardized diagrams, like SysML or UML, enables the objects used in the diagrams to become datasets.

I will not dive further into the modeling details as I want to remain at a high level.

It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.
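As a minimal, hypothetical sketch of this principle, the example below keeps a parameter as a dataset of its own, referenced by both a simulation model and a CAD model, so both consume the same single definition:

```python
# Sketch of parameter sharing between models: the parameter is a dataset of its own,
# and the behavioral and geometric models both reference it by identifier instead of
# keeping private copies. Names and numbers are illustrative.

parameters = {
    "PAR-max-torque": {"value": 42.0, "unit": "Nm", "status": "released"},
}

simulation_model = {"id": "SIM-gearbox", "uses": ["PAR-max-torque"]}
cad_model        = {"id": "CAD-gearbox", "uses": ["PAR-max-torque"]}

def resolve(model: dict) -> dict:
    """Fetch the current released values of all parameters a model references."""
    return {p: parameters[p]["value"] for p in model["uses"] if parameters[p]["status"] == "released"}

print(resolve(simulation_model))  # both models consume the same single definition
```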

What does data-driven imply?

 

I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions.

  1. Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
  2. Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.
  3. Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
  4. Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
  5. A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
  6. I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
  7. Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM

 

Conclusion

Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.

After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.

 

 

 

 

After “The Doctor is IN,” now again a written post in the category of PLM and complementary practices/domains. In January, I discussed together with Henrik Hulgaard from Configit the complementary value of PLM and CLM (Configuration Lifecycle Management). For me, CLM is a synonym for Product Configuration Management.

PLM and Complementary Practices (feedback)

As expected, readers were asking the question:

“What is the difference between CLM (Configuration Lifecycle Management) and CM(Configuration Management)?”

Good question.

As the complementary role of CM is also a part of the topics to discuss, I am happy to share this blog today with Martijn Dullaart. You probably know Martijn if you are actively following topics on PLM and CM.

Martijn has his own blog mdux.net, and you might have seen him recently in Jenifer Moore’s PLM TV-episode: Why CM2 for Faster Change and Better Documentation. Martijn is the Lead Architect for Enterprise Configuration Management at ASML (Our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress. Let us start.

Configuration Management and CM2

Martijn, first of all, can you bring some clarity to the terminology? When discussing Configuration Management, what is the pure definition, what is CM2 as a practice, and what is IpX's role? And please explain where you fit into this picture.

Classical CM focuses mainly on the product, the product definition, and actual configurations like as-built and as-maintained of the product. CM2 extends the focus to the entire enterprise, e.g., the processes and procedures (ways of working) of a company, including the IT and facilities, to support the company’s value stream.

CM2 expands the scope to all information that could impact safety, security, quality, schedule, cost, profit, the environment, corporate reputation, or brand recognition.

Basically, CM2 shifts the focus to Integrated Process Excellence and promotes continual improvement.

Next to this, CM2 provides the WHAT and the HOW, something most standards lack. My main focus is still around the product and promoting the use of CM outside the product domain.

For all CM related documentation, we are already doing this.

Configuration Management and PLM

People claim that if you implement PLM as an enterprise backbone, not as an engineering tool, you can do Configuration Management with your PLM environment.

What is your opinion?

Yes, I think that this is possible, provided that the PLM tool has the right capabilities. Though the question should be: is this the best way to go about it? For instance, some parts of Configuration Management are more transactionally oriented, e.g., registering the parts you build in or out of a product.

Other parts of CM are more iterative in nature, e.g., doing impact analysis and making an implementation plan. I am not saying this cannot be done in a PLM tool as an enterprise backbone. Still, the nature of most PLM tools is to support iterative types of work rather than a transactional type of work.

I think you need some kind of enterprise backbone that manages the configuration as an As-Planned/As-Released baseline. A baseline that shows not only the released information but also all planned changes to the configuration.

Because the source of information in such a baseline comes from different tools, you need an overarching tool to connect everything. For most companies, given their current state of enterprise applications, this means they require an overarching system.

Preferably, I would like to use the data directly from the sources. Still, connectivity and performance are not yet at a level where we can do this. Cloud and modern application and database architectures are very promising to this end.
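As a minimal illustration of such an As-Planned/As-Released baseline, the sketch below shows both the released definition and the planned changes for one item. The identifiers and fields are hypothetical and not tied to any specific tool:

```python
# Minimal sketch of an As-Planned/As-Released baseline: the baseline exposes both the
# currently released definition and all planned (approved but not yet released) changes
# to a configuration. Identifiers and fields are hypothetical.

baseline = {
    "item": "ASSY-100",
    "released": {"revision": "B", "effectivity": "serial 001-120"},
    "planned_changes": [
        {"change": "ECN-2021-045", "target_revision": "C", "planned_effectivity": "serial 121-"},
    ],
}

def as_planned_view(b: dict) -> str:
    """Summarize the released state plus the next planned change, if any."""
    nxt = b["planned_changes"][0] if b["planned_changes"] else None
    summary = f'{b["item"]} released rev {b["released"]["revision"]}'
    return summary + (f', planned rev {nxt["target_revision"]} via {nxt["change"]}' if nxt else "")

print(as_planned_view(baseline))
```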

 

Configuration Management for Everybody?

I can imagine companies in the Aerospace industry need to have proper configuration management for safety reasons. Also, I can imagine that proper configuration management can be relevant for other industries. Do they need to be regulated, or are there other reasons for a company to start implementing CM processes?

I will focus the first part of my answer within the context of CM for products only.

Basically, all products are regulated to some degree. Aerospace & Defense and Medical Device and Pharma are highly regulated for obvious reasons. Other industries are also regulated, for example, through environmental regulations like REACH, RoHS, WEEE or safety-related regulations like the CE marking or FCC marking.

Customers can also be an essential driver for the need for CM. If, as a customer, you buy expensive equipment, you expect that the supplier of that equipment can deliver per commitment and can maintain and upgrade the equipment efficiently, with as few disruptions to your operations as possible.

Not just customers but also consumers are critical of the traceability of the product and all its components.

Even if you are sitting on a rollercoaster, you presume the product is well designed and maintained. In other words, there is often a case to be made to apply proper configuration management in any company. Still, the extent to which you need to implement it may vary based on your needs.

 

The need for Enterprise Configuration Management is even more significant because one of the hardest things is to change the way an organization works and operates.

Often there are different ways of doing the same thing. There is a lot of tribal knowledge, and ways of working are not documented so that people can easily find them, let alone structured and linked so that you can do an impact analysis when you want to introduce a change in your organization.

 

CM and Digital Transformation

One of the topics that we both try to understand better is how CM will evolve in the future when moving to a more model-based approach. In the CM-terminology, we still talk about documents as information objects to be managed. What is your idea of CM and a model-based future?

It is indeed a topic where probably new or changed methodology is required, and I have already started describing CM topics in several posts on my enterprise MDUX blog. Some of the relevant posts in this context are:

First, let me say that model-based is the future, although, at the same time, the CM aspects are often overlooked.

When managing changes, too much detail makes estimating cost and effort for a business case more challenging, and planning information that is too granular is not desirable. Therefore, CM2 looks at datasets. Datasets should be as small as possible but not smaller. Datasets are sets of information that need to be released as a whole, yet they can be released independently from other datasets. For example, a single BOM line item is not a dataset, but the complete set of BOM line items that makes up the BOM of an assembly is considered a dataset. I can release a BOM independently from a test plan.

Data models need to facilitate this. However, today, in many PLM systems, a BOM and the metadata of a part use the same revision. This means that to change the metadata, I need to revise the BOM, while the BOM itself might not change. Some changes to metadata might not be relevant for a supplier, and communicating such changes to your supplier could create confusion.
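A minimal, hypothetical sketch of what independently revisioned datasets could look like, where a metadata change does not force a BOM revision:

```python
# Sketch of datasets with independent revisions: the part metadata and the BOM of the
# assembly are separate datasets, so a metadata change does not force a BOM revision.
# Structure and identifiers are illustrative only.

part_metadata = {"id": "PRT-000123", "revision": "C", "description": "bracket, anodized"}
assembly_bom  = {"id": "ASSY-100-BOM", "revision": "A",
                 "lines": [{"child": "PRT-000123", "qty": 2}]}

# Updating the description revises only the metadata dataset; the BOM dataset is untouched.
part_metadata["description"] = "bracket, anodized, RoHS compliant"
part_metadata["revision"] = "D"
print(assembly_bom["revision"])  # still 'A' - no irrelevant change communicated to the supplier
```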

I know some people think this is about document- versus model-centric, but it is not. A part is identified in the ‘physical world’ by its part ID. Even if you talk about allowing revisions in the supply chain, by including the revision in the part ID you create a new identifier. Now every new revision will end up in a different stock location. Is that what we want?

In any case, we are still in the early days, and the thinking about this topic has just begun and needs to take shape in the coming year(s).

 

CM and/or CLM?

As in my shared blog post with Henrik Hulgaard related to CLM, can you make a clear differentiation between the two domains for the readers?

 

Configuration Lifecycle Management (CLM)  is mainly positioned towards Configurable Products and the configurable level of the product.

 

Why do I think this? Even though Configit's CLM definition states that “Configuration Lifecycle Management (CLM) is the management of all product configuration definitions and configurations across all involved business processes applied throughout the lifecycle of a product,” it also states:

  • “CLM differs from other Enterprise Business Disciplines because it focuses on cross-functional use of configurable products.”
  • “Provides a Single Source of Truth for Configurable Data”
  • “handles the ever-increasing complexity of Configurable Products“.

I find that Configuration Lifecycle Management is a core Configuration Management practice you need to have in place for configurable products. The dependencies you need to manage are enormously complex: software parameters that depend on specific hardware, hardware-to-hardware dependencies, commercial variants, and options.

Want to learn more?

In this post, we just touched the surface of PLM and Configuration Management. Where can an interested reader find more information related to CM for their company?

 

To become trained in CM2, people can reach out to the Institute for Process Excellence, a company that focuses on consultancy and methodology for many aspects of a modern, digital enterprise, including Configuration Management.

And there is more out there, e.g.:

Conclusion

Thanks, Martijn, for your clear explanations. People working seriously in the PLM domain managing the full product lifecycle should also learn and consider Configuration Management best practices. I look forward to a future discussion on how to perform Configuration Management in a model-based environment.

PLM, CLM, and CM – mind the overlap

 

 

 

 
