
In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise. I concluded that aiming for a single connected environment is mission impossible. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value chain oriented (OEM, Supplier, Marketplace, Supply Chain hub).

In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:

Data-driven does not mean we no longer need documents (read: electronic files). Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.

 

System of Record and System of Engagement

In the image below, a slide from 2016, I show a simplified view of the difference between the current, coordinated approach and the future, connected approach. This picture might create the wrong impression that there are two different worlds – either you are document-driven, or you are data-driven.

In the follow-up to this presentation, I explained that companies will need both environments in the future. The most efficient way of working for operations will be the infrastructure on the right side, the platform-based approach using connected information.

For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers and disconnected regulatory bodies, and they are probably crucial for configuration management.

The System of Record will probably remain as a capability in every platform or as a cross-section of platform information. The Systems of Engagement will be the configured, real-time environment for anyone involved in active company processes – not only ERP or MES, but all execution.

Introducing SysLM and SLM

This summer, I received a copy of Martin Eigner's System Lifecycle Management book, which I am reading in my spare moments. I have always enjoyed Martin's presentations, and in many ways, we share similar ideas. Martin, from his profession, has spent more time on the academic aspects of the product and system lifecycle than I have. On the other hand, I have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and I will certainly come back to it when I have finished.

A first impression: a great and interesting book for all. Martin and I share the same history of data management – read all about this in his second chapter: Forty Years of Product Data Management.

From PDM via PLM to SysLM is a chapter that everyone should read if you haven't lived through it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin's book will be highly complementary, given the content he describes.

There is one point on which I am looking forward to feedback from the readers of this blog:

Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management (SysLM)?

In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company’s offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, would we use the term Solution Lifecycle Management as not only hardware and software might be part of the value proposition?

Martin consistently uses the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that neither abbreviation is unique or explicit enough. SysLM creates confusion with SysML (for dyslexic people or fast readers). SLM already has many other, less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.

For the moment, I will use the abbreviation SLM, leaving open whether it stands for System Lifecycle Management or Solution Lifecycle Management.

 

How to implement both approaches?

In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.

A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.

However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.

Software and services implementations are associated with a data-driven, model-based approach.

The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.

Depending on your company's value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling disconnected hardware, there is probably no need to change your internal PLM processes that much.

However, when you are moving to a connected business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.

A McKinsey concept that I have promoted several times illustrates a potential path – note that the article was written with a business mindset, not a PLM mindset.

What about Configuration Management?

The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry, to guarantee appropriate performance and to reduce the risk and cost of fixing issues.

The challenge, however, is that configuration management processes are not designed to manage systems or solutions, where dynamic updates can be applied, whether or not initiated by the customer.

This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).

For that reason, I am curious to learn more from Martijn Dullaart's presentation at the upcoming PLM Roadmap/PDT conference. The title of his session: The next disruption please …

In his abstract for this session, Martijn writes:

From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.

One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes, extending validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes only for its hardware.

Next, the systems or solutions in the field will report (or validate) their configuration against validation rules. This topic requires a longer discussion than a blog post allows, potentially a full conference.
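As a minimal sketch of this idea: the company maintains validation rules, and a configuration reported from the field is checked against them instead of being enumerated upfront. All rules and identifiers below are invented for illustration.

```
# Hypothetical validation rules: allowed software versions per hardware revision
validation_rules = {
    "HW-rev-A": {"SW-1.0", "SW-1.1"},
    "HW-rev-B": {"SW-1.1", "SW-2.0"},
}

def validate(reported):
    """Check a configuration reported from the field against the rules,
    instead of describing every validated configuration upfront."""
    allowed = validation_rules.get(reported["hardware"], set())
    return reported["software"] in allowed

print(validate({"hardware": "HW-rev-B", "software": "SW-2.0"}))  # True
print(validate({"hardware": "HW-rev-A", "software": "SW-2.0"}))  # False
```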

Therefore, I am looking forward to participating in the CIMdata/PDT Fall conference and picking up the discussions towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.

 

Conclusion

A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company's value proposition.

If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.

Please, share your thoughts in the comments.

 

My previous post introducing the concept of connected platforms created some positive feedback and some interesting questions. For example, the question from Maxime Gravel:

Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?

is one of the questions I am trying to understand too. You can see my short comment in the comments here. However, together with other experts in the CM domain, we should paint the path forward. Because if we cannot solve this type of question, the value of connected platforms will be disputable.

It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 – 20 years until we are familiar with the concepts.

It took a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster: as the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative at this time, other industries or startups might lead us into the future faster.

Although I prefer discussing methodology, I believe I need to clarify some more technical points before moving forward. My apologies for writing it in such a simple manner; this information should be accessible to the majority of readers.

What does data-driven mean?

I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, where each dataset contains a single aspect of information in a standardized manner, so it becomes accessible to outside tools.

A document is not a dataset, as it often contains a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status:

  • An identifier, to create connections with other datasets – traceability or, in modern words, a digital thread.
  • A classification, as the classification will determine the type of information the dataset contains and potentially a set of mandatory attributes.
  • A status, to understand whether the dataset is stable or still in work.
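To make this tangible, here is a minimal sketch of what such a dataset could look like. The class and the example part are hypothetical, purely for illustration.

```
from dataclasses import dataclass, field

@dataclass
class DataSet:
    """A hypothetical dataset: one aspect of information, identifiable,
    classified, and carrying a lifecycle status."""
    identifier: str       # unique ID - the anchor for digital-thread relations
    classification: str   # determines the meaning and mandatory attributes
    status: str           # e.g. "in work", "released", "obsolete"
    attributes: dict = field(default_factory=dict)

# Example: a released part dataset, readable by any outside tool
part = DataSet(identifier="P-100045.A",
               classification="mechanical part",
               status="released",
               attributes={"material": "AlMg3"})
```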

Examples of a data-driven approach – the item

The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.

Part classification can be a topic on its own, and every industry has its taxonomy.

Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.

In a data-driven manner, a part can occur in several Bills of Material – an example of a single definition consumed in other places.

When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.

When the same part changes in a document-driven environment, the effort is much higher.

First, all documents where this part occurs need to be identified. Then, the impact of the change needs to be managed in document versions, which will lead to other related changes if you want to keep the information correct.
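To illustrate the difference, here is a small sketch of a where-used analysis in a data-driven environment; the BOM and part identifiers are invented for the example.

```
# Hypothetical BOMs, each referencing part datasets by identifier only
boms = {
    "BOM-PUMP-01":  ["P-100045.A", "P-200010.B"],
    "BOM-VALVE-07": ["P-100045.A", "P-300999.A"],
}

def where_used(part_id):
    """Return every BOM consuming the given part - a single query when
    parts are datasets, a document hunt in a document-driven world."""
    return [bom for bom, parts in boms.items() if part_id in parts]

print(where_used("P-100045.A"))  # ['BOM-PUMP-01', 'BOM-VALVE-07']
```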

Examples of a data-driven approach – the requirement

Another example illustrating the benefits of a data-driven approach is implementing requirements management, where requirements become individual datasets.  Often a product specification can contain hundreds of requirements, addressing the needs of different stakeholders.

In addition, several combinations of requirements need to be handled by different disciplines: mechanical, electrical, software, quality and legal, for example.

As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.

The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).

Or requirements could be linked to test criteria and test cases, without the need to manage documents and make sure everyone works with the last updated document.

As you can see, requirements also need an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
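A small sketch of that idea, with invented requirement data: filtering on classification and status produces a supplier specification without ever freezing a document.

```
from dataclasses import dataclass

@dataclass
class Requirement:
    identifier: str       # to manage digital relations
    classification: str   # to allow grouping per discipline
    status: str           # "in work" / "released" / "dropped"
    text: str

requirements = [
    Requirement("REQ-001", "mechanical", "released", "Housing shall be IP67."),
    Requirement("REQ-002", "software",   "in work",  "Firmware shall support OTA updates."),
    Requirement("REQ-003", "mechanical", "dropped",  "Housing shall be die-cast."),
]

# Push a specification to a supplier: only the released mechanical requirements
supplier_spec = [r for r in requirements
                 if r.classification == "mechanical" and r.status == "released"]
print([r.identifier for r in supplier_spec])  # ['REQ-001']
```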

Data-driven and Models – the 3D CAD model

3D PDF Model

When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.

Although we use a 3D representation at each product lifecycle stage, most companies do not have digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.

I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.

If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.

Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing have been discussed in the blog post PLM and Model-Based Definition

It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.

If this is not the case, the projected game-changers will not occur as they become too costly.

Data-driven and mathematical models

To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. Look, for example, at climate models, weather models or COVID infection models.

There we see that they all lead to discussions from so-called experts who believe a model should be 100 % correct and that any exception shows the model is wrong.

It is not that the model is wrong; the expectations are false.

For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.

For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.

There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.

These sharable parameters should, again, be datasets in a data-driven environment. Using standardized diagrams, like SysML or UML, enables the objects used in the diagrams to become datasets.
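As a thought experiment, the sketch below shows a shared parameter dataset consumed by two different models; the parameter name, values and functions are all invented.

```
# A hypothetical shared parameter dataset - defined once, consumed everywhere
parameters = {
    "max_operating_temp": {"value": 85, "unit": "degC", "status": "released"},
}

# Consumer 1: a requirement check referencing the shared dataset
def requirement_met(measured_temp):
    return measured_temp <= parameters["max_operating_temp"]["value"]

# Consumer 2: a (highly simplified) thermal margin calculation
def thermal_margin(ambient_temp):
    return parameters["max_operating_temp"]["value"] - ambient_temp

# Change the parameter once; every consuming model follows automatically
print(requirement_met(80), thermal_margin(25))  # True 60
```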

I will not dive further into the modeling details as I want to remain at a high level.

It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.

What does data-driven imply?

 

I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions:

  1. Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
  2. Data-driven does not mean we no longer need documents (read: electronic files). Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.
  3. Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
  4. Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
  5. A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
  6. I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
  7. Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework, due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM

 

Conclusion

Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.

After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.

 

 

 

 

So far, I have been discussing PLM experiences and best practices that have changed due to introducing electronic drawings and affordable 3D CAD systems for the mainstream. From vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.

Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.

Now, thanks to digitalization, we can connect information in real-time and give every stakeholder in the company's business almost real-time visibility of what is happening (if allowed). We have seen the rise of platformization, where the benefits come from real-time connectivity within an ecosystem.

Apple, Amazon, Uber and Airbnb are the non-manufacturing examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), and with suppliers and customers. All connected through information, managed in data elements instead of documents – I call it connected PLM.

Vendors have already shared their PowerPoints, movies, and demos of how the future would look in the ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.

The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.

The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.

In the upcoming posts, I will share with you what I see, what I deduce from that and what I think would be the next step to learn.

I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.

I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (approx. 1500 words) and the ideas and contributions coming from you and me.

What is platformization?

In our day-to-day life, we are more and more used to direct interaction between resellers and service providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.

We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.

The platform is responsible for direct communication between all stakeholders.

The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.

The five potential business platforms

The ERP platform
This platform is mainly dedicated to the company's execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect as many stakeholders as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).

Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform.  A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.

The CRM platform

This platform mainly focuses on customer-related activities, and as you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.

The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.

For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (from non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.

The IoT platform

These platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.

However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (from non-PLM/ERP-related vendors) are Hitachi and Microsoft.

The Product Innovation Platform (PIP)

All the above platforms would not have a reason to exist if there were not an environment where products are invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio and more.

The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all previous platforms, the PIP cannot exist on its own and requires connectivity with the other platforms where applicable.

Look below at the CIMdata definition of a Product Innovation Platform.

You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavors): Aras, Dassault Systèmes, PTC and Siemens.

Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.

Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.

Standard interfaces or ad-hoc connections?

Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.

However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.

This would be a PIP/ERP platform collaboration scenario.

Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.

The company might decide to build a custom app or dashboard to support such a role, combining real-time data from the relevant platforms using standard interfaces (preferred) or APIs, web services, REST services, microservices (for specialists) and, currently in fashion, Low-Code development platforms, which allow users to combine data services from different platforms without being experts in coding.
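As a hedged illustration of what such a role-specific dashboard could do, the sketch below combines data from two hypothetical REST endpoints; the URLs and JSON fields are invented for the example.

```
import json
from urllib.request import urlopen

# Hypothetical REST endpoints exposed by two platforms (replace with real ones)
PIP_URL = "https://pip.example.com/api/parts/P-100045"   # Product Innovation Platform
ERP_URL = "https://erp.example.com/api/stock/P-100045"   # ERP platform

def fetch(url):
    """Retrieve a JSON resource via a platform's standard interface."""
    with urlopen(url) as response:
        return json.load(response)

# A role-specific dashboard combining both platforms in (near) real time
part = fetch(PIP_URL)    # e.g. {"id": "P-100045", "status": "released"}
stock = fetch(ERP_URL)   # e.g. {"id": "P-100045", "quantity": 42}
print(f"{part['id']} ({part['status']}): {stock['quantity']} pieces on stock")
```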

Without going too much into technology, the topics in this paragraph require an enterprise architecture and vision. It is opportunistic to think that your existing environment will evolve smoothly into a digital highway for the future by "fixing" demands per user. Your infrastructure is much more likely to end up congested as spaghetti.

In that context, I read last week an interesting post Low code: A promising trend or Pandora’s box. Have a look and decide for yourself

I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.

In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.

It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, still primarily relying on drawings. It will take us for sure one generation to switch from document-based engineering to model-based engineering.

Conclusion

In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.

In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.

Your opinion, Your contribution?

 

 

 

 

 

 

Last summer, I wrote a series of blog posts grouped by the theme “Learning from the past to understand the future”. These posts took you through the early days of drawings and numbering practices towards what we currently consider the best practice: PLM BOM-centric backbone for product lifecycle information.

You can find an overview and links to these posts on the page Learning from the past.

If you have read these posts, or if you have gone yourself through this journey, you will realize that all steps were more or less done evolutionarily. There were no disruptions. Affordable 3D CAD systems, new internet paradigms (interactive internet),  global connectivity and mobile devices all introduced new capabilities for the mainstream. As described in these posts, the new capabilities sometimes created friction with old practices. Probably the most popular topics are the whole Form-Fit-Function interpretation and the discussion related to meaningful part numbers.

What is changing?

In the last five to ten years, a lot of new technology has come into our lives. The majority of these technologies are related to dealing with data. Digital transformation in the PLM domain means moving from a file-based/document-centric approach to a data-driven approach.

A Bill of Material on the drawing has become an Excel-like table in a PLM system. However, an Excel file is still used to represent a Bill of Material in companies that have not implemented PLM.

Another example, the specification document has become a collection of individual requirements in a system. Each requirement is a data object with its own status and content. The specification becomes a report combining all valid requirement objects.

Related to CAD, the 2D drawing is no longer the deliverable as a document; the 3D CAD model with its annotated views becomes the information carrier for engineering and manufacturing.

And most important of all, traditional PLM methodologies have been based on a mechanical design and release process. Meanwhile, modern products are systems where the majority of capabilities are defined by software. Software has an entirely different configuration and lifecycle approach conflicting with a mechanical approach, which is too rigid for software.

The last two aspects, from 2D drawings to 3D Models and Mechanical products towards Systems (hardware and software), require new data management methods.  In this environment, we need to learn to manage simulation models, behavior models, physics models and 3D models as connected as possible.

I wrote about these changes three years ago:  Model-Based – an introduction, which led to a lot of misunderstanding (too advanced – too hypothetical).

I plan to revisit these topics in the upcoming months again to see what has changed over the past three years.

What will I discuss in the upcoming weeks?

My first focus is on participating in and contributing to the upcoming PLM Roadmap & PDT Spring 2021 conference. Here, speakers will discuss the need for reshaping the PLM Value Equation due to new emerging technologies – a topic that contributes perfectly to the future of PLM series.

My contribution will focus on the fact that technology alone cannot disrupt the PLM domain. We also have to deal with legacy data and legacy ways of working.

Next, I will discuss with Jennifer Herron from Action Engineering the progress made in Model-Based Definition, which fits best practices for today – a better connection between engineering and manufacturing. We will also discuss why Model-Based Definition is a significant building block required for realizing the concepts of a digital enterprise, Industry 4.0 and digital twins.

Another post will focus on the difference between the digital thread and the digital thread. Yes, it looks like I am writing the same words twice. However, you will see that, depending on its interpretation, one definition hangs on to the past while the other targets the future. Here, too, the differentiation is crucial if a maintainable Digital Twin is required.

Model-Based Systems Engineering (MBSE) in all its aspects needs to be discussed too. MBSE is crucial for defining complex products. Model-Based Systems Engineering is seen as a discipline to design products. Understanding data management related to MBSE will be the foundation for understanding data management in a Model-Based Enterprise. For example, how to deal with configuration management in the future?

 

Writing Learning from the past was an easy job, as explaining with hindsight is so much easier when you have lived through it. I am curious and excited about the outcome of "The Future of PLM". Writing about the future means you have digested the information coming to you, knowing that nobody has a clear blueprint for the future of PLM.

There are people and organizations working on this topic more academically; for example, read this post from Lionel Grealou related to the Place of PLM in the Digital Future. The challenge is that an academic future might be disrupted by unpredictable events, like COVID, or by disruptive technologies combined with an opportunity to succeed. Therefore, I believe it will be a learning journey for all of us, where we need to learn to give technology a business purpose. Business first – then technology.

 

No Conclusion

Normally, I close my post with a conclusion. At this moment, there is no conclusion, as the journey has just started. I look forward to debating and learning with practitioners in the field and working together on methodology and concepts that work in a digital enterprise. Join me on this journey. I will start sharing my thoughts in the upcoming months.

 

 

 

On March 22 this year, I wrote Time to Think (and act differently) in the middle of a changing world. We were entering a lockdown in the Netherlands due to the COVID-19 virus. As it was such a disruptive change, it was an opportunity for companies to adapt their current ways of working.

The reason for that post was my experience when discussing PLM-initiatives with companies. Often they have no time to sit down, discuss and plan their PLM targets as needed. Crucial people are too busy, leading to an implementation of a system that, in the best case, creates (some) benefits.

The well-known cartoon says it all. We are often too busy doing business as usual, making us feel comfortable. Only when it is too late are people forced to act. As the second COVID-19 wave seems to be starting in the Netherlands, I want to look back on what has happened so far in my eco-system.

Virtual Conferences

As people could no longer travel, traditional PLM-conferences could no longer be organized. What was going to be the new future for conferences? TECHNIA, apparently clairvoyant, organized their virtual PLM Innovation Forum as one of the first, at the end of April.

A more sustainable type of PLM-conference was already part of their plans, given the carbon footprint a traditional conference induces. With over 1000 participants, the virtual conference showed that being prepared for a virtual conference pays off during a pandemic.

Being first does not always mean being the best,  as we have to learn. While preparing my session for the conference, I felt the same excitement as for a traditional conference. You can read about my initial experience here: The weekend after the PLM Innovation Forum.

Some weeks later, having attended some other virtual conferences, I realized that some points should be addressed/solved:

  • Video conferencing is a must – without seeing people talking, it becomes a podcast.
  • Do not plan long conference days. It is hard to sit behind a screen for a full day. A condensed program makes it easier to attend.
  • Virtual conferences mean that they can be attended live from almost all around the globe. Therefore, finding the right timeslots is crucial for the audience – combined with the previous point – shorter programs.
  • Playing prerecorded sessions without a Q&A session should be avoided. It does not add value.
  • A conference is about networking and discussion – I have not seen a solution for this yet. Fifty percent of the conference value for me comes from face-to-face discussions and coffee meetings. A virtual conference needs to have private chat opportunities between attendees.

In the last quarter of this year, I will present at several, mostly local, conferences, sometimes mixed with "live" attendance for a limited number of attendees, if allowed.

And then there is the upcoming PLM Road Map & PDT Fall 2020 (virtual) conference on 17-18-19 November.

This conference has always been my favorite conference thanks to its continued focus on sharing experiences, most of the time, based on industry standards. We discuss topics and learn from each other. See my previous posts: The weekend after 2019 Day 1, 2019 Day 2, 2018 Day 1, 2018 Day2, 2017 Day 1, 2017 Day 2, etc.

The theme Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality has nothing to do with marketing. You can have a look at the full schedule here. Although there is a lot of buzz around Digital Thread, presenters discuss the reality and their plans

Later in this post, see the paragraph Digital Thread is not a BOM, I will elaborate on this theme.

Getting tired?

I discovered I am getting tired as I am missing face-to-face interaction with people. Working from home, having video calls, is probably a very sustainable way of working.  However, non-planned social interaction, meeting each other at the coffee machine, or during the breaks at a conference or workshop, is also crucial for informal interaction.

Apparently, several others in my eco-system are struggling too. I noticed a tsunami of webinars and blog posts, many of them an attempt to be noticed – probably for the same reason: traditional business has stalled. And it is all about Digital Transformation and SaaS at this moment. Meaningless if there is no interaction.

In this context, I liked Jan Bosch’s statement in his article: Does data-driven decision-making make you boring? An article not directly addressing the PLM-market; however, there is a lot of overlap related to people’s reluctance to imagine a different future.

My favorite quote:

 I still meet people that continue to express beliefs about the world, their industry, their customers or their own performance that simply aren’t true. Although some, like Steve Jobs, were known for their “reality distortion field,” for virtually all of us, just wishing for something to be true doesn’t make it so. As William Edwards Deming famously said: in God we trust; all others must bring data.

I fully concur with this statement and always get suspicious when someone claims the truth.

Still, there are some diamonds.

I enjoyed all episodes of Minerva PLM TV – Jennifer Moore started this series in the early COVID-19 days (coincidence?). She was able to record a collection of interviews with known and less-known people in the PLM-domain. As most of them were vendor-independent, these episodes are a great resource to get educated.

The last episode with Angela Ippisch illustrates how often PLM in companies depends on a few enthusiastic persons, who have the energy to educate themselves. Angela mentions there is a lot of information on the internet; the challenge is to separate the useful information from marketing.

For the past five months, I have been publishing a series of posts under the joint theme learning from the past to understand the future. In these posts, I explained the evolution from PDM to PLM, resulting in the current item-centric approach with an EBOM, MBOM, and SBOM.

On purpose, one post per every two weeks – to avoid information overflow. Looking back, it took more posts than expected, and they are an illustration of the many different angles there are in the PLM domain – not a single truth.

Digital Thread is not a BOM

I want to address this point because I realized that in the whole blogging world there appear to be two worlds when discussing PLM terminology. Oleg Shilovitsky, CEO@OpenBOM, claims that Digital Thread and Digital Twin are just fancy marketing terms. I was even more surprised to read his post: 3 Reasons Why You Should Avoid Using The Word "Model" In PLM. Read the comments and discussions in these posts (if LinkedIn allows you to navigate).

Oleg's posts almost always give me something to discuss. I would be happier if people with different backgrounds participated in these discussions too – a "Like" is not a discussion. The risk in a virtual world is that it becomes a person-to-person debate, and we have seen the damage such debates can do to an entire community.

In the discussion we had related to Digital Thread and BOM, I realized that when we talk about traditional products, the BOM and the Digital Thread might be the same. This is how we historically released products to the market. Once produced, there were no more changes. In these situations, you could state a PLM-backbone based on BOM-structures/views, the EBOM, MBOM, and SBOM provide a Digital Thread.

The different interpretation comes when talking about products whose behavior is defined by software. Take a computer: the operating system can be updated on the fly, while the mechanical system remains the same. To specify and certify the behavior of the computer, we cannot rely on the BOM anymore.

Having software in the BOM and revising the BOM every time there is a software change is mission impossible – a mistake suggested ten years ago, when we started to realize the different release cycles of hardware and software. Still, it is all about the traceability of all information related to a product along its whole lifecycle.

In a connected environment, we need to manage the relationships between the BOM and other artifacts. Managing these relations in a connected environment is what I would call the Digital Thread – a layer above PLM. While writing this post, I saw Matthias Ahrens' post stating the same (click on the image to see the post).

When we discuss managing all the relations, we touch the domain of Configuration Management.  Martijn Dullaart/Martin Haket’s picture shares the same mindset – here, CM is the overlapping layer.

However, their diagram is not a system picture; the different systems do not need to be connected. Configuration Management is the discipline that maintains the correct definition of every product – CM maintains the Thread. When it becomes connected, it is a Digital Thread.
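To make this layer concrete, here is a minimal sketch of a Digital Thread as typed relations between datasets across platforms; all identifiers and relation names are invented for illustration.

```
# Hypothetical typed relations connecting datasets across platforms
relations = [
    ("REQ-001",      "satisfied-by", "EBOM-PUMP-01"),  # requirement -> engineering
    ("EBOM-PUMP-01", "realized-by",  "MBOM-PUMP-01"),  # engineering -> manufacturing
    ("MBOM-PUMP-01", "produced-as",  "SN-000321"),     # manufacturing -> MES serial
    ("SN-000321",    "runs",         "SW-2.0"),        # serial -> IoT software state
]

def trace(dataset_id):
    """Follow the thread: every relation touching a given dataset."""
    return [r for r in relations if dataset_id in (r[0], r[2])]

print(trace("SN-000321"))  # the as-running configuration of one serial number
```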

As I have reached my 1500 words, I will not zoom in on the PLM and Model discussion – build your opinion yourself. We have to realize that the word Model always requires a context. Perhaps many of us coming from the traditional PDM/PLM world (managing CAD data) think about CAD models. As I studied physics before even touching CAD, I grew up with a different connotation.

Lars Taxén’s comment in this discussion perhaps says it all (click on the image to read it). If you want to learn and discuss more about the Digital Thread and Models, register for the PLM Roadmap & PDT2020 event as many of the sessions are in this context (and not about 3D CAD).

Conclusion

I noticed I am getting tired of all the information streams crying for my attention and look forward to real social discussions, not broadcasted. Time to think differently requires such discussion, and feel free to contact me if you want to reflect on your thoughts. My next action will be a new series named Painting the future to stay motivated. (As we understand the past).

I believe we are almost at the end of learning from the past. We have seen how, from an initial serial CAD-driven approach with PDM, we evolved to PLM-managed structures, the EBOM and the MBOM. Or to illustrate this statement, look at the image below, where I use a Tech-Clarity image from Jim Brown.

The image on the left shows the typical PDM-approach: PDM feeding ERP in a linear process. The image on the right – I believe it is from 2004 – perfectly describes the complementary roles of PLM and ERP, the best practice before digital transformation: PLM supporting product innovation in an iterative approach, pushing released information to ERP for execution.

As I think in images, I like the concept of a circle for PLM and an arrow for ERP. I am always using those two images in discussions with my customers when we want to understand if a particular activity should be in the PLM or ERP-domain.

Ten years ago, the PLM-domain was conceptually further extended by introducing support for products in operations and service. Similar to the EBOM (engineering) and the MBOM (manufacturing), the SBOM (service) was introduced to support product information for products in operation. In theory, a fully connected circle.

Asset Lifecycle Management

At the same time, I was promoting PLM-practices for owners/operators to enhance Asset Lifecycle Management. My first post on this topic, from June 2010, PLM for Asset Lifecycle Management and Asset Development, introduces this approach.

Conceptually, the SBOM and Asset Lifecycle Management have a lot in common. There is a designed product, in this case an asset (plant, machine), running in the field, and we need to make sure operators have the latest information about the asset. And in case of asset changes, which can be a maintenance operation, a repair or a complete overhaul, we need to be sure the changes are based on the correct information from the as-built environment. This requires full configuration management.

Asset changes can be based on extensive projects that need to be treated like new product development projects, with a staged approach that can take weeks, months, sometimes years. These activities are typical activities performed in PLM-systems, not in MRO-systems that are designed to manage the actual operation. Again here we see the complementary roles of PLM (iterative) and MRO (execution).

Since 2008, I have worked a lot in this environment, mainly in the nuclear and process industry. If you want to learn more about this aspect of PLM, I recommend looking at the PLMpartner website, where Bjørn Fidjeland, in cooperation with SharePLM, published a course on Plant Information Management. We worked together in several projects, and Bjørn has made a great effort to describe the logical model to be used, instead of a function-feature story.

Ten years ago, we were not calling this concept the “Digital Twin,” as the aim was to provide end-to-end support of asset information from engineering, procurement, and construction towards operation in a coordinated manner. The breaking point in the relation between the EPCs and Owner/Operators is the data-handover – how much of your IP can/do you expose and what is needed. Nowadays, we would call striving for end-to-end data continuity the Digital Thread.

Hot from the press in this context, CIMdata just published a commentary, Managing the Digital Thread in Global Value Chains, describing Eurostep's ShareAspace capabilities and experiences in managing an end-to-end information flow (Digital Thread) in a heterogeneous environment based on exchange standards like ISO 10303-239 PLCS. Their solution is based on what I consider a more modern approach for managing digital continuity compared to the traditional approach I described before. Compare the two images in this paragraph: the first image represents the old/current way with a disconnected handover; the second represents ShareAspace's connected approach based on a real digital thread.

The Service BOM

As discussed with Asset Lifecycle Management, there is a disconnect between the engineering disciplines and operations in the field, looking from the point of view of an Asset owner/operator.

Now when we look from the perspective of a manufacturing company that produces assets to be serviced, we can identify a different dataflow and a new structure, the Service BOM (SBOM).

The SBOM provides information on how a product needs to be serviced. What are the parts that require service, and what service kits are possible for that product? For that reason, service engineering should be done in parallel to product engineering. When designing a product, the engineer needs to identify the wearing parts (which always require service in time) and the parts that might be serviceable.

There are different ways to look at the SBOM. Conceptually, the SBOM could be created in close relation with the EBOM. At the moment you define your product, you should also specify how the product will be serviced. See the image below.

From this example, it is clear that part standardization and modularization have a considerable benefit for services downstream. What if you have only one serviceable part that applies to many products? The number of parts to keep in stock will be strongly reduced, instead of having many similar parts that each fit only a single product.

Depending on the type of product, the SBOM can be generic, serving many products in the field. In that case, the company has to deal with catalogs, to be defined in PLM. Or the SBOM can be aligned with the As-Built of a capital product in the field. In that case, the concepts of Asset Lifecycle Management apply. Click on the image to see a clear picture.

The SBOM on its own,  in such an environment, will have links to specific documents, service instructions, operating manuals.

If your PLM-system allows it, extending the EBOM and MBOM with an SBOM is not a complex effort. What is crucial to understand is that the SBOM has its own lifecycle, which can last even longer than the active product sold. So sometimes, manufacturing specifications related to service parts need to be maintained too, creating a link between the SBOM and potential MBOM(s).

ECM = Enterprise Change Management

When I discussed ECM in my previous post in the context of Engineering Change Management, I got the feedback that nowadays, everyone talks about Enterprise Change Management. Engineering Change Management is old school.

In the past, and even in a 2014 benchmark, a customer had two change management systems: one in PLM and one in ERP, and companies were looking into connecting these two processes. Like the BOM-interaction between PLM and ERP, this is, technology-wise, never a real problem.

The real problem in such situations was to come to a logical flow of events. Many times, the company insisted that every change should start from the ERP-system, because "we like to standardize". This means that even an engineering change had to be registered first in the ERP-system.

Luckily the reach of PLM has grown. PLM is no longer the engineering tool (IT-system thinking). PLM has become the information backbone for product information all along the product lifecycle. Having the MBOM and SBOM available through a PLM-infrastructure allows organizations to streamline their processes.

Aras – digital thread through connected structures

And in this modern environment, enterprise change management might take place mostly in a PLM-infrastructure. The PLM-infrastructure providing a digital thread, as the Aras picture above illustrates, provides the full traceability to support configuration management.

However, we still have to remember that configuration management and engineering change management, first of all, are based on methodology and processes. Next, the combination of tools to be used will vary.

I like to conclude this topic with a quote from Lee Perrin’s comment on my previous blog post

I would add that aerospace companies implemented CM, to avoid fatal consequences to their companies, but also to their flying customers.

PLM provides the framework within which to carry out Configuration Management. CM can indeed be carried out without PLM, as was done in the old paper-based days. As you have stated, PLM makes the whole CM process much more efficient. I think more transparent too.

Conclusion

After nine posts around the theme Learning from the past to understand the future, I walked through the history of CAD, PDM and PLM in a fast mode, pointing to practices and friction points. In the blogging space, it is hard to find this information as most blog posts are coming from software vendors explaining why their tool is needed. Hopefully, these series have helped many of you to understand a broader context. Now I want to focus on the future again in my upcoming blog posts.

Still, feel free to contact me and discuss methodology topics.

Picture by Christi Wijnen – a good friend and photographer in the Netherlands

In the previous seven posts, learning from the past to understand the future, we have seen the evolution from manual 2D drawing handling, followed by the emergence of ERP and CAD, and then data management systems (PDM/PLM) and methodology (EBOM/MBOM), creating an infrastructure for product data from concept to manufacturing.

Before discussing the extension to the SBOM-concept, I first want to discuss Engineering Change Management and Configuration Management.

ECM and CM – are they the same?

Often when you talk with people in my PLM bubble, the terms Change Management and Configuration Management are mixed or not well understood.

When talking about Change Management, we should clearly distinguish between OCM (Organizational Change Management) and ECM (Engineering Change Management). In this post, I will focus on Engineering Change Management (ECM).

When talking about Configuration Management, here too we find two interpretations.

The first one is a methodology describing technically how, in your PLM/CAD-environment, you can build connected data structures in the most efficient way, representing all product variations. This approach varies per PLM/CAD-vendor, and therefore I will not discuss it here. The other interpretation of Configuration Management is described on Wiki as follows:

Configuration management (CM) is a systems engineering process for establishing and maintaining consistency of a product’s performance, functional, and physical attributes with its requirements, design, and operational information throughout its life.

This is also the area I will focus on this time.

And, as if great minds think alike and are synchronized, I was happy to see Martijn Dullaart's recent blog post, referring to a poll and follow-up article on CM.

Here Martijn touches precisely the topic I address in this post. I recommend you read his post, Configuration Management done right = Product-Centric first, and then follow with the rest of this article.

Engineering Change Management

Initially, engineering change management was a departmental activity performed by engineering to manage the changes in a product’s definition. Other stakeholders are often consulted when preparing a change, which can be minor (affecting, for example, only engineering) or major (affecting engineering and manufacturing).

The way engineering change management has been implemented varies a lot. Over time, companies all around the world have defined their own change methodology, and there is a lot of commonality between these approaches. However, terminology such as revision, version, major change and minor change might vary.

I described the generic approach for engineering change processes in my blog post: ECR / ECO for Dummies from 2010.
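For those who like to see the mechanics behind such a generic flow, below is a minimal sketch of an ECR/ECO lifecycle as a simple state machine. The states, transitions and change numbers are illustrative assumptions, not any vendor's implementation.

```python
from enum import Enum, auto

class ChangeState(Enum):
    """Hypothetical states of a generic engineering change."""
    DRAFT = auto()
    UNDER_REVIEW = auto()   # the change board evaluates the impact
    APPROVED = auto()       # the ECR is promoted into an ECO
    IMPLEMENTED = auto()
    REJECTED = auto()

# Allowed transitions of this simplified ECR/ECO flow
TRANSITIONS = {
    ChangeState.DRAFT: {ChangeState.UNDER_REVIEW},
    ChangeState.UNDER_REVIEW: {ChangeState.APPROVED, ChangeState.REJECTED},
    ChangeState.APPROVED: {ChangeState.IMPLEMENTED},
}

class EngineeringChange:
    def __init__(self, number: str):
        self.number = number
        self.state = ChangeState.DRAFT

    def promote(self, target: ChangeState) -> None:
        """Move the change forward, refusing transitions the process forbids."""
        if target not in TRANSITIONS.get(self.state, set()):
            raise ValueError(
                f"{self.number}: illegal transition {self.state.name} -> {target.name}")
        self.state = target

ecr = EngineeringChange("ECR-2020-042")   # hypothetical change number
ecr.promote(ChangeState.UNDER_REVIEW)
ecr.promote(ChangeState.APPROVED)         # the change board said yes
ecr.promote(ChangeState.IMPLEMENTED)
```

The point of the sketch: since terminology and process steps differ per company, every implementation ends up encoding a different table of allowed transitions – which is exactly where generic systems and company-specific processes start to clash.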

The fact that companies have defined their own engineering change processes is not an issue when it works and is done manually. The real challenge came with PDM/PLM-systems that need to provide support for engineering change management.

Do you leave the methodology 100 % open, or do you provide business logic?

I have seen implementations where an engineer could release an assembly with a right-click, without any constraints. Related drawings might not exist, parts in the assembly might not be released, and more. To obtain a reliable engineering change management process, the company had to customize the PLM-system to its desired behavior.

An excellent exercise for a system integrator, as there was always a discussion with end-users who do not want to be restricted in case of an emergency ("we will complete the definition later" / "too many clicks" / "do I have to approve 100 parts?"). In many cases, the system integrator kept on customizing the system to adapt to all wishes. Often, the engineering change methodology on paper was incomplete or contained contradictions when trying to digitize the processes.

For that reason, the PLM-vendors that aim to provide Out-Of-The-Box solutions have been trying to predefine certain behaviors in their systems. For example, you cannot release a part when its specifications (drawings/documents) are not released. Or you cannot update a released assembly without creating a new revision.
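To make this concrete, here is a minimal sketch of how such predefined release rules could be checked. The Item structure and the two rules are hypothetical; every PLM-vendor implements this differently.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    number: str
    released: bool = False
    specifications: list = field(default_factory=list)  # drawings/documents
    children: list = field(default_factory=list)        # assembly components

def release_violations(item: Item) -> list[str]:
    """Collect everything blocking a release, instead of allowing a blind right-click."""
    problems = []
    for spec in item.specifications:
        if not spec.released:
            problems.append(f"specification {spec.number} is not released")
    for child in item.children:
        if not child.released:
            problems.append(f"component {child.number} is not released")
    return problems

drawing = Item("DRW-001")                 # not yet released
bracket = Item("PRT-100", released=True)
assembly = Item("ASM-500", specifications=[drawing], children=[bracket])

violations = release_violations(assembly)
if violations:
    print("Release blocked:", "; ".join(violations))
```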

These rules speed up the implementation; however, they require more OCM (Organizational Change Management), as naming and methodology will probably have to change within the company. This is the continuous battle in PLM-implementations, in particular where the company has a strong legacy or lacks business understanding when implementing PLM.

There is an excellent webcast in this context on Minerva PLM TV – How to Increase IT Project Success with Organizational Change Management.

Click on the image or link to watch this recording.

Configuration Management

When we talk about configuration management, we have to think about managing the consistency of product data along the whole product lifecycle, as we have seen from the Wiki-definition before.

Wiki – the configuration Activity Model

Configuration management existed long before we had IT-systems. Therefore, configuration management is more a collection of activities (see diagram above) to ensure that the information is consistent and correct for any given product. Consistent during design, where requirements match product capabilities. Consistent with manufacturing, where the manufacturing process is based on the correct engineering specifications. And consistent with operations, meaning that we have the full definition of the product in the field, the As-Built, in correct relation to its engineering and manufacturing definition.

Source: Configuration management in aerospace industry
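To make the As-Designed versus As-Built consistency tangible, here is a minimal sketch comparing two configuration baselines, each reduced to a part-number-to-revision mapping. The part numbers and revisions are invented for illustration.

```python
def configuration_deltas(as_designed: dict, as_built: dict) -> list[str]:
    """Report every inconsistency between two baselines (part number -> revision)."""
    deltas = []
    for part, rev in as_designed.items():
        built_rev = as_built.get(part)
        if built_rev is None:
            deltas.append(f"{part} rev {rev} designed but never built")
        elif built_rev != rev:
            deltas.append(f"{part}: designed rev {rev}, built rev {built_rev}")
    for part in as_built.keys() - as_designed.keys():
        deltas.append(f"{part} built but missing from the design baseline")
    return deltas

as_designed = {"PRT-100": "B", "PRT-200": "A"}
as_built    = {"PRT-100": "A", "PRT-300": "A"}
for delta in configuration_deltas(as_designed, as_built):
    print(delta)
```

In real configuration management, the baselines contain far more than revisions – effectivity, serial numbers, documents – but the principle of continuously reconciling the definitions remains the same.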

This consistency is crucial for products where the cost of an error can have a massive impact on the manufacturer. The first industries that invested heavily in configuration management were the Aerospace and Defense industries. Configuration management is needed in these industries as the products are usually complex, and failure can have a fatal impact on the company. Combined with many regulatory constraints, managing the configuration of a product and the impact of changes is a discipline on its own.

Other industries have also introduced configuration management nowadays. The nuclear power industry and the pharmaceutical industry use configuration management as part of their regulatory compliance. The automotive industry requires configuration management partly for compliance, mainly driven by quality targets; an accident or a recall can be costly for a car manufacturer. Other manufacturing companies all have their own configuration management strategies, mainly depending on their own risk assessment. Configuration management is a pro-active discipline – it costs money: time, people and potentially tools. In my experience, many of these companies try to do "some" configuration management, always hoping that a real disaster will not happen. Proper configuration management allows you to perform a reliable impact analysis for any change (image above).

What happens in the field?

When introducing PLM in mid-market companies, the dream was often that with the new PLM-system, configuration management would be there too.

Management believes the tools will fix the issue.

Partly because configuration management deals with a structured approach to managing changes, there has always been confusion with engineering change management. Modern PLM-systems all have an impact analysis capability. However, most of the time, this impact analysis only reaches the content that is inside the PLM-system. Configuration Management goes further.
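In essence, such an impact analysis is a where-used traversal over the relationships stored in the system, which immediately exposes its limitation: anything not connected in the database stays invisible. A minimal sketch with invented part numbers:

```python
# Where-used relations: each item maps to the items that use it
# (in a real PLM-system these would come from BOM and document links).
where_used = {
    "PRT-100": ["ASM-500"],
    "ASM-500": ["ASM-900", "SVC-DOC-12"],
    "ASM-900": [],
    "SVC-DOC-12": [],
}

def impact_of_change(part: str) -> set[str]:
    """Collect everything that directly or indirectly uses the changed part."""
    impacted, stack = set(), [part]
    while stack:
        for parent in where_used.get(stack.pop(), []):
            if parent not in impacted:
                impacted.add(parent)
                stack.append(parent)
    return impacted

print(impact_of_change("PRT-100"))   # {'ASM-500', 'ASM-900', 'SVC-DOC-12'}
```

Whatever lives outside these stored relations – a supplier's document set, a field modification – will never show up in the result. That is where configuration management as a discipline goes beyond the tool.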

If you think that configuration management is crucial for your company, start educating yourselves first before implementing anything in a tool. There are several places where you can learn all about configuration management:

  • Probably the best-known organization is IpX (Institute for Process Excellence), teaching the CM2 methodology. Have a look here: CM2 certification and courses
  • Closely related to IpX, Martijn Dullaart shares his thoughts coming from the field as Lead Architect for Enterprise Configuration Management at ASML (one of the Dutch crown jewels) in his blog: MDUX
  • CMstat, a configuration and data management solution provider, provides educational posts from their perspective. Have a look at their posts, for example, PLM or PDM or CM
  • If you want to have a quick overview of Configuration Management in general, targeted for the mid-market, have a look at this (outdated) course: Training for Small and Medium Enterprises on CONFIGURATION MANAGEMENT. Good for self-study to get an understanding of the domain.

 

To summarize

In regulated industries, Configuration Management and PLM are a must to ensure compliance and quality. Configuration management and (engineering) change management are, first of all, required methodologies that guarantee the quality of your products. The more complex your products are, the higher the need for change and configuration management.

PLM-systems require embedded engineering change management – it is part of the PDM domain. Performing engineering change management in a system is something many users do not like, as it feels like overhead: too much administration or too many mouse clicks.

So far, there is no golden egg that performs engineering change management automatically. Perhaps in a data-driven environment, algorithms can speed up change management processes. Still, there is a need for human decisions.

The same holds for configuration management. If you have a PLM-system that connects all the data from concept, design, and manufacturing in a single environment, it does not mean you are performing configuration management. You need to have processes in place, and depending on your product and industry, the importance will vary.

Conclusion

In the first seven posts, we discussed the design and engineering practices, from CAD to EBOM, ending with the MBOM. Engineering Change Management and, in particular, Configuration Management are methodologies to ensure the consistency of data along the product lifecycle. These methodologies are connected and need to be fit for the future – more on this when we move to modern model-based approaches.

Closing note:

While finishing this blog post today, I read Jan Bosch's post: Why you should not align. Jan touches the same topic that I try to describe in my series Learning from the Past ….., as my intention is to make us aware that by holding on to practices from the past, we are blocking our future. His post is highly recommended – a quote:

The problem is, of course, that every time you resist change, you get a bit behind. You accumulate some business, process and technical debt. You become a little less “fitting” to the environment in which you’re operating

This time a short post (for me), as I am in the middle of the series "Learning from the past to understand the future" and currently collecting information for next week's post. However, recently Rob Ferrone, the original Digital Plumber, pointed me to an interesting post from Scott Taylor, the Data Whisperer.

In code: The Virtual Dutchman discovered the Data Whisperer thanks to the original Digital Plumber.

Scott’s article with the title: “Data Management Hasn’t Failed, but Data Management Storytelling Has” matches precisely the discussion we have in the PLM community.

Please read his article, and just replace the words Data Management with PLM – it could have been written for our community. In a way, PLM is a specific application of data management, so this is not a real surprise.

Scott’s conclusions give food for thought in the PLM community:

To win over business stakeholders, Data Management leadership must craft a compelling narrative that builds urgency, reinvigorates enthusiasm, and evangelizes WHY their programs enable the strategic intentions of their enterprise. If the business leaders whose support and engagement you seek do not understand and accept the WHY, they will not care about the HOW. When communicating to executive leadership, skip the technical details, the feature functionality, and the reference architecture and focus on:

  • Establishing an accessible vocabulary
  • Harmonizing to a common voice
  • Illuminating the business vision

When you tell your Data Management story with that perspective, it can end happily ever after.

It all resonates well with what I described in the PLM ROI Myth post. It is clear that the word Myth has a bad connotation for many people – the same, by the way, holds for PLM.

The fact that we still need to learn storytelling is because most of us are so focused on technology, and sometimes on discovering the new name for PLM in the future.

Last week I pointed to a survey from the PLMIG (PLM Interest Group) and XLifecycle, inviting you to help define the future definition of PLM.

You are still welcome here: Towards a digital future: the evolving role of PLM in the future digital world.

Also, I saw a great interview with Martin Eigner on Minerva PLM TV, conducted by Jennifer Moore. Martin is well known in the PLM world and has done foundational work for our community. According to Jennifer, he is considered The Godfather of PLM. This title fits nicely in today's post. Those who have seen his presentations in recent years will remember that Martin talks about SysLM (System Lifecycle Management) as the future of PLM.

It is an interesting recording to watch – click on the image above to see it. Martin explains nicely why we often do not get the positive feedback from PLM implementations – starting at minute 13 for those who cannot wait.

In the interview, you will discover that we often talk too much about our discipline capabilities, where the real discussion should be about business. Strategy and objectives are discussed and decided at the management level of a company. By using storytelling, we can connect to these business objectives.

The result will more likely be that a company understands why to invest significantly in PLM, as PLM is now part of its competitiveness and future continuity.

Conclusion

I shared links to two interesting posts from the last weeks. Studying them will help you to create a broader view. We have to learn to tell the right story. People do not want PLM – they have personal objectives. Companies have business objectives, and they might lead to the need for a new and changing PLM. Connecting to the management in an organization, therefore, is crucial.

Next week, again more about learning from the past to understand the future.

Last week I shared my thoughts related to my observation that the ROI of PLM is not directly visible or measurable, and I explained why. Also, I explained that the alignment of an organization requires a myth to make it happen. A majority of readers agreed with these observations. Some others either misinterpreted the headlines or twisted the story in favor of their opinion.

A few came from Oleg Shilovitsky, and as Oleg is quite open in his discussions, it allows me to follow up on his statements. Other people might share similar thoughts, but they haven't had the time or opportunity to be vocal. Feel free to share your thoughts/experiences too.

Some misinterpretations from Oleg’s post: PLM circa 2020 – How to stop selling Myths

  • The title “How to stop selling Myths” is the first misinterpretation.
    We are not selling myths – more below.
  • "Jos Voskuil's recommendation is to create a myth. In his PLM ROI Myths article, he suggests that you should not work on a business case, value, or even technology" is the second misinterpretation: you still need a business case, you need value, and you need technology.

And I got some feedback from Lionel Grealou, whose post was a catalyst for me to write the PLM ROI Myth post. I agree I took some shortcuts based on his blog post. You can read his comments here. The misinterpretation is:

  • "Good luck getting your CFO to approve the business change or PLM investment based on some "myth" propaganda :-)" – it is the opposite: make your plan, support your plan with a business case, and then use the myth to align.

I am glad about these statements as they allow me to be more precise, avoiding misperceptions/myth-perceptions.

A Myth is bad

Some people might think that a myth is bad, as a myth is most of the time abstract. I think these people do not realize that there are a lot of myths that they are following; it is typical social human behavior to respond to myths. Some myths:

  • How can you be religious without believing in myths?
  • In this country/world, you can become anything if you want
  • In the past, life was better
  • I make this country great again

The reason human beings need myths is that without them, it is impossible to align people around abstract themes. Try for each of the myths above to create an end-to-end logical story based on factual and concrete information. Impossible!

Read Yuval Harari’s book Sapiens about the power of myths. Read Steven Pinker’s book Enlightenment Now to understand that statistics show a lot of current myths are false. However, this does not mean a myth is bad. Human beings are driven by social influences and myths – it is our brain.

Unless you have no social interaction at all, you are not immune to myths. Which brings me to quoting Oleg one more time:

“A long time ago when I was too naive and too technical, I thought that the best product (or technology) always wins. Well… I was wrong. “

I went through the same experience; having studied physics and mathematics makes you think extremely logically. Something I enjoyed while developing software. Later, when I started my journey as the virtualdutchman, mediating in PLM implementations, I discovered that logic alone does not work in businesses. The majority of decisions are made based on "gut feelings", still presented as reasonable cases.

Unless you have an audience of Vulcans, like Mr. Spock, you need to deal with the human brain. Consider the myth as the envelope to pass the PLM-project to the management. C-level acts on myths, as so far I haven't seen C-level management spending serious time on understanding PLM. I will end with a quote from Paul Empringham:

I sometimes wish companies would spend 6 months+ to educate themselves on what it takes to deliver incremental PLM success BEFORE engaging with software providers

You don’t need a business case

Lionel is also skeptical about some "Myth-propaganda", and I agree with him. The Myth is the envelope; inside needs to be something valuable: the strategy, the plan, and the business case. Here I want to stress one more time that most business cases for PLM focus on tool and collaboration efficiency, and from there project benefits. However, how well can we predict the future?

If you replace a process, let's say BOM-collaboration done in Excel, with BOM-collaboration based on an Excel-on-the-cloud-like solution, you can measure the differences, assuming you can measure people's efficiency. I guess this is what Oleg means when he explains OpenBOM has a real business case.

However, if you change the intent of how people work, for example, consulting your supplier or manufacturing earlier in the design process, you touch human behavior. Why should I consult someone before I finish my job, when I am measured on output, not on collaboration or proactive response? Here is the real ROI challenge.

I have participated in dozens of business cases, and in the end, they all look like the graph below:

The ROI is fantastic – after a little more than two years, we have a positive ROI, and it only gets bigger. So if you trust the numbers, you would be a fool not to approve this project. Right?
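The arithmetic behind such a curve is deliberately simple. A hedged example with invented numbers:

```python
investment = 500_000       # hypothetical up-front project cost
annual_benefit = 220_000   # hypothetical projected yearly savings

cumulative = -investment
for year in range(1, 6):
    cumulative += annual_benefit
    print(f"year {year}: cumulative {cumulative:+,}")
# Break-even falls between year 2 and 3 - exactly the curve every business case shows.
```

The spreadsheet is trivial; the uncertainty sits entirely in the projected yearly savings.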

And here comes the C-level gut-feeling. If I have a positive feeling (I follow the myth), then I will approve. If I do not like it, I will say I do not trust the numbers.

Needless to say, if there were a business case without ROI, we would not need to meet the C-level. Unless – and it happens incidentally – at C-level there was already a decision that we need PLM from Vendor X, because we played golf together, we are condemned together, or we believe the same myths.

In reality, the old Gartner graph from realized benefits says it all. The impact of culture, processes, and people can make or break a plan.

You do not need an abstract story for PLM

Some people believe PLM on its own is a myth: you just need the right technology, and people will start using it, spreading it out and improving the business. Sometimes email is used as an example. Email is popular because you can, with limited effort, collaborate with people no matter where they are. Now, twenty years later, companies are complaining about the lack of traceability and the lack of knowledge and understanding related to their products and processes.

PLM will always have the complexity of supporting traceability combined with real-time collaboration. If you focus only on traceability, people will complain that they are not counter clerks. If you focus solely on collaboration, you miss the knowledge build-up and traceability.

That’s why PLM is a mix of governance, optimized processes to guarantee quality and collaboration, combined with a methodology to tune the existing processes implemented in tools that allow people to be confident and efficient. You cannot translate a business strategy into a function-feature list for a tool.

Conclusion

Myths are part of the human social alignment of large groups of people. Whether a Myth is true or false, I will not judge. You can use the Myth as an envelope to package your business case. The business case should always be a combination of new ways of working (organizational change), optimized processes and, finally, the best tools. A PLM tool-only business case is, in my opinion, far from realistic.

 

Now preparing for PI PLMx London on 3-4 February – discussing Myths, Single BOMs and the PLM Green Alliance

It is the holiday season; many groups and religions have their celebrations in this period, mostly due to the return of the light on the Northern hemisphere – Christmas, Diwali, Hanukkah, Kwanzaa and Santa Lucia are a few of them. Combined with a new decade upcoming, it is also a time for me to reflect on what we have learned and what we can imagine in the next ten years.

Looking back at the last decade

Globalization is probably one of the most significant changes we have seen. Almost the whole world is connected now through all kinds of social and digital media. Information is instantly available and influences our behavior dramatically. The amount of information coming at us is so huge that we only filter the information that touches us emotionally. The disadvantage of that: opinions become facts, and different opinions become enemies. A colorful society becomes more black and white.

These trends have not reached the domain of PLM in the same manner. There have been several startups in the past ten years explaining that PLM should be as easy as social communication with Facebook. Most of these startups focused on integrations with MCAD-systems, as if PLM is about developing mechanical products.

In my opinion, the past decade has shown that mechanical design is no longer a unique part of product development. Products have become systems, full of electronics and driven by software. Systems Engineering became more and more important, defining the overall concept of a product, moving the focus to the earlier stages of the product design – see image below.

During Ideation and system definition, we need “social” collaboration between all stakeholders. To get a grip on this collaboration, we use models (SysML, UML, Logical, 2D, 3D) to have an unambiguous representation, simulation and validation of the product already in the virtual world.

Actually, there is nothing social about that collaboration. It is a company-driven demand to deliver competitive products to remain in business. If our company does not improve multidisciplinary collaboration, global competitors with less nostalgic thoughts to the past will conquer the market space.

Currently, in most companies, there are two worlds, hardware and software, almost 100% separated, managed by either PLM or ALM (Application Lifecycle Management). It is clear that "old" PLM – item-driven with related documents – cannot match the approach required for ALM – data-elements (code) based on software models (and modules).

In the hardware world, a change can have a huge effect on the cost (or waste) of the product; however, implementing a hardware change can take months – think about new machinery or tooling in the worst case. In the software domain, a change can be executed almost immediately. However, here, testing the impact of the change can have serious consequences. The software fix for the Boeing 737 MAX is not yet proven and delivered.

Therefore, I would like to conclude that in the past decade we learned in the PLM-domain to work in a Coordinated manner – leaving silos mostly in place.

Next week I will look forward to our challenging upcoming decade. Topics on my list for the next decade are:

  • From Coordinated to Connected – Generation Change
  • Sustainability of our Earth (and how PLM can help)
  • Understanding our human behavior to understand how to explain PLM to your execs – PI PLMx London 2020
  • All combined with restoring trust in science

 

I wish you all a happy and healthy New Year – take time to listen and learn as we need dialogue for the future, not opinions.

See you in 2020
