
Last week I shared my plans for 2021 related to my blog, virtualdutchman.com. Those of you who follow my blog might have noticed that my posts are never short, as I try to discuss or explain a topic from various angles. This sometimes requires additional research on my side. The findings will provide benefits for all of us. We keep on learning.

At the end of the post, I asked you to participate in a survey to provide feedback on the proposed topics. So far, only one percent of my readers have responded to this short survey. The last time I shared a short survey in 2018, the response was much more significant.

Perhaps you are tired of the many surveys; perhaps you did not make it to the end. Please make an effort this time. Here is, one more time, the survey.

The results so far

To understand the topics below, please make sure you have read the previous blog post to understand each paragraph’s context.

PLM understanding

For the PLM-related topics that I proposed, Product Configuration Management, Supplier Collaboration Management, and Digital Twin Management got the most traction. I have started preparing them, combined with a few newly suggested topics that I will explore further. You can click on the images below to read the details.

PLM Deep dive

From the suggested topics for a PLM deep-dive, it is interesting to see most respondents want to learn more about Product Portfolio Management and Systems Engineering within PLM. Traditional topics like Enterprise/Engineering Change Management, BOM Management, or PLM implementation methodologies have been considered less relevant.

The PLM Doctor is in

Several questions came in for the “PLM Doctor,” and I started planning the first episodes. The formula: a single question and an answer through a video recording – max. 2 to 3 minutes. Suitable for fast consumers of information.

PLM and Sustainability

Here we can see that the majority is observing what is happening. Only a few people reported an active interest in sustainability and, probably not unrelated, they work for companies that take sustainability seriously.

PLM and digitization

When discussing PLM’s digitization, I believe one of the fundamental changes that we need to implement (and learn to master) is a more Model-Based approach for each phase of the product life cycle. Most respondents already have a notion of what model-based means and want to apply these practices to engineering and manufacturing.

Your feedback

I think you have all heard the statement about Lies and Statistics. Especially with social media, there are billions of people digging for statistics to support their theories. To avoid that trap, I would like to base my statements on somewhat larger numbers, so please take the survey here if you haven’t done so.

Conclusion

I am curious about your detailed inputs, and the next blog post will be the first of the 2021 series.

For those living in the Northern Hemisphere: This week, we had the shortest day, or if you like the dark, the longest night. This period has always been a moment of reflection. What have we done this year?

Rob Ferrone (Quick Release), the Santa on the left (the leftist), and Jos Voskuil (TacIT), the Santa on the right (the rightist), share in a dialogue their highlights from 2020.

Wishing you all a great moment of reflection and a smooth path into a Corona-proof future.

It will be different; let’s make it better.

About a year ago, we started the PLM Global Green Alliance, further abbreviated as the PGGA. Rich McFall, the main driver behind the PGGA, started the website, The PLM Green Alliance, to have a persistent place to share information.

Also, we launched the PLM Global Green Alliance LinkedIn group to share our intentions and create a community of people who would like to share knowledge through information or discussion.

Our mission statement is:

The mission of the new PLM Green Alliance is to create global connection, communication, and community between professionals who use, develop, market, or support Product Lifecycle Management (PLM) related technologies and software solutions that have value in addressing the causes and consequences of climate change due to human-generated greenhouse gas emissions. We are motivated by the technological challenge to help create a more sustainable and green future for our economies, industries, communities, and all life forms on our planet that depend on healthy ecosystems.

My motivation

My personal motivation to support and join the PGGA was driven by the wish to combine my PLM-world with my interest in creating a more sustainable society for everyone around the world. It is a challenging combination. For example, PLM was born in the Aerospace and Defense industries, probably not the most sustainable industries.

Having worked with some companies in the Apparel and Retail industry, I have seen that these industries care more about their carbon footprint. Perhaps because they are “volume-industries” closely connected to their consumers, these industries actively build practices to reduce their carbon footprint and impact societies. The sense or non-sense of recycling is such a topic to discuss and analyze.

At that time, I got inspired by a session during the PLM Roadmap / PDT 2019 conference.

Graham Aid‘s session from the Ragn-Sells group was a call to action. Sustainability and a wealthy economy go together; however, we have to change our habits and thinking patterns. You can read my review of this session in this blog post: The weekend after PLM Roadmap / PDT 2019 – Day 1.

Many readers of this post have probably never heard of the Ragn-Sells group or followed up on such a call for action. I have the same challenge: being motivated beyond the day-to-day business (the old ways of working) and giving priority to exploring and learning more about applying sustainability in my PLM practices.

And then came COVID-19.

I think most of you have seen the image on the left, which started as a joke. However, looking back, we all have seen that COVID-19 has led to a tremendous push for using digital technologies to modernize existing businesses.

Personally, I was used to traveling to a customer every 2 to 3 weeks; now I have left my home office only twice for business. Meanwhile, I invested in better communication equipment and a place to work. And hey, it remains possible to work and communicate with people.

However, onboarding and getting to know new people takes more social interaction than a camera can bring.

In the PGGA LinkedIn community, we had people joining from all over the world. With some active members, we started to organize video meetings to discuss their expectations of and interest in this group.

We learned several things from these calls.

First of all, finding a single timeslot in which everyone worldwide can participate is a challenge. A late Friday afternoon here is almost midnight in Asia and morning in the US. And whether Friday is the best day, we do not know yet.

Secondly, we realized that posts published in our LinkedIn group did not appear in everyone’s LinkedIn feed due to LinkedIn’s algorithms. For professionals, LinkedIn becomes less and less attractive, as the algorithms seem to prefer frequency/spam over content.

For that reason, we are probably moving to the PLM Green Alliance website and will combine this environment with a space for discussion outside the LinkedIn scope. More to come on the PGGA website.

Finally, we will organize video discussion sessions and ask the participants to prepare themselves for the discussion. Any member of the PGGA can bring in discussion topics.

It might be a topic you want to clarify or better understand.

What’s next

For December 4th, we have planned a discussion meeting related to the Exponential Roadmap 2019 report, in which 36 solutions to halve carbon emissions by 2030 are discussed. In our video discussion, we want to focus on the chapter: Digital Industries.

We believe that this topic comes closest to our PLM domain and hope that participants will share their thinking and potential activities within their companies.

You can download the Exponential Roadmap here or by clicking on the image. You will find more details about the PLM Global Green Alliance in the LinkedIn group. If you want to participate, let us know.

The PGGA website will be the place where more and more information will be collected per theme, to help you understand what is happening worldwide, and the place where you can contribute by letting us know what is happening on your side.

Conclusion

The PLM Global Green Alliance has now existed for a year and counts 192 members. With approximately five percent active members, we have the motivation to grow our efforts and value. We learned from COVID-19 that there is a need to become proactive, as the costs of prevention are always lower than the costs of (trying to) fix things afterward.

And each of us has the challenge to behave a little differently than before.

Will you be one of them?

In the last two weeks, three events led to this post.

First, I read John Stark’s recent book Products2019 – a must-read for anyone who wants to understand the full reach of product lifecycle-related activities. See my recent post: Products2019, a must-read if you are new to PLM.

Afterwards, I talked with John, discussing the lack of knowledge and teaching of PLM, not to be confused with PLM capabilities and features.

Second, I participated in an exciting PI DX USA 2020 event. Some of the sessions and most of the roundtables provided insights to me and, hopefully, many other participants. You can get an impression in the post: The Weekend after PI DX 2020 USA.

As I wrote, a small disappointment in that event was the closing session with six vendors. It is evident that when you put a group of vendors in the arena, it will be about scoring points instead of finding alignment. Still, having criticism does not mean blaming, and I am always open to having a dialogue. For that reason, I am grateful for their sponsorship and contribution.

Oleg Shilovitsky cleverly noted that this statement is a contradiction:

“How can you accuse PLM vendors of having a limited view on PLM and thanking them for their contribution?”

I hope the above explanation says it all, combined with the fact that I grew up in a Dutch culture of not hiding friction while remaining respectful to others.

We cannot simplify PLM to just a better tool or technology, or to 3D for everybody. Many more people and processes related to product lifecycle management are involved in this domain – a real conference should cover them all; however, many of them will not sponsor events.

It is well illustrated in John Stark’s book. Many disciplines are involved in the product lifecycle. Therefore, if you only focus on what you can do with your tool, it will lead to an incomplete understanding.

If your tool is a hammer, you hope to see nails everywhere around you to demonstrate your value.

The third event was a LinkedIn post from John Stark – 16 groups needing Product Lifecycle Knowledge – which for me was a logical follow-up on the previous two events. I promised John to go through these 16 groups and provide my thoughts.

Please read his post first as I will not rewrite what has been said by John already.

CEOs and CTOs

John suggested that they should read his book, which might take more than eight hours. CEOs and CTOs, most of the time, do not read this type of detailed book, so this is probably mission impossible.

They want to keep up with the significant trends and need to think about future business (model).

New digital and technical capabilities allow companies to move from a linear, coordinated business towards a resilient, connected business. This requires exploring future business models and working methods by experimenting in real life, not in a Proof of Concept. Creating a learning culture and allowing experiments to fail is crucial, as you only learn by failing.

CDOs, CIOs and Digital Transformation Executives

They are the crucial people to help the business imagine what digital technologies can do. They should educate the board and the business teams about the power of having reliable, real-time data available for everyone connected. Instead of standardizing on systems and optimizing the silos, they should assist and lead in building a new infrastructure for connected services and end-to-end flows delivered on connected platforms.

These concepts won’t be realized soon. However, doing nothing is a big risk, as the traditional business will decline in a competitive environment. Time to act.

Departmental Managers

These are the people who should worry about their job in the long term. Their current mission might be to optimize their department within its own Profit & Loss budget. The future is about optimizing the information flow for the whole value chain, including suppliers and customers.

I wrote about this in “The Middle Management Dilemma.” Departmental managers should become team leaders, inspiring and supporting their team members instead of controlling the numbers.

Product Managers

This is a crucial role for the future, assuming a product manager is not only responsible for the marketing or development side of the product but also becomes responsible for understanding what happens with the product during production and for its sales performance. Understanding the full lifecycle performance and cost should be their mission, supported by a digital infrastructure.

Product Developers

They should read the book Products2019 to be aware there is so much related to their work. From this understanding, a product developer should ask the question:

“What can I do better to serve my internal and external customers?”

This question will not arise in a hierarchical organization where people are controlled by managers whose mission is to optimize their silo. Product developers should be trained and coached to operate in a broader context, which should be part of your company’s mission. Too many people complain about usability in their authoring and data management systems without having a holistic understanding of why change processes and configuration management are needed.

Product Lifecycle Management (PLM) deployers

Here I have a small challenge, as this might be read as PLM-system users. However, it should be clear that we mean people using product data at any moment along the product lifecycle, not necessarily in a single system.

This is again related to your company’s management culture. In the ideal world, people work with a purpose and get informed on how their contribution fits the company’s strategy and execution.

Unfortunately, in most hierarchical organizations, the strategy and total overview get lost, and people become measured resources.

New Hires and others

John continues with five other groups within the organization. I will not comment on them, as the answers are similar to the ones above – it is about organization and culture.

Educators and Students

This topic is very close to my heart and one of the reasons I continue blogging about PLM practices. There is not enough attention to product development methodology or processes. Engineers can get many years of education in specific domains, like product design principles, available tools and technologies, and performing physical and logical simulations.

Far less time is spent on teaching current best practices and business models for product lifecycle management.

Check how many vendor-independent, methodology-oriented training courses you can find in your country. Perhaps the only consistent organization I know is CIMdata, where the challenge is that they deliver training to companies after the students have graduated. It would be great if educational institutes would embed serious time for product lifecycle management topics in their curriculum. The challenge, of course, is the time and budget needed to create materials and, next, to prioritize this topic on the overall agenda.

I am happy to participate in a Specialized Master education program aimed at the Products and Buildings Digital Engineering Manager (INGENUM). This program, organized by Arts Et Metiers in France, helps create the overview for understanding PLM and BIM – in the French language, as before COVID-19 this was an on-site training course in Paris.

Hopefully, there are more institutes offering PLM education – feel free to add them in the comments of this post.

Consultants, Integrators and Software Company Employees

Of course, it would be nice if everyone in these groups understood the total flow and processes within an organization and how they relate to each other. Too often, I have seen experts in a specific domain, for example a 3D CAD-system, having no clue about revisioning, the relation of CAD to the BOM, or the fundamentals of configuration management.

Consultants, integrators and software company employees have their own challenges, as their business model is often based on specialized skills they can sell to their clients; broader, more general knowledge has to come from experience on the job.

And if you work full-time for three years on a single project, or perhaps on three projects, your broader knowledge does not grow fast. You might become the hammer that sees nails everywhere.

For that reason, I recommend everyone in my ecosystem to invest personal time in reading about related topics of interest. Read LinkedIn posts from others and learn to differentiate between marketing messages and people willing to share experiences. Don’t waste your time on the marketing messages; react and participate in the other discussions. A “Like” is not enough – ask questions or add your insights.

In the context of my personal learning, I mentioned that I participated in the Digital Twin conference in the Netherlands this week. Unfortunately, due to the partial lockdown, it was mainly a virtual event.

I got several new insights that I will share with you soon. The event illustrated that, while Digital Twin as a buzzword might be hype, several of the participants showed examples of where they have applied or plan to apply Digital Twin concepts. A great touch of reality.

Another upcoming conference, starting next week, is the PLM Roadmap 2020 – PDT conference. The theme, Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality, is not a marketing theme, as you can learn from the agenda. Step by step, we are learning here from each other.

Conclusion

John Stark started with the question of who needs Product Lifecycle Knowledge. In general, knowledge is power, and it does not come for free – it comes through consultancy, reading or training. Related to Product Lifecycle Management, everyone must understand the bigger picture: executives, because they will need to steer the company in the right direction; everyone else, to streamline the company and enjoy working in a profitable environment where you contribute and can even inspire others.

An organization is like a human body; you cannot have individual cells or organs that optimize themselves only – we have a name for that disease. Want to learn more? Read this poem: Who should be the boss?

This time it is again about learning. Last week, I read John Stark’s book: Products2019: A project to map and blueprint the flow and management of products across the product lifecycle: Ideation; Definition; Realisation; Support of Use; Retirement and Recycling. John, a well-known PLM consultant and writer of academic books related to PLM, wrote this book during his lockdown due to the COVID-19 virus. The challenge with PLM (books) is that it is, in a way, boring from the outside. Remember my post: How come PLM and CM are boring? (reprise)?

This time, John wrapped the “boring” part into a story about Jane from Somerset, who, as part of her MBA studies, is performing a research project for Josef Mayer Maschinenfabrik. The project is to describe for the newly appointed CEO what happens with the company’s products all along the lifecycle.

A story with a cliffhanger:

What happened to Newt from Cleveland?

Seven years in seven weeks

Poor Jane: in seven weeks, she is interviewing people at three sites – two in Germany and one in France – doing over a hundred interviews on her own. I realized that, thanks to my relation with SmarTeam at that time, it probably took me seven years to get in front of all these stakeholders in a company.

I had much more fun most of the time, as you can see below. My engagements were teamwork, with some additional social relief after work. Jane even works at the weekends.

However, there are also many similarities. Her daily rhythm during working days. Gasthaus Adler reflects many of the typical guesthouses that I have visited. People staying there with a laptop were signs of the new world. Like Jane, I enjoyed the weissbier and noticed that sometimes overhearing other guests is not good for their company’s reputation. A lot of personal and human experiences are wrapped into the storyline.

Spoiler: Tarzan meets Jane!

Cultural differences

The book also illustrates the cultural difference between countries (Germany/France/US) nicely and even between regions (North & South). Just check the breakfast at your location to see it.

Although most of the people interviewed by Jane contributed to her research, she also meets people who, for personal or political reasons, do not cooperate.

Having worked worldwide, including in Asian countries, I learned that understanding people and culture is crucial for successful PLM engagements.

John did an excellent job of merging cultural and human behavior into the book. I am sure we share many similar experiences, as both this book and my blog posts do not mention particular tools. It is about the people and the processes.

Topics to learn

You will learn that 3D CAD is not the most important topic, as perhaps many traditional vendor-related PDM consultants might think.

Portfolio Management is a topic well addressed; in my opinion, it should be addressed in every PLM roadmap, as this is where the business goals get connected to the products.

New Product Introduction, a stage-gate governance process, and the importance of Modularity are also topics that pop up in several cases.

The need for innovation, the Industry 4.0 and AI (Artificial Intelligence) buzz, the world of software development and the “War for Talent” can all be found in the book.

And I was happy that even product Master Data Management was addressed. In my opinion, not enough companies realize that a data-driven future requires data quality and data governance. I wrote about this topic last year: PLM and PIM – the complementary value in a digital enterprise.

There are fantastic technology terms, like APIs, microservices and Low Code platforms. They all rely on reliable and shareable data.

What’s next

Products2019 is written as the starting point for a sequel. In this book, you quickly learn all the aspects of a linear product lifecycle, as the image below shows.

I see an opportunity for Products2020 (or later). What is the roadmap for a company in the future?

How does a company become more data-driven and more agile in its go-to-market strategy, as software will increasingly define the product’s capabilities?

How to come from a linear siloed approach towards a horizontal flow of information, market-driven and agile?

Perhaps we will learn what happened with Newt from Cleveland?

Meanwhile, we have to keep on learning to build the future.

My learning continues this week with PI DX USA 2020 – usually a conference I would not attend, as traveling to the USA would have too much impact on my budget and time. Now I can hopefully learn and get inspired – you can do the same! Feel free to apply for a free registration if you are a qualified end-user – check here.

And there is more to learn, already mentioned in my previous post:

Conclusion

John Stark wrote a great book to understand what is currently in most people’s heads in mid-size manufacturing companies. If you are relatively new to PLM, or if you have only been active in PDM, read it – it is affordable! With my series Learning from the past, I also shared twenty years of experience – more a quick walkthrough, with a more specialized view on some of the aspects of PLM. Keep on learning!

After the series “Learning from the past,” it is time to start looking towards the future. I learned from several discussions that I am probably working most of the time with advanced companies. I hope this will motivate companies that lag behind to look into the future even more.

If you look into the future of your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin. A company might want to reduce its time to market or improve collaboration between all stakeholders. These objectives can be realized by different ways of working and an IT-infrastructure that allows these processes to become digital and connected.

That is the “game.” Coming back to the future of PLM: we do not need a discussion about definitions; I leave this to the academics and vendors. We will see the same applies to the concept of a Digital Twin.

My statement: the digital twin is not new. Everybody can have their own digital twin, as long as you allow everyone to interpret the definition differently. Does this sound like the PLM definition?

The definition

I like to follow the Gartner definition:

A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.

As you see, not a narrow definition. Now we will look at the different types of interpretations.

Single-purpose siloed Digital Twins

  1. Simple – data only

One of the most straightforward applications of a digital twin is, for example, my Garmin Connect environment. When cycling, my device registers performance parameters (speed, cadence, power, heartbeat, location). After every trip, I can analyze my performance, see changes in my overall performance, and compare my performance with others in my category (weight, age, sex).

Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.
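If you like to think in code, here is a super-simplified sketch (in Python, with hypothetical names – not the real Garmin Connect API) of what such a “data only” twin is: nothing more than collected measurements plus some analysis on top.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class RideRecord:
    """One registered trip: the raw data coming from the device."""
    speed_kmh: float
    cadence_rpm: float
    power_w: float
    heartbeat_bpm: float

@dataclass
class CyclistTwin:
    """A 'data only' digital twin: nothing more than collected measurements."""
    rides: list[RideRecord] = field(default_factory=list)

    def register(self, ride: RideRecord) -> None:
        self.rides.append(ride)

    def power_trend(self, last_n: int = 5) -> float:
        """Average power of the most recent rides versus the overall average."""
        overall = mean(r.power_w for r in self.rides)
        recent = mean(r.power_w for r in self.rides[-last_n:])
        return recent - overall  # positive = improving

twin = CyclistTwin()
twin.register(RideRecord(28.5, 85, 180, 140))
twin.register(RideRecord(29.1, 88, 195, 138))
print(f"Power trend: {twin.power_trend(last_n=1):+.1f} W")  # -> Power trend: +7.5 W
```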

On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel’s presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at the Technical University in Eindhoven, the “smart city” of the Netherlands.

  2. Medium – data and operating models

Many connected devices in the world use the same principle. An airplane engine, an industrial robot, a wind turbine, a medical device, and a train carriage all track their performance based on this connection between the physical and the virtual, using some sort of digital connectivity.

The business case here is also about monitoring performance, predicting maintenance, and upgrading the product when needed.

This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintenance and overhaul has to be found. Repairs are disruptive and can be extremely costly; a manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
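To make the “data and operating models” idea tangible, here is a minimal, hypothetical sketch: the twin evaluates incoming sensor data against a simplified operating model (the thresholds and service intervals are purely illustrative) to propose maintenance proactively.

```python
from dataclasses import dataclass

@dataclass
class OperatingModel:
    """Simplified operating model for an asset, e.g., a wind turbine gearbox."""
    max_vibration_mm_s: float = 7.1   # illustrative alarm threshold
    service_interval_h: float = 5000  # hours between scheduled services

class AssetTwin:
    """Medium digital twin: sensor data evaluated against an operating model."""
    def __init__(self, model: OperatingModel):
        self.model = model
        self.running_hours = 0.0
        self.vibration_history: list[float] = []

    def ingest(self, hours: float, vibration_mm_s: float) -> None:
        self.running_hours += hours
        self.vibration_history.append(vibration_mm_s)

    def maintenance_advice(self) -> str:
        if self.vibration_history and self.vibration_history[-1] > self.model.max_vibration_mm_s:
            return "Unplanned inspection: vibration above threshold"
        if self.running_hours >= self.model.service_interval_h:
            return "Scheduled service due"
        return "No action needed"

twin = AssetTwin(OperatingModel())
twin.ingest(hours=4800, vibration_mm_s=3.2)
twin.ingest(hours=300, vibration_mm_s=8.0)
print(twin.maintenance_advice())  # -> "Unplanned inspection: vibration above threshold"
```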

  3. Advanced – data and connected 3D model

The digital twin we see the most in marketing videos is a virtual twin, using a 3D representation for understanding and navigation. The 3D representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation takes place.

Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.

The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying the physical airplane – the behavior model behind the simulator should match the real behavior as closely as possible. However, as it is a model, it will never be 100% reality and requires updates when new findings or product changes appear.

A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs). In the physical world, there is no place or time to conduct such training. Here the complexity might be lower. There is a 3D Model; however, serious updates can only be expected after a major maintenance or overhaul activity.

These practices are not new either and are used in places where the physical training cannot be done.

More challenging is the Augmented Reality (AR) use case. Here the virtual model, most of the time a lightweight 3D Model, connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR-environment might project actual data from the machine, and indicate service points and service procedures.

The positive side of the business case for such an opportunity is clear: ensuring that service engineers always work with the right information in a real-time context. The main obstacles for implementing AR in reality are the access to data, the presentation of the data, and keeping the data in the AR-environment matching reality.

And although there are 3D Models in use, they are, to my knowledge, always created in silos, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.

Several of the cases mentioned above will be discussed there. The conference’s target is to share real cases, concluded by Q&A sessions – crucial for a virtual event.

Connected Virtual Twins along the product lifecycle

So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle, the manufacturing twin, and the development twin. This image from Siemens illustrates the concept:

On the slides, they imagine a completely integrated framework, which is the future vision. Let us first zoom in on the individual connected twins.

The digital production twin

This is the area of virtual manufacturing: creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic; DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.

At that time, the business case was based on the fact that defining a manufacturing plant and process virtually allows you to optimize the plant before investing in physical assets.

This saves money, as there is no costly prototype phase needed to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and for many companies still the current situation).

With the need to be more flexible in manufacturing to address individual customer orders without increasing the overhead of delivering these customer-specific solutions, there is a need for a configurable plant that can produce these individual products (batch size 1).

This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.

This is partly what Industry 4.0 is about: using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory into a dynamic, “smart” factory.
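As an illustration of this principle only – a sketch with hypothetical stations and rules, not any vendor’s Industry 4.0 implementation – the virtual plant model proposes a routing for each individual product configuration:

```python
# Hypothetical sketch: a virtual plant model proposes a routing per order.
PLANT_MODEL = {
    "frame": ["laser_cutting", "bending", "welding"],
    "motor": ["assembly_cell_1"],
    "paint_custom": ["paint_booth"],
    "paint_standard": [],  # standard panels arrive pre-painted
}

def configure_routing(order_features: list[str]) -> list[str]:
    """Derive the production steps for one customer-specific order (batch size 1)."""
    routing: list[str] = []
    for feature in order_features:
        for station in PLANT_MODEL.get(feature, []):
            if station not in routing:
                routing.append(station)
    routing.append("final_inspection")  # every order ends with an inspection
    return routing

print(configure_routing(["frame", "motor", "paint_custom"]))
# -> ['laser_cutting', 'bending', 'welding', 'assembly_cell_1', 'paint_booth', 'final_inspection']
```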

In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner, therefore requiring manual rework. This is the area of Model-Based Definition, and I have written about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing.

The business case for this type of digital twin, of course, is to be able to deliver customer-specific products with extremely competitive speed and reduced cost compared to standard approaches. It could be your company’s survival strategy. Although it is hard to predict the future, as we see from COVID-19, it is still crucial to anticipate the future instead of waiting.

The digital development twin

Before a product gets manufactured, there is a product development process. In the past, this was pure mechanical with some electronic components. Nowadays, many companies are actually manufacturing systems as the software controlling the product plays a significant role. In this context, the model-based systems engineering approach is the upcoming approach to defining and testing a system virtually before committing to the physical world.

Model-Based Systems Engineering can define a single complex product and perform all kinds of analysis on the system even before there is a physical system in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that having a model-based systems engineering environment combined with modularity (do not confuse modularity with model-based) is a solid foundation for dealing with unique custom products: solutions can be configured and validated against their requirements already during the engineering phase, as the sketch below illustrates.
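To give a feeling of what “configured and validated against their requirements” can mean in a data-driven environment, here is a hypothetical mini-sketch where requirements become executable checks on the parameters of a configured solution:

```python
# Hypothetical sketch: requirements as executable checks on a configured system.
requirements = {
    "max_weight_kg": lambda s: s["weight_kg"] <= 120,
    "min_flow_m3_h": lambda s: s["flow_m3_h"] >= 40,
    "max_noise_db": lambda s: s["noise_db"] < 70,
}

# A solution configured from modules; parameters come from the system model.
configured_system = {"weight_kg": 115, "flow_m3_h": 42, "noise_db": 72}

failed = [name for name, check in requirements.items() if not check(configured_system)]
print("Validated" if not failed else f"Requirements not met: {failed}")
# -> Requirements not met: ['max_noise_db']
```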

The business case for the digital development twin is easy to make: shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results, for sure, you need to change your ways of working and the tools you are using. So it won’t be that easy!

For those interested in Industry 4.0 and the Model-Based System Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, a lot of attention to the Digital Twin and Model-Based approaches.

Three digital half-days with hopefully a lot to learn while keeping our feet on the ground. In particular, I am looking forward to Marc Halpern’s keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True.

Conclusion

It has been very noisy on the internet related to product features and technologies, probably due to COVID-19 and the resulting disrupted interactions between all of us – vendors, implementers and companies trying to adjust their future. The Digital Twin concept is an excellent framing for a concept that everyone can relate to. Choose your business case and then look for the best matching twin.

I believe we are almost at the end of learning from the past. We have seen how, from an initial serial, CAD-driven approach with PDM, we evolved to PLM-managed structures: the EBOM and the MBOM. To illustrate this statement, look at the image below, where I use a Tech-Clarity image from Jim Brown.

The image on the left shows the typical PDM-approach: PDM feeding ERP in a linear process. The image on the right – I believe it is from 2004 – perfectly describes the complementary roles of PLM and ERP, the best practice before digital transformation: PLM supporting product innovation in an iterative approach, pushing released information to ERP for execution.

As I think in images, I like the concept of a circle for PLM and an arrow for ERP. I am always using those two images in discussions with my customers when we want to understand if a particular activity should be in the PLM or ERP-domain.

Ten years ago, the PLM-domain was conceptually further extended by introducing support for products in operation and service. Similar to the EBOM (engineering) and the MBOM (manufacturing), the SBOM (service) was introduced to support product information for products in operation. In theory, a fully connected circle.

Asset Lifecycle Management

At the same time, I was promoting PLM-practices for owners/operators to enhance Asset Lifecycle Management. My first post on this topic, from June 2010, called PLM for Asset Lifecycle Management and Asset Development, introduces this approach.

Conceptually, the SBOM and Asset Lifecycle Management have a lot in common. There is a designed product, in this case an asset (plant, machine), running in the field, and we need to make sure operators have the latest information about the asset. And in case of asset changes, which can be a maintenance operation, a repair or a complete overhaul, we need to be sure the changes are based on the correct information from the as-built environment. This requires full configuration management.

Asset changes can be based on extensive projects that need to be treated like new product development projects, with a staged approach that can take weeks, months, sometimes years. These activities are typical activities performed in PLM-systems, not in MRO-systems that are designed to manage the actual operation. Again here we see the complementary roles of PLM (iterative) and MRO (execution).

Since 2008, I have worked a lot in this environment, mainly in the nuclear and process industry. If you want to learn more about this aspect of PLM, I recommend looking at the PLMpartner website, where Bjørn Fidjeland, in cooperation with SharePLM, published a course on Plant Information Management. We worked together on several projects, and Bjørn has made a great effort to describe the logical model to be used instead of a function-feature story.

Ten years ago, we were not calling this concept the “Digital Twin,” as the aim was to provide end-to-end support of asset information from engineering, procurement, and construction towards operation in a coordinated manner. The breaking point in the relation between the EPCs and Owner/Operators is the data-handover – how much of your IP can/do you expose and what is needed. Nowadays, we would call striving for end-to-end data continuity the Digital Thread.

Hot off the press in this context: CIMdata just published a commentary, Managing the Digital Thread in Global Value Chains, describing Eurostep’s ShareAspace capabilities and experiences in managing an end-to-end information flow (Digital Thread) in a heterogeneous environment based on exchange standards like ISO 10303-239 PLCS. Their solution is based on what I consider a more modern approach for managing digital continuity compared to the traditional approach I described before. Compare the two images in this paragraph: the first image represents the old/current way with a disconnected handover; the second represents the ShareAspace connected approach based on a real digital thread.

The Service BOM

As discussed with Asset Lifecycle Management, there is a disconnect between the engineering disciplines and operations in the field, looking from the point of view of an Asset owner/operator.

Now when we look from the perspective of a manufacturing company that produces assets to be serviced, we can identify a different dataflow and a new structure, the Service BOM (SBOM).

The SBOM provides information on how a product needs to be serviced: what are the parts that require service, and what are the service kits that are possible for that product? For that reason, service engineering should be done in parallel to product engineering. When designing a product, the engineer needs to identify which are the wearing parts (which always require service in time) and which parts might be serviceable.

There are different ways to look at the SBOM. Conceptually, the SBOM could be created in close relation to the EBOM. At the moment you define your product, you should also specify how the product will be serviced. See the image below.

From this example, it is clear that part standardization and modularization have a considerable benefit for services downstream. What if you have only one serviceable part that applies to many products? The number of parts to keep in stock will be strongly reduced, instead of having many similar parts that each fit only a single product.

Depending on the type of product, the SBOM can be generic, serving many products in the field. In that case, the company has to deal with catalogs, to be defined in PLM. Or the SBOM can be aligned with the As-Built of a capital product in the field. In that case, the concepts of Asset Lifecycle Management apply. Click on the image to see a clear picture.

The SBOM on its own, in such an environment, will have links to specific documents: service instructions, operating manuals.

If your PLM-system allows it, extending the EBOM and MBOM with an SBOM is not a complex effort. What is crucial to understand is that the SBOM has its own lifecycle, which can even last longer than the active product being sold. So sometimes manufacturing specifications related to service parts need to be maintained too, creating a link between the SBOM and potential MBOM(s).
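For those who like a concrete illustration, here is a small, hypothetical data sketch of the EBOM–SBOM relation: service kits point back to the (wearing) engineering parts they cover, and each SBOM item carries its own lifecycle state, independent of the product.

```python
from dataclasses import dataclass

@dataclass
class EBOMPart:
    number: str
    description: str
    wearing: bool = False  # wearing parts always require service in time

@dataclass
class ServiceItem:
    number: str
    description: str
    covers: list[str]           # EBOM part numbers replaced by this service kit
    lifecycle: str = "In work"  # the SBOM item has its own lifecycle

ebom = [
    EBOMPart("P-100", "Pump housing"),
    EBOMPart("P-101", "Shaft seal", wearing=True),
    EBOMPart("P-102", "Bearing", wearing=True),
]

# One standardized service kit covering the wearing parts of many pump variants
sbom = [ServiceItem("SK-500", "Seal & bearing kit", covers=["P-101", "P-102"],
                    lifecycle="Released")]

serviceable = {p.number for p in ebom if p.wearing}
covered = {n for item in sbom for n in item.covers}
print("Wearing parts without service coverage:", serviceable - covered)  # -> set()
```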

ECM = Enterprise Change Management

When I discussed ECM in my previous post in the context of Engineering Change Management, I got the feedback that nowadays, everyone talks about Enterprise Change Management. Engineering Change Management is old school.

In the past, and even in a 2014 benchmark, a customer would have two change management systems: one in PLM and one in ERP, and companies were looking into connecting these two processes. Like the BOM-interaction between PLM and ERP, this is, technology-wise, never a real problem.

The real problem in such situations was to come to a logical flow of events. Many times, the company insisted that every change should start from the ERP-system, in the name of standardization. This meant that even an engineering change had to be registered first in the ERP-system.

Luckily the reach of PLM has grown. PLM is no longer the engineering tool (IT-system thinking). PLM has become the information backbone for product information all along the product lifecycle. Having the MBOM and SBOM available through a PLM-infrastructure allows organizations to streamline their processes.

Aras – digital thread through connected structures

And in this modern environment, enterprise change management might take place mostly in a PLM-infrastructure. A PLM-infrastructure providing a digital thread, as the Aras picture above illustrates, gives the full traceability needed to support configuration management.

However, we still have to remember that configuration management and engineering change management, first of all, are based on methodology and processes. Next, the combination of tools to be used will vary.

I like to conclude this topic with a quote from Lee Perrin’s comment on my previous blog post:

I would add that aerospace companies implemented CM, to avoid fatal consequences to their companies, but also to their flying customers.

PLM provides the framework within which to carry out Configuration Management. CM can indeed be carried out without PLM, as was done in the old paper-based days. As you have stated, PLM makes the whole CM process much more efficient. I think more transparent too.

Conclusion

After nine posts around the theme Learning from the past to understand the future, I walked through the history of CAD, PDM and PLM in a fast mode, pointing to practices and friction points. In the blogging space, it is hard to find this information as most blog posts are coming from software vendors explaining why their tool is needed. Hopefully, these series have helped many of you to understand a broader context. Now I want to focus on the future again in my upcoming blog posts.

Still, feel free to contact me and discuss methodology topics.

Picture by Christi Wijnen – a good friend and photographer in the Netherlands

In the previous seven posts of learning from the past to understand the future, we have seen the evolution from manual 2D drawing handling, followed by the emergence of ERP and CAD, and next by data management systems (PDM/PLM) and methodology (EBOM/MBOM), creating an infrastructure for product data from concept towards manufacturing.

Before discussing the extension to the SBOM-concept, I first want to discuss Engineering Change Management and Configuration Management.

ECM and CM – are they the same?

Often, when you talk with people in my PLM bubble, the terms Change Management and Configuration Management are mixed up or not well understood.

When talking about Change Management, we should clearly distinguish between OCM (Organizational Change Management) and ECM (Engineering Change Management). In this post, I will focus on Engineering Change Management (ECM).

When talking about Configuration Management, here too we find two interpretations.

The first one is a methodology describing technically how, in your PLM/CAD-environment, you can build connected data structures in the most efficient way, representing all product variations. This technology varies per PLM/CAD-vendor, and therefore I will not discuss it here. The other interpretation of Configuration Management is described on Wiki as follows:

Configuration management (CM) is a systems engineering process for establishing and maintaining consistency of a product’s performance, functional, and physical attributes with its requirements, design, and operational information throughout its life.

This is the area I will focus on this time.

And, as if great minds think alike and are synchronized, I was happy to see Martijn Dullaart’s recent blog post, referring to a poll and a follow-up article on CM.

Here Martijn touches precisely the topic I address in this post. I recommend you read his post, Configuration Management done right = Product-Centric, first and then continue with the rest of this article.

Engineering Change Management

Initially, engineering change management was a departmental activity performed by engineering to manage the changes in a product’s definition. Other stakeholders are often consulted when preparing a change, which can be minor (affecting, for example, only engineering) or major (affecting engineering and manufacturing).

The way engineering change management has been implemented varies a lot. Over time, companies all around the world have defined their own change methodology, and there is a lot of commonality between these approaches. However, terminology such as revision, version, major change and minor change might all vary.

I described the generic approach for engineering change processes in my blog post: ECR / ECO for Dummies from 2010.

The fact that companies have defined their own engineering change processes is not an issue when it works and is done manually. The real challenge came with PDM/PLM-systems that need to provide support for engineering change management.

Do you leave the methodology 100 % open, or do you provide business logic?

I have seen implementations where an engineer with a right-click could release an assembly without any constraints. Related drawings might not exist, parts in the assembly are not released, and more. To obtain a reliable engineering change management process, the company had to customize the PLM-system to its desired behavior.

An excellent exercise for a system integrator, as there was always a discussion with end users who do not want to be restricted in case of an emergency (“we will complete the definition later” / “too many clicks” / “do I have to approve 100 parts?”). In many cases, the system integrator kept on customizing the system to adapt to all wishes. Often the engineering change methodology on paper was not complete or contained contradictions when trying to digitize the processes.

For that reason, the PLM-vendors that aim to provide Out-Of-The-Box solutions have been trying to predefine certain behaviors in their systems. For example, you cannot release a part when its specifications (drawings/documents) are not released. Or, you cannot update a released assembly without creating a new revision.
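A sketch of what such predefined business logic looks like conceptually (illustrative only, not a particular PLM vendor’s rule engine): a release check that refuses to release an item when its specifications or children are not released.

```python
# Sketch of out-of-the-box release rules; the data model is hypothetical.
class Item:
    def __init__(self, name, state="In work", children=None, specs=None):
        self.name = name
        self.state = state              # "In work" or "Released"
        self.children = children or []  # parts in the assembly
        self.specs = specs or []        # drawings/documents defining the item

def release(item: Item) -> None:
    """Release an item only when all its specifications and children are released."""
    blockers = [s.name for s in item.specs if s.state != "Released"]
    blockers += [c.name for c in item.children if c.state != "Released"]
    if blockers:
        raise ValueError(f"Cannot release {item.name}; not released: {blockers}")
    item.state = "Released"

drawing = Item("DRW-001", state="In work")
part = Item("PART-001", specs=[drawing])
assembly = Item("ASM-001", children=[part])

try:
    release(assembly)
except ValueError as error:
    print(error)  # -> Cannot release ASM-001; not released: ['PART-001']
```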

Such predefined rules speed up the implementation; however, they require more OCM (Organizational Change Management), as naming and methodology probably have to change within the company. This is the continuous battle in PLM-implementations, in particular where the company has a strong legacy or lacks business understanding when implementing PLM.

There is an excellent webcast in this context on Minerva PLM TV – How to Increase IT Project Success with Organizational Change Management.

Click on the image or link to watch this recording.

Configuration Management

When we talk about configuration management, we have to think about managing the consistency of product data along the whole product lifecycle, as we have seen from the Wiki-definition before.

Wiki – the configuration Activity Model

Configuration management existed long before we had IT-systems. Therefore, configuration management is more a collection of activities (see diagram above) to ensure the consistency of information for any given product. Consistent during design, where requirements match product capabilities. Consistent with manufacturing, where the manufacturing process is based on the correct engineering specifications. And consistent with operations, meaning that we have the full definition of the product in the field, the As-Built, in correct relation to its engineering and manufacturing definition.

Source: Configuration management in aerospace industry

This consistency is crucial for products where the cost of an error can have a massive impact on the manufacturer. The first industries that invested heavily in configuration management were the Aerospace and Defense industries. Configuration management is needed in these industries as the products are usually complex, and failure can have a fatal impact on the company. Combined with many regulatory constraints, managing the configuration of a product and the impact of changes is a discipline on its own.

Other industries have also introduced configuration management nowadays. The nuclear power industry and the pharmaceutical industry use configuration management as part of their regulatory compliance. The automotive industry requires configuration management partly for compliance, mainly driven by quality targets – an accident or a recall can be costly for a car manufacturer. Other manufacturing companies all have their own configuration management strategies, mainly depending on their own risk assessment. Configuration management is a proactive discipline – it costs money: time, people and potentially tools to implement it. In my experience, many of these companies try to do “some” configuration management, always hoping that a real disaster will not happen. Proper configuration management allows you to perform a reliable impact analysis for any change (image above).
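At its core, such an impact analysis is a traversal of the relations between product data. A strongly simplified sketch (the relations are hypothetical and far less rich than a real configuration management baseline):

```python
# Simplified sketch: impact analysis as a traversal of "where used" relations.
where_used = {
    "DRW-010": ["PART-010"],             # drawing defines the part
    "PART-010": ["ASM-100", "ASM-200"],  # part used in two assemblies
    "ASM-100": ["MBOM-100"],             # assembly drives a manufacturing BOM
    "ASM-200": ["SBOM-200"],             # and a service BOM
}

def impacted_by(changed: str) -> set[str]:
    """Collect every item that is directly or indirectly affected by a change."""
    impacted, to_visit = set(), [changed]
    while to_visit:
        current = to_visit.pop()
        for user in where_used.get(current, []):
            if user not in impacted:
                impacted.add(user)
                to_visit.append(user)
    return impacted

print(sorted(impacted_by("DRW-010")))
# -> ['ASM-100', 'ASM-200', 'MBOM-100', 'PART-010', 'SBOM-200']
```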

What happens in the field?

When introducing PLM in mid-market companies, the dream was often that with the new PLM-system, configuration management would be there too.

Management believes the tools will fix the issue.

Partly because configuration management deals with a structured approach to managing changes, there was always confusion with engineering change management. Modern PLM-systems all have an impact analysis capability. However, most of the time, this impact analysis only reaches the content that is in the PLM-system. Configuration Management goes further.

If you think that configuration management is crucial for your company, start educating yourselves first before implementing anything in a tool. There are several places where you can learn all about configuration management.

  • Probably the best-known organization is IpX (Institute for Process Excellence), teaching the CM2 methodology. Have a look here: CM2 certification and courses
  • Closely related to IpX, Martijn Dullaart shares his thoughts coming from the field as Lead Architect for Enterprise Configuration Management at ASML (one of the Dutch crown jewels) in his blog: MDUX
  • CMstat, a configuration and data management solution provider, provides educational posts from their perspective. Have a look at their posts, for example, PLM or PDM or CM
  • If you want to have a quick overview of Configuration Management in general, targeted for the mid-market, have a look at this (outdated) course: Training for Small and Medium Enterprises on CONFIGURATION MANAGEMENT. Good for self-study to get an understanding of the domain.

To summarize

In regulated industries, Configuration Management and PLM are a must to ensure compliance and quality. Configuration management and (engineering) change management are, first of all, required methodologies that guarantee the quality of your products. The more complex your products are, the higher the need for change and configuration management.

PLM-systems require embedded engineering change management – part of the PDM domain. Performing Engineering Change Management in a system is something many users do not like, as it feels like overhead. Too much administration or too many mouse clicks.

So far, there is no golden egg that performs engineering change management automatically. Perhaps in a data-driven environment, algorithms can speed up change management processes. Still, there is a need for human decisions.

The same applies to configuration management. If you have a PLM-system that connects all the data from concept, design, and manufacturing in a single environment, it does not mean you are performing configuration management. You need to have processes in place, and depending on your product and industry, the importance will vary.

Conclusion

In the first seven posts, we discussed the design and engineering practices, from CAD to EBOM, ending with the MBOM. Engineering Change Management and, in particular, Configuration Management are methodologies to ensure the consistency of data along the product lifecycle. These methodologies are connected and need to be fit for the future – more on this when we move to modern model-based approaches.

Closing note:

While finishing this blog post today, I read Jan Bosch’s post: Why you should not align. Jan touches the same topic that I try to describe in my series Learning from the Past, as my intention is to make us aware that by holding on to practices from the past, we are blocking our future. His post is highly recommended – a quote:

The problem is, of course, that every time you resist change, you get a bit behind. You accumulate some business, process and technical debt. You become a little less “fitting” to the environment in which you’re operating

In my last post related to Learning from the past to understand the future, I discussed what happened when 3D CAD became available for the mid-market. In the large automotive or aerospace & defense companies, 3D CAD was introduced along a path of defining processes and selecting tools. In the mid-market, 3D CAD started from the other side: first as a productivity tool, without thinking further about changing methodologies or processes.

This approach – starting with 3D CAD without changing processes – has created several complexities. Every company that is aiming to move towards a digital future needs to reduce complexity to remain competitive. Now let us focus on the relation between the 3D CAD-structure and a BOM.

The 3D CAD-structure

When building a product in a 3D CAD system, the concept is that you have individual parts designed in 3D.  Every single part has a unique identifier.

If possible, the (file) name would equal the physical part number.

Next, a group of parts can be stored as a subassembly. Such an assembly is sometimes called a phantom assembly when it only groups together several 3D parts. The usage of this type of assembly increased CAD productivity. For data management reasons, these assemblies need to have a unique identifier, preferably not from the same numbering scheme as physical part numbers; that would consume part numbers that would never be used during manufacturing.

Note: in the early days of connecting 3D CAD to ERP, there was a considerable debate about which system could generate the part number.

ERP had always been the leading system for part definition – why change? And why generate part numbers that might never be used in production? "Wasting" part numbers was considered bad practice, as historically the part number was like a catalog number: 6 to 7 digits.

Next, there is another group of subassemblies that represent one or more primary components of a product – for example, a pump assembly that combines the pump, the motor, and the base frame. This type of assembly usually appears high in the CAD-structure. Regarding the required identifier, these can be treated as phantom assemblies too.

Finally, there might be parts in the CAD-structure that will not exist as real parts but are created during the manufacturing process. Sheet metal parts are one example. Cappings, strips, and cables shown in the CAD-structure might come from materials that are purchased in standardized sizes (1 meter / 2 meter / 10 meter) and cut during manufacturing. Here, the instances in the CAD-structure will have a unique identifier. What type of identifier to use depends on the manufacturing process: it might be a physical part number, as it is a half-fabricate, or it remains a unique identifier for the CAD-structure only.
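To make these identifier types tangible, below is a minimal sketch of such a CAD-structure in Python. The class, fields, and numbering schemes are mine, purely for illustration – they do not come from any particular CAD- or PDM-system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CadNode:
    """One node in the 3D CAD-structure (a part or an assembly)."""
    identifier: str           # CAD identifier, ideally equal to the file name
    part_number: str = ""     # physical part number, empty for CAD-only nodes
    children: List["CadNode"] = field(default_factory=list)

# Physical parts: the CAD identifier equals the physical part number.
pump = CadNode("PN-100200", part_number="PN-100200")
motor = CadNode("PN-100201", part_number="PN-100201")
frame = CadNode("PN-100202", part_number="PN-100202")

# A primary-component subassembly; like a phantom assembly it gets a
# CAD-only identifier, so no physical part number is consumed.
pump_assembly = CadNode("ASM-000517", children=[pump, motor, frame])

# A part that only exists after manufacturing (e.g., a cable cut to
# length); it may keep a CAD-only identifier or get a half-fabricate
# part number, depending on the manufacturing process.
cable = CadNode("CAD-000900")

product = CadNode("ASM-000001", children=[pump_assembly, cable])
```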

The reason I keep coming back to these identifiers is that, as described before, companies wanted to keep a relation between the part number and the file name.

There was a problem with flexible parts. A rubber hose with a specific length could be shaped differently in an assembly, based on its connection. Two different shapes would create two files and therefore break the rule that a part number equals the file name. The 3D CAD vendors "solved" this issue by storing configurable views of the same part inside one file, allowing the user to select the active view.

Later, we will see that managing views inside the 3D CAD model is not a wrong choice – contrary to managing different configurations of a part/product inside a single file, which creates complexity in the PLM domain.

In the end, the product became an assembly with several levels of subassemblies. At that time, when I worked a lot with CAD-integrations, the average depth of 3D CAD-structures was 6 to 7 levels deep, with exceptions in both directions.

The entire product CAD-structure is mainly used for a final digital mock-up, allowing engineers to analyze the full product behavior. One of my favorite YouTube videos is the one from Airbus – seven years ago, they described the power of a full digital mock-up used for the A380.

In ETO-processes, the 3D CAD-structure is unique for a given customer solution – like the A380.

In the case of large assemblies with many parts and subassemblies, there were situations where the full product could no longer be resolved. For Airbus a must, for the mid-market not always easy to achieve. Graphics memory, combined with the way graphics were represented, was the major constraint. The gaming world has solved this performance issue; however, there the 3D representation no longer has the required accuracy or definition.

The Version pop-up problem

Working with a 3D CAD-structure created a new problem when designers shared parts and assemblies among themselves and with suppliers. The central storage of the files required a versioning mechanism, supported by check-in and check-out.

Depending on the type of 3D CAD integration, the PDM system generated a new minor revision of the file after every check-in. In this way, there was full traceability of the changes before release. The image below shows an example of how SmarTeam dealt with minor and major revisions combined with lifecycle stages.

When revising a part, all assemblies that contain the changed part need to be updated too, if you want traceability and want to prevent others from overwriting your version – making sure each assembly file points to the right file version again. In the case of a 6-level deep CAD-structure, this has led to a lot of methodology discussions on how to deal (or not to deal) with file changes.
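A minimal sketch of this cascade, assuming a simplified where-used index and a made-up minor-revision convention – the names and the scheme are illustrative, not how SmarTeam or any specific PDM system works:

```python
# Hypothetical where-used index: child -> parent assemblies.
where_used = {
    "hose-001": ["pump-asm"],
    "pump-asm": ["unit-asm"],
    "unit-asm": ["product-asm"],
}

revisions = {
    "hose-001": "A.3", "pump-asm": "A.1",
    "unit-asm": "B.2", "product-asm": "A.0",
}

def bump_minor(rev: str) -> str:
    """Increase the minor revision, e.g. 'A.3' -> 'A.4' (on check-in)."""
    major, minor = rev.split(".")
    return f"{major}.{int(minor) + 1}"

def check_in(item: str) -> None:
    """A check-in bumps the item and cascades to every parent assembly,
    so each assembly file points to the new file version again."""
    revisions[item] = bump_minor(revisions[item])
    for parent in where_used.get(item, []):
        check_in(parent)

check_in("hose-001")
print(revisions)
# {'hose-001': 'A.4', 'pump-asm': 'A.2', 'unit-asm': 'B.3', 'product-asm': 'A.1'}
```

One changed hose produces new versions of three assemblies above it – multiply this by a 6-level structure and hundreds of parts, and the methodology problem becomes visible.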

In the case of a unique delivery for a customer – the ETO-process – the issue might not be so big. As everything in the 3D CAD-structure is work in progress, you only need to be sure during the release process of the 3D CAD-structure that all parts and assemblies are resolved to the latest version (and verified).

Making changes to an existing product is far more complicated, as assemblies have been released and parts exist in production. In that case, the Bill of Material is the leading structure to control the versions and the change impact, as we will see.

Note: Most CAD- and PLM-vendors love to show demos where, starting from the CAD-structure, a product gets created (the ETO-process). The reality is that most companies do not start from the CAD-structure but from an existing Bill of Material. In 2010, I wrote a few posts discussing the relation between CAD and the BOM:

to explain that there is more than a CAD-driven scenario.

The EBOM

In most PDM-systems with CAD-integrations, it is possible to create a Bill of Materials from the 3D CAD-structure. The Bill of Materials will be based on the parts inside the 3D CAD-structure, often with an option to filter out phantom assemblies.

The structures are not the same. The 3D CAD-structure is instance-based, whereas the extracted Bill of Materials summarizes the part quantities per level. See the image below: there are four Wheel instances in the CAD-structure, while in the EBOM-structure there is only one Wheel reference with quantity 4.
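This summarization step is easy to sketch in code. Below is a minimal Python example, assuming a simple dictionary-based CAD-structure; the names are illustrative, not a specific PDM API:

```python
from collections import Counter

# Hypothetical instance-based CAD-structure: each entry lists the child
# instances of an assembly (a repeated entry = a repeated instance).
cad_structure = {
    "car-asm": ["wheel", "wheel", "wheel", "wheel", "body"],
    "body": ["door", "door"],
}

def extract_ebom(assembly: str, ebom=None) -> dict:
    """Summarize instances per level into (child -> quantity) BOM lines."""
    if ebom is None:
        ebom = {}
    children = cad_structure.get(assembly, [])
    if children:
        ebom[assembly] = Counter(children)  # four wheel instances -> qty 4
        for child in set(children):
            extract_ebom(child, ebom)
    return ebom

print(extract_ebom("car-asm"))
# {'car-asm': Counter({'wheel': 4, 'body': 1}), 'body': Counter({'door': 2})}
```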

I named the structure on the right the EBOM, as it represents the Bill of Materials from the engineering point of view. This definition is a little arbitrary, as we will see. In companies that developed products starting from a conceptual BOM, this conceptual BOM was often an "early" EBOM that had to be developed further. Such an EBOM represented a logical or modular structure driving the design, rather than an extract from the 3D CAD-structure. In the next post, I will zoom in on these differences. I want to conclude this time with a critical methodology needed to manage 3D CAD-structure changes in relation to an EBOM.

Breaking the rule Drawing ID (Model ID) = Part ID

Although I have been writing mostly about the 3D CAD-structure, I want to remind us that in the mid-market, the 3D Model is mainly used for design purposes. The primary deliverable for manufacturing or a supplier is still a 2D-drawing in most companies. The 3D Model might be "nice to have" for CAM- or quality usage. Still, in case of a dispute, the 2D-drawing will be leading.

For that reason, in many mid-market companies, there was the relation shown below:

In an environment without file versioning through check-in/check-out, this relation was easy to maintain. In the electronic world, every change in the 3D Model (which could be an assembly) triggers a new file version and therefore, most of the time, a new version of the drawing and the physical part. However, you do not want a physical part with many revisions, in particular when this part is used in a Bill of Material.

To solve this issue, the Physical Part and the related Drawing/Model should have different lifecycles. The relation between the Physical Part and the Drawing/Model should no longer be based on numbers but on a relation in the PDM/PLM-system. One of the main characteristics of a PDM/PLM-system is that it allows users to navigate through relations to find information in context. For example, solving a Where-Used question takes only a (few) mouse-click(s) in a PDM/PLM-system.
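Below is a minimal sketch of what such a relation-based link could look like, assuming a simple Python representation; the identifiers, the relation name, and the helper function are hypothetical, not from any PDM/PLM product:

```python
# Hypothetical example: part and drawing/model are separate objects with
# their own revisions, linked by an explicit relation instead of a
# shared number. All identifiers below are made up.

parts = {"PART-100234": {"revision": "B"}}
documents = {"DRW-55871": {"revision": "B.4"},
             "MOD-55871": {"revision": "B.7"}}

# Relations stored as (document, relation-type, part) triples.
relations = [
    ("DRW-55871", "defines", "PART-100234"),
    ("MOD-55871", "defines", "PART-100234"),
]

def documents_for(part_id: str) -> list:
    """Navigate the relations: which drawings/models define this part?"""
    return [doc for doc, rel, part in relations
            if rel == "defines" and part == part_id]

print(documents_for("PART-100234"))  # ['DRW-55871', 'MOD-55871']
```

The same relation navigation answers a Where-Used question in the other direction: from a part to the structures and documents that reference it, without any number equality between them.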

Click on the image to see the details.

Breaking this one-to-one numbering rule is a must if you want to evolve to an item-centric or data-driven PLM-environment. When to introduce this change and how to implement this new behavior is a methodology exercise, not an implementation of a new tool.

There is a lot to read about this topic, as it is related to the Form-Fit-Function discussion we had earlier this year. A collection of information can be found in these two LinkedIn posts, where the comments provide the insights:

 

I will not dive deeper into this theme (reached 1700 words ☹) – next time, I will zoom in on the EBOM and leave the world of 3D CAD behind (for a while).


This time, a short post (for me) as I am in the middle of the series "Learning from the past to understand the future" and currently collecting information for next week's post. However, recently Rob Ferrone, the original Digital Plumber, pointed me to an interesting post from Scott Taylor, the Data Whisperer.

In code: The Virtual Dutchman discovered the Data Whisperer thanks to the original Digital Plumber.

Scott’s article with the title: “Data Management Hasn’t Failed, but Data Management Storytelling Has” matches precisely the discussion we have in the PLM community.

Please read his article, and just replace the words Data Management with PLM – it could have been written for our community. In a way, PLM is a specific application of data management, so that is not a real surprise.

Scott’s conclusions give food for thought in the PLM community:

To win over business stakeholders, Data Management leadership must craft a compelling narrative that builds urgency, reinvigorates enthusiasm, and evangelizes WHY their programs enable the strategic intentions of their enterprise. If the business leaders whose support and engagement you seek do not understand and accept the WHY, they will not care about the HOW. When communicating to executive leadership, skip the technical details, the feature functionality, and the reference architecture and focus on:

  • Establishing an accessible vocabulary
  • Harmonizing to a common voice
  • Illuminating the business vision

When you tell your Data Management story with that perspective, it can end happily ever after.

It all resonates well with what I described in the PLM ROI Myth. It is clear that when people hear the word Myth, it carries a bad connotation – the same, by the way, goes for PLM.

We still need to learn storytelling because most of us are so focused on technology, and sometimes on discovering the future name for PLM.

Last week, I pointed to a survey from the PLMIG (PLM Interest Group) and XLifecycle, inviting you to help define the future definition of PLM.

You are still welcome here: Towards a digital future: the evolving role of PLM in the future digital world.

Also, I saw a great interview with Martin Eigner on Minerva PLM TV, conducted by Jennifer Moore. Martin is well known in the PLM world and has done foundational work for our community. According to Jennifer, he is considered The Godfather of PLM. This title fits nicely in today's post. Those who have seen his presentations in recent years will remember that Martin talks about SysLM (System Lifecycle Management) as the future of PLM.

It is an interesting recording to watch – click on the image above to see it. Martin explains nicely why we often do not get positive feedback from PLM implementations – starting at minute 13, for those who cannot wait.

In the interview, you will discover that we often talk too much about our discipline's capabilities, where the real discussion should be about business. Strategy and objectives are discussed and decided at the management level of a company. By using storytelling, we can connect to these business objectives.

The result will more likely be that a company understands why to invest significantly in PLM, as PLM then becomes part of its competitiveness and future continuity.

Conclusion

I shared links to two interesting posts from the last weeks. Studying them will help you to create a broader view. We have to learn to tell the right story. People do not want PLM – they have personal objectives. Companies have business objectives, and they might lead to the need for a new and changing PLM. Connecting to the management in an organization, therefore, is crucial.

Next week, again more about learning from the past to understand the future.
