In my previous posts dedicated to PLM education, I shared my PLM bookshelf, spoke with Peter Bilello from CIMdata about their education program and talked with Helena Gutierrez from Share PLM about their education mission.

In that last post, I promised this post would be dedicated to PLM education before the s**t hits the fan. The phrase comes from my conversation with John Stark, when we discussed where proper PLM education should start (before it hits the fan).

John is a well-known author of many books. You might have read my post about his book Products2019: A project to map and blueprint the flow and management of products across the product lifecycle: Ideation; Definition; Realisation; Support of Use; Retirement and Recycling – a book with a very long title, reflecting the complexity of a PLM environment.

John is also a long-time PLM consultant, known in the early PLM community for his 2PLM e-zine. The 2PLM e-zine was a newsletter he published between 1998 and 2017, before blogging and social media took over, updating everyone in the PLM community with the latest news. You were probably subscribed to this e-zine if you are my age.

So, let’s learn something more from John Stark.

John Stark

John, first of all, thanks for this conversation. We have known each other for a long time. Can you briefly introduce yourself and explain where your passion for PLM comes from?

The starting point for my PLM journey was that I was involved in developing a CAD system. But by the 1990s, I had moved on to being a consultant. I worked with companies in different industry sectors, with very different products.

I worked on application and business process issues at different product lifecycle stages – Ideation; Definition; Realization; Support of Use; Retirement and Recycling.

However, there was no name for the field I was working in at that time. So, I decided to call it Product Lifecycle Management and came up with the following definition:
‘PLM is the business activity of managing, in the most effective way, a company’s products all the way across their lifecycles; from the very first idea for a product, all the way through until it is retired and disposed of’.

PLM is the management system for a company’s products. It doesn’t just manage one of its products. It manages all of its parts and products and the product portfolio in an integrated way.

I put that definition at the beginning of a book, ‘Product Lifecycle Management: Paradigm for 21st Century Product Realization’, which was published in 2004 and has since become the most cited book about PLM. I included my view of the five phases of the product lifecycle

and created the PLM Grid to show how everything (products, applications, product data, processes, people, etc.) fits together in PLM.

From about 2012, I started giving a blended course, The Basics of PLM, with the PLM Institute in Geneva.

As for the passion, I see PLM as important for Mankind. The planet’s 7 billion inhabitants all rely on products of various types, and the great majority would benefit from faster, easier access to better products. So PLM is a win-win for us all.

That’s interesting. I also had a nice definition picture I used in my early days.

PI London 2011

and I had my view of the (disconnected) lifecycle.

PI Apparel London 2014

The education journey

John, as you have been active in PLM education for more than twenty years, do you feel that PLM education and training have changed?

PLM has only existed for about twenty years. Initially, it was so new that there was just one approach to PLM education and training, but that’s changed a lot.

Now there are specific programs for each of the different types of people interested or involved with PLM. So, for example, now there are specific courses for students, PLM application vendor personnel, PLM Managers, PLM users, PLM system integrators, and so on. Each of these groups has a different need for knowledge and skills, so they need different courses.

Another big change has been in the technologies used to support PLM Education and Training. Twenty years ago, the course was usually a deck of PowerPoint slides and an overhead projector. The students were in the same room as the instructor.

These days, courses are often online and use various educational apps to help course participants learn.

Who should be educated?

Having read several of your books, I find them very structured and academic. Therefore, they will probably never be read by people at the C-level of an organization. Who are you targeting with your books, and why?

Initially, I wasn’t targeting anybody. I was just making my knowledge available. But as time went by, I found that my books were mainly used in further education and ongoing education courses.

So now, I focus on a readership of students in such organizations. For example, I’ve adapted some books to have 15 chapters to fit within a 15-week course.

Students make up a good readership because they want to learn to pass their exams. In addition, it’s a worldwide market: the books are used in courses in more than twenty countries. Also, these courses are sufficiently long, maybe 150 hours, for the students to learn in-depth about PLM. That’s not possible with the type of very short PLM training courses that many companies provide for their employees.

PLM education

Looking at publicly available PLM education, what do you think we can do better to get PLM out of the framing of an engineering solution and make it a point of discussion at the C-level?

Even today, PLM is discussed at C-level in some companies. But in general, the answer is to provide more education about PLM. Unfortunately, that will take time, as PLM remains very low profile for most people.

For example, I’m not aware of a university with a Chair of Product Lifecycle Management. But then, PLM is only twenty years old; that’s very young.

It often takes two generations for new approaches and technologies to become widely accepted in the industry.

So another possibility would be for leading vendors of PLM applications to make the courses they offer about PLM available to a wider audience.

A career with PLM?

Educating students is a must, and besides you and me, there are a lot of institutions that have specialized PLM courses. However, I have also noticed that a PLM expert at the C-level of an organization is an exception; most of the time, people with a financial background get promoted. So, is PLM bad for your career?

No, people can have a good career in PLM, especially if they keep learning. There are many good master’s courses if they want to learn more outside the PLM area. I’ve seen people with a PLM background become a CIO or a CEO of a company with thousands of employees. And others who start their own companies, for example, PLM consulting or PLM training. And others become PLM Coaches.

PLM and Digital Transformation

A question I ask in every discussion. What is the impact of digital transformation on your area of expertise? In this case, how do you see PLM Education and Training looking in 2042, twenty years in the future?

I don’t see digital transformation really changing the concept of PLM over the next twenty years. In 2042, PLM will still be the business activity of managing a company’s products all the way across their lifecycles.

So, PLM isn’t going to disappear because of digital transformation.

On the other hand, the technologies and techniques of PLM Education and Training are likely to change – just as they have over the last twenty years. And I would expect to see some Chairs of Product Lifecycle Management in universities, with more students learning about PLM. And better PLM training courses available in companies.

I see digital transformation making it possible to have an entire connected lifecycle without a lot of overhead.

Digital Transformation – platforms working together

Want to learn more?

My default closing question is always about giving the readers pointers to more relevant information. Maybe overkill, looking at your oeuvre as a writer. Still, the question is: where can readers of this blog learn more?

Three suggestions:

What I learned

Talking with John and learning his opinion, I see the academic approach as defining PLM more scientifically, creating a space for the PLM professional.

In the past (2017), we had some blog/LinkedIn interaction related to this question: Should PLM become a profession?

When I search on LinkedIn, I find 87,000 people with the “PLM Consultant” tag. From those I know in my direct network, I am aware of the great variety of skills these PLM consultants have. However, I believe it is too late to establish a PLM Professional role definition.

John’s focus is on providing students and others interested in PLM with a broad, fundamental knowledge before they get into business. In their day-to-day jobs, these people will benefit from knowing the bigger context and understanding the complexity of PLM.

This is also illustrated in Products2019, where the focus is on the experience – company culture and politics.

Due to the diversity of PLM, we will never be able to define the PLM professional’s job role as precisely as the Configuration Manager’s. Still, both disciplines are crucial and needed for a sustainable, profitable enterprise.

Conclusion

In this post, we explored a third dimension of PLM education, focusing on a foundational approach, in particular targeting students to get educated on all aspects of PLM. John is not the only author of educational books. I have several others in my network who have described PLM in their own wording and often in their own language. Unfortunately, there is no central point of reference, and I believe we are too late for that due to the tremendous variety in PLM.

Next week, I will talk with a Learning & Development leader from a company providing PLM consultancy – let’s learn how they enable their employees to support their customers.

In my previous posts dedicated to PLM education, I shared my PLM bookshelf and spoke with Peter Bilello from CIMdata about their education program. This time I am talking with Helena Gutierrez, one of the founders of Share PLM.

They are a young and energetic company with a mission to make PLM implementations successful, not through technology or customization, but through education and training.

Let’s discover their mission.

Share PLM

Helena, let me start with the brilliant name you have chosen for the company: Share PLM. Sharing (information) is the fundamental concept of PLM; if you don’t aim to share from the start, you won’t be able to fix it later. Can you tell us more about Share PLM’s mission and where you fit in the PLM ecosystem? 

Jos, first of all, thank you for the invitation to your blog! That’s a great question. In my previous job, as a young PLM director at the former Outotec, nowadays Metso Outotec, I realized how much I could learn from sharing experiences with other professionals.

I thought that by bringing people together from different companies with different backgrounds, PLM professionals could learn and get prepared for some of their projects.

In the beginning, I envisioned some kind of a marketplace, where people could also sell their own resources. A resource I often missed was some kind of POC template for a new deployment, these kinds of things.

I still remember the face of my boss at that time, Sami Grönstrand, when I told him that I wanted to sell templates. [laugh]

A lot has happened since then, and we have evolved into a small niche where we can offer a lot of value.

Software vendors keep their PLM systems generic. And almost every company needs to adapt their systems to their company reality: their processes, their system architecture, and their people.

The key questions are: How can I map my company’s processes and the way we work to the new system? How can I make sense of the new systems and help people understand the big picture behind the system clicks?

That’s where we come in.

Education or Training

With Peter Bilello, we discussed the difference between education and training. Where would you position Share PLM?

This is an interesting differentiation – I must say I hadn’t heard of it before, but it makes sense.

I think we are in the middle of the two: theory and practice. You see, many consulting companies focus on the “WHY”, the business needs. But they don’t touch the systems. So don’t ask them to go into Teamcenter or OpenBOM, because they want to stay at a theoretical level.

Some system integrators get into the system details, but they don’t connect the clicks to the “WHY”, to the big picture.

The connection between the “WHY” and the “HOW” is really important to get the context, to understand how things work.

So that’s where we are very strong. We help companies connect the “WHY” and the “HOW”. And that’s powerful.

 

The success of training

We are both promoting the importance of adequate training as part of a PLM implementation. Can you share with us a best-in-class example where training really made a difference? Can we talk about ROI in the context of training?

Jos, I think when I look at our success stories, most good examples share some of the following characteristics:

  • All start with “WHY”, and they have a story. 

In today’s world, people want to understand the “WHY”. So in practical terms, we work with customers to prepare a storyline that helps understand the “WHY” in a practical and entertaining manner.

  • All have a clear, top-down visualization of the process and related use cases.

This is simple, but it’s a game-changer. When people see the big picture, something “clicks,” and they feel “safe” at first sight. They know there is a blueprint for how things work and how they connect.

  • All have quick, online answers to their questions. 

A digital knowledge base where people can find quick answers and educate themselves.

This is one example of a knowledge base from one of our customers, OpenBOM. As you can see from the link below, they have documented how the system should be used in their knowledge base. In addition, they have a set of online eLearning courses that users can take to get started.

  • All involve people in the training and build a “movement”.  

People want to be heard and to be part of something. Engaging people in user communities is a great way both to learn from your users and to make them part of your program. Bringing people together and putting them at the center of your training is, I think, key to success.

 

Training for all types of companies?

Do you see a difference between large enterprises and small and medium enterprises regarding training? Where would your approach fit best?

Yes, absolutely. And I think the most important difference is speed.

A big company can afford to work on all the elements I described before at the same time because they have the “horsepower” to drive different tracks. They can involve different project managers, and they can finance the effort.

Small companies start small and build their training environment slowly. Some might do some parts by themselves and use our services to guide them through the process.

I enjoy both worlds – the big corporations have big budgets, and you can do cool stuff.

But the small startups have big brains, and they are often very passionate about what they are doing. I enjoy working with startups because they dare to try new things, and they are very creative.

Where Is Share PLM Training Different?

I see all system integrators selling PLM training. In my SmarTeam days, I also built some “Express” training – where are you different?

When I started Share PLM, we participated in a startup accelerator. When I was explaining our business model, they asked me the question: “Aren’t the software vendors or the system integrators doing exactly what you do?”

And the answer, incredibly, is that system integrators are often not interested in training and documentation, and they just don’t do it well, as they have no didactical background.

Sometimes it’s even the same person who configured the system who gets the task to create the training. Those people produce boring “technical” manuals, using thousands of PowerPoint slides with no soul – who wants to read that?

No wonder PLM training has a bad reputation!

We are laser-focused on digital training, and our training is very high quality. We are good at connecting pieces of information and making sense of complex stuff. We are also strong at aesthetics, and our training looks good. The content is nicely presented when you open our courses, and people look forward to reading it!

Digital Transformation and PLM

I always ask when talking with peers in the PLM domain: How do you see the digital transformation happening at your customers, and how can you help them?

An interesting question. I see that boundaries between systems are getting thinner. For example, some time ago, you would have a program to deploy a PLM system.

Now I see a lot of “outcome-based” programs, where you focus on the business value and use adequate systems to get there.

For example, a program to speed up product deliveries or improve quality. That type of program involves many different systems and teams. It relates to your “connected enterprise” concept.

This transformation is happening, and I think we are well-positioned to help companies make sense of the connection between different systems and how they digitize their processes.

 

Want to learn more?

Thanks, Helena, for sharing your insights. Are there any specific links you want to provide to learn more? Perhaps some books to read or conferences to visit?

Thanks so much, Jos, for allowing me to discuss this with you today. Yes, I always recommend reading blogs and books to stay up-to-date.

  • We both have good blogging and reading lists on our websites. See on our blog the post The 12 Best PLM Blogs To Follow or the recommendations on your PLM Bookshelf.
  • Conferences are also great for connecting with other people. In general, I think it’s very helpful to see examples from other companies to get inspired.
  • And we have our podcast – to my knowledge, the only one you will find when you search for PLM, as this form of interaction is still new.

I’m happy to provide some customer references for people who want to learn more about what good training looks like in practice. Just get in touch with me on LinkedIn or through our website.

What I learned

I have known the founders of Share PLM since they were active at Outotec, eager to discuss and learn new PLM concepts. It is impressive to see how they made the next step, launching their company Share PLM and finding a niche that I somehow try to cover too in a similar manner.

When I started my blog virtualdutchman.com in 2008, I wanted to share PLM experiences and knowledge.

Read my 2008 opening post here. It was one-way sharing – modern at that time – probably getting outdated in the coming years.

However, Helena and the Share PLM team have picked up my mission in a modern manner. They are making PLM accessible and understandable in your company, using a didactical and modern approach to training.

Share PLM perhaps does not address the overall business strategy for PLM yet, as their focus is on the execution level, with a refreshing and modern approach centered on the end-user, didactics and attractiveness. I expect that ten years from now, with the experience and the professional team, they will pick up this part too, allowing me to retire.

Conclusion

This was the second post around PLM and education, mainly focusing on what is happening in the field. Where I see CIMdata’s focus on education at the business strategy level, I see Share PLM’s focus on the execution level, making sure the PLM implementation is fun for the end-user and therefore beneficial for the company. The next post will again be about PLM education, this time before the s**t hits the fan. Stay tuned.

 

In my previous post, “My PLM Bookshelf,” on LinkedIn, I shared some of the books that influenced my thinking related to PLM. As you can see in the LinkedIn comments, other people added their recommendations for PLM-related books to get inspired or more knowledgeable.

 

Where reading a book is a personal activity, now I want to share with you how to get educated in a more interactive manner related to PLM. In this post, I talk with Peter Bilello, President & CEO of CIMdata. If you haven’t heard about CIMdata and you are active in PLM, there is more to learn on their website HERE. Now let us focus on education.

CIMdata

Peter, knowing CIMdata from its research relevant to the whole PLM community, I am curious to learn what typical kind of training CIMdata provides to its customers.

Jos, throughout much of CIMdata’s existence, we have delivered educational content to the global PLM industry. With a core business tenet of knowledge transfer, we began offering a rich set of PLM-related tutorials at our North American and pan-European conferences, starting in the early 1990s.

Since then, we have expanded our offering to include a comprehensive set of assessment-based certificate programs in a broader PLM sense, for example, systems engineering and digital transformation-related topics. In total, we offer more than 30 half-day classes, all of which can be delivered in person as a custom configuration for a specific client, or through public virtual-live or in-person classes. We have certificated more than 1,000 PLM professionals since the introduction of this PLM Leadership offering in 2009.

Based on our experience, we recommend that an organization’s professional education strategy and plans address the organization’s specific processes and enabling technologies. This will help ensure that it drives the appropriate and consistent operations of its processes and technologies.

For that purpose, we expanded our consulting offering to include a comprehensive and strategic digital skills transformation framework. This framework provides an organization with a roadmap that can define the skills an organization’s employees need to possess to ensure a successful digital transformation.

In turn, this framework can be used as an efficient tool for the organization’s HR department to define its training and job progression programs that align with its overall transformation.

 

The success of training

We are both promoting the importance of education to our customers. Can you share with us an example where Education really made a difference? Can we talk about ROI in the context of training?

Jos, I fully agree. Over the years, we have learned that education and training are often minimized (i.e., sub-optimized). This is unfortunate and has usually led to failed or partially successful implementations.

In our view, both education and training are needed, along with strong organizational change management (OCM) and a quality assurance program during and after the implementation.

In our terms, education deals with the “WHY” and training with the “HOW”. Why do we need to change? Why do we need to do things differently? And then “HOW” to use new tools within the new processes.

We have seen far too many failed implementations where sub-optimized decisions were made due to a lack of understanding (i.e., a clear lack of education). We have also witnessed training and education being done too early or too late.

This leads to a reduced Return on Investment (ROI).

Therefore a well-defined skills transformation framework is critical for any company that wants to grow and thrive in the digital world. Finally, a skills transformation framework needs to be tied directly to an organization’s digital implementation roadmap and structure, state of the process, and technology maturity to maximize success.

 

Training for every size of company?

When CIMdata conducts PLM training, is there a difference, for example, when working with a big global enterprise or a small and medium enterprise?

You might think the complexity is similar; however, the amount of internal knowledge might differ. So how do you deal with that?

We basically find that the amount of training and education required depends mostly on the implementation scope, meaning the scope of the proposed digital transformation and the current maturity level of the impacted user community.

It is important to measure the current maturity and establish appropriate metrics to measure the success of the training (e.g., are people, once trained, using the tools correctly).

CIMdata has created a three-part PLM maturity model that allows an organization to understand its current PLM-related organizational, process, and technology maturity.

The three-part PLM maturity model

The PLM maturity model provides an important baseline for identifying and/or developing the appropriate courses for execution.

This also allows us, when we are supporting the definition of a digital skills transformation framework, to understand how the level of internal knowledge might differ within and between departments, sites, and disciplines. All of which help define an organization-specific action plan, no matter its size.

 

Where is CIMdata training different?

Most of the time, PLM implementers also offer training to their prospects or customers. So, where is CIMdata training different?

 

For this, it is important to differentiate between education and training. CIMdata provides education (the why), as well as training and education strategy development and planning.

We don’t provide training on how to use a specific software tool. We believe that is best left to the systems integrator or software provider.

While some implementation partners can develop training plans and educational strategies, they often fall short in helping an organization to effectively transform its user community. Here we believe training specialists are better suited.

 

Digital Transformation and PLM

One of my favorite topics is the impact of digitization in the area of product development. CIMdata introduced the Product Innovation Platform concept to differentiate from traditional PDM/PLM. Who needs to get educated to understand such a transformation, and what does CIMdata contribute to this understanding?

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

It isn’t just about product development, sales, or enabling a new service experience. It is about maximizing a company’s ROI in applying and leveraging digital as needed throughout the organization. The only way an organization can do this successfully is by taking an end-to-end approach.

The Product Innovation Platform is focused on end-to-end product lifecycle management. Therefore, it must work within the context of other enterprise processes that are focused on the business’s resources (i.e., people, facilities, and finances) and on its transactions (e.g., purchasing, paying, and hiring).

As a result, an organization must understand the interdependencies among these domains. If they don’t, they will ultimately sub-optimize their investment. It is these and other important topics that CIMdata describes and communicates in its education offering.

The Product Innovation Platform in a digital enterprise

More than Education?

As a former teacher, I know that one-time education – a good book or a slide deck – is not enough to become educated. How does CIMdata provide a learning path or coaching path to their customers?

Jos, I fully agree. Sustainability of a change and/or improved way of working (i.e., long-term sustainability) is key to true and maximized ROI. Here I am referring to the sustainability of the transformation, which can take years.

With this, organizational change management (OCM) is required. OCM must be an integral part of a digital transformation program and be embedded into a program’s strategy, execution, and long-term usage. That means training, education, communication, and reward systems all have to be managed and executed on an ongoing basis.

For example, OCM must be executed alongside an organization’s digital skills transformation program. Our OCM services focus on strategic planning and execution support. We have found that most companies understand the importance of OCM but often don’t fully follow through on it.

 

A model-based future?

During the CIMdata Roadmap & PDT conferences, we have often discussed the importance of Model-Based Systems Engineering methodology as a foundation of a model-based enterprise. What do you see? Is it only the big Aerospace and Defense companies that can afford this learning journey, or should other industries also invest? And if yes, how should they start?

Jos, here I need to step back for a minute. All companies have to deal with increasing complexity for their organization, supply chain, products, and more.

So, to optimize its business, an organization must understand and employ systems thinking and system optimization concepts. Unfortunately, most people think of MBSE as an engineering discipline. This is unfortunate because engineering is only one of the systems of systems that an organization needs to optimize across its end-to-end value streams.

The reality is that all companies can benefit from MBSE, as long as they consider optimization across their specific disciplines, in the context of their products and services, and where they exist within their value chain.

MBSE is not just for Aerospace and Defense companies. Still, a lot can be learned from what has already been done there. Leading automotive companies are also implementing and using MBSE to design and optimize semi- and highly automated vehicles (i.e., systems of systems).

The starting point is understanding your systems of systems environment and where bottlenecks exist.

There should be no doubt: education is needed on MBSE and on how MBSE supports the organization’s Model-Based Enterprise requirements.

Published work from the CIMdata-administered A&D PLM Action Group can be helpful, as can various MBE and systems engineering maturity models, such as the one CIMdata utilizes in its consulting work.

Want to learn more?

Thanks, Peter, for sharing your insights. Are there any specific links you want to provide to get educated on the topics discussed? Perhaps some books to read or conferences to visit?

Jos, as you already mentioned:


  • the CIMdata Roadmap & PDT conferences have provided a wealth of insight into this market for more than 25 years.
    [Jos: Search for my blog posts starting with the text: “The weekend after ….”]
  • In addition, there are several blogs, like yours, that are worth following, and websites, like CIMdata’s pages for education and other resources, which are filled with downloadable reading material.
  • Additionally, there are many user conferences from PLM solution providers and third-party conferences, such as those hosted by the MarketKey organization in the UK.

These conferences have taken place in Europe and North America for several years. Information exchange and formal training and education are offered at many events. Additionally, they provide an excellent opportunity for networking and professional collaboration.

What I learned

Talking with Peter made me again aware of a few things. First, it is important to differentiate between education and training. Where education is a continuous process, training is an activity that must take place at the right time. Unfortunately, we often mix those two terms and believe that people are educated after having followed a training course.

Secondly, investing in education is as crucial as investing in hard- or software. As Peter mentioned:

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

System Thinking is not just an engineering term; it will be a mandate for managing a company, a product and even a planet into the future.

Conclusion

This time a quote from Albert Einstein, supporting my PLM coaching intentions:

“Education is not the learning of facts
but the training of the mind to think.”

 

After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy and positive 2022, as we need energy and positivity together. Then, of course, a good start is always cleaning up your desk, leaving only the things relevant for work on it.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator’s Dilemma

A must-read book by Clayton Christensen, explaining how new technologies can overthrow established big companies within a very short period. The term Disruptive Innovation comes from here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recent ones, where big established brands have disappeared or diminished in a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), the market leader in minicomputers, which did not see the threat of the PC. Later examples are Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging) or, as a double example, NOKIA (from paper to market leader in mobile phones, killed by the smartphone).

The book has always inspired me to stay alert for new technologies, however simple they might look, as simplicity is the answer in the end. I wrote about it in 2012: The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies are now, most of the time, integrated by the major vendors, whose businesses were not really disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote about this book again: The Innovator’s Dilemma and Generation change – image above. At that time, I understood disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

Future ways of working address the new skills needed for the future. You need to become a digital native, as COVID-19 pushed many organizations to do. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains that the human race became so dominant because we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.

The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.

Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive in justifying the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, watch the 2015 TED talk above to grasp the concept and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example in Europe – with no success.

The Model-Based Enterprise is more and more becoming the apparent future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year in my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay at their customers’ lowest common denominator: the 2D drawing. For me, model-based definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty, published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, it illustrates the complexity of PLM. It is about people and culture, and about many different processes, often disconnected, with everyone putting their particular discipline at the center of importance. If you believe PLM is all about the best technology only, read this book and learn how many other aspects are also relevant.

If you want more details, I wrote about the book in 2020: Products2019 – a must-read if you are new to PLM. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the time when mechatronics became important and, next, the time when systems (hardware and software) became important.

We discussed the applicability of the bimodal approach for PLM back in 2015. However, as many enterprises are locked into their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …

More books on my desk have helped me understand the past or shape the future. As this is a blog post, I will not discuss more of them this time, as I am reaching my 1500 words.

Still books worthwhile to read – click on their images to learn more:

I discussed this book twice last year: an introduction in PLM and Modularity, and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned this book in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional PLM. You could see it as an introduction to System Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will further focus on PLM education. So stay tuned and keep on learning.

As promised in my early November post – The road to model-based and connected PLM (part 9 – CM), I come back with more thoughts and ideas related to the future of configuration management. Moving from document-driven ways of working to a data-driven and model-based approach fundamentally changes how you can communicate and work efficiently.

Let’s be clear: configuration management’s target is, first of all, risk management – ensuring your company’s business remains sustainable, efficient, and profitable.

By providing the appropriate change processes and guidance, configuration management avoids costly mistakes and iterations during all phases of a product’s lifecycle and guarantees the quality of the product and its information to ensure safety.

Companies that have not implemented CM practices probably have not observed these issues. Or they have not realized that the root cause of these issues is a lack of CM.

Similar to what is said about PLM in smaller companies, CM is often seen as overhead, as employees believe they thoroughly understand their products. In addition, CM is seen as a hurdle to innovation because of the standardization of practices. So yes, they think it is normal that there are sometimes problems. That’s life.

I already wrote about this topic in 2010 in PLM, CM and ALM – not sexy 😦, where ALM means Asset Lifecycle Management – my focus at that time.

Hear it from the experts

To shape the discussion related to the future of Configuration Management, I had a vivid discussion with three thought leaders in this field: Lisa Fenwick, Martijn Dullaart and Maxime Gravel. A short introduction of the three of them:

Lisa Fenwick, VP Product Development at CMstat, a leading company in Configuration Management and Data Management software solutions and consulting services for aviation, aerospace & defense, marine, and other high-tech industries. She has over 25 years of experience with CM and Deliverables Management, including both government and commercial environments.

Ms. Fenwick has achieved CMPIC SME, CMPIC CM Assessor, and CMII-C certifications. Her experience includes implementing CM software products, CM-related consulting and training, and participation in the SAE and IEEE standards development groups.

Martijn Dullaart is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress. Martijn has his own blog mdux.net, and you might have seen him recently during the PLM Roadmap & PDT Fall conference in November – his thoughts about the CM future can be found on his blog here.

Maxime Gravel, Manager Model-Based Engineering at Moog Inc., a worldwide designer, manufacturer, and integrator of advanced motion control products. Max has been the Director of Model-Based Enterprise at the Institute for Process Excellence (IPX) and Head of Configuration and Change Management at Gulfstream Aerospace, which certified the first aircraft in a 3D model-based environment.

What we discussed:

We had an almost one-hour discussion related to the following points:

  • The need for Enterprise Configuration Management – why and how
  • The needed change from document-driven to model-based – the impact on methodology and tools
  • The “neural network” of data – connecting CM to all other business domains, a view similar to the one from the PLM domain (see the sketch below).
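To make that “neural network” of connected data tangible, here is a toy sketch in plain Python – all item names and relations are hypothetical, purely for illustration – showing how a CM question (change impact) and a PLM question (traceability) become the same kind of traversal over one connected dataset:

```python
# A toy "neural network" of product data: one connected dataset that both
# CM and PLM practices can query. All names and relations are hypothetical.
links = {
    "ECO-12":    ["Part-1001", "Part-1002"],       # change order impacts parts
    "Part-1001": ["Drawing-55", "Test-Report-7"],  # part defined / verified by
    "Part-1002": ["Drawing-56"],
}

def connected(start, graph):
    """Collect everything reachable downstream of a node."""
    seen, stack = set(), [start]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# The CM question: what is impacted by change order ECO-12?
print(connected("ECO-12", links))
# The PLM question: which deliverables trace back to Part-1001?
print(connected("Part-1001", links))
```

The point is not the few lines of code; it is that once the data is connected, CM and PLM stop being separate silos and become different questions asked of the same web of information.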

What I kept from our discussion is the importance of planning – as seen in the CMstat image on the left.

Plan which data you need to manage and how you will manage that data. How often do you do this in your company’s projects?

Next, all participants stressed the importance of education and training on this topic – get educated. Configuration Management is not a topic that is taught at schools. Early next year, I will come back to education, as its benefits are often underestimated. Not everything can be learned by “googling.”

Conclusion

The journey towards a model-based and data-driven future is not a quick one to be realized by new technologies. However, it is interesting to learn that the future of connected data (the “neural network”) allows organizations to implement both CM and PLM in a similar manner, using graph databases and automation. When executed at the enterprise level, the result will be that CM and PLM become natural practices instead of other siloed system-related disciplines.

Most of the methodology is there; the implementation to make it smooth and embedded in organizations will be the topics to learn. Join us in discussing and learning!

 

After all my writing about The road to model-based and connected PLM, a topic that interests me significantly is the positive contribution real PLM can have to sustainability.

To clarify this statement, I have to explain two things:

  • First, for me, real PLM is a strategy that concerns the whole product lifecycle from conception, creation, usage, and decommissioning.

I use real PLM to counter the misconception that PLM is merely an engineering infrastructure or even a system. We discussed this topic in relation to the post 7 easy tips nobody told you about PLM adoption from my Share PLM peers.

  • Second, sustainability should not be equated with climate change, which gets most of the extreme attention.

However, the discussion related to climate change and carbon emissions draws most of the attention. Recently, it even seemed that the COP26 conference was only about reducing carbon emissions.

Unfortunately, reducing carbon emissions has become a political and economic discussion in many countries. As I am not a climate expert, I will follow the conclusions of the latest IPCC report.

However, I am happy to participate in science-based discussions, not in conversations about failing statistics (lies, damned lies and statistics) or the mixture of facts & opinions.

The topic of sustainability is more extensive than climate change. It is about understanding that we live on a limited planet that cannot support the unlimited usage and destruction of its natural resources.

Enough about human beings and emotions; back to the methodology.

Why PLM and Sustainability

In the PLM and Sustainability section of the PLM Green Global Alliance website, we explain the potential of this relationship:

The goals and challenges of Product Lifecycle Management and Sustainability share much in common and should be considered synergistic. Where in theory, PLM is the strategy to manage a product along its whole lifecycle, sustainability is concerned not only with the product’s lifecycle but should also address sustainability of the users, industries, economies, environment and the entire planet in which the products operate.

If you read further, you will bump into the term Systems Thinking. Again, there might be confusion here between Systems Thinking and Systems Engineering. Let’s look at the differences.

Systems Engineering

For Systems Engineering, I use the traditional V-shape to describe the process. Starting from the Needs on the left side, we have a systematic approach to come to a solution definition at the bottom. Then going upwards on the right side, we validate step by step that the solution will answer the needs.

The famous Boeing “diamond” diagram shows the same approach, complementing the V-shape with a virtually mirrored V-shape. In this way, it provides insights in all directions between the virtual world and the physical world. This understanding is essential when you want to implement a virtual twin of one of the processes/solutions.

Still, systems engineering starts from the needs of a group of stakeholders and works towards the best technical and beneficial solution, most of the time measured only in money.
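As a small thought experiment, the V-shape can even be captured as a data structure. Below is a minimal sketch in Python; the stage names are my own illustrative assumptions, not taken from a specific standard. Each definition step going down the left side is paired with the validation step that answers it on the way up the right side:

```python
# The V-shape as data: each left-side definition stage is paired with the
# right-side stage that validates it. Stage names are illustrative only.
V_MODEL = [
    ("Stakeholder needs",   "Validation against the needs"),
    ("System requirements", "System verification"),
    ("Architecture design", "Integration testing"),
    ("Detailed design",     "Component testing"),
]
BOTTOM_OF_V = "Solution definition (implementation)"

# Walking down the V: every definition stage already declares how it will
# be validated when coming back up on the right side.
for depth, (definition, validation) in enumerate(V_MODEL):
    print("  " * depth + f"{definition}  <-->  {validation}")
print("  " * len(V_MODEL) + BOTTOM_OF_V)
```

The exercise shows the essence of the V-model: no definition step exists without a planned validation counterpart – the same symmetry the Boeing diamond mirrors into the virtual world.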

System Thinking

The image below from the Ellen MacArthur Foundation is an example of systems thinking. But, as you can see, it is not only about delivering a product.

Systems Thinking is a more holistic approach to bringing products to the market. It is about how we deliver a product to the market and what happens during its whole lifecycle. The drivers for systems thinking are therefore not only product performance at the most economical price; we also take into account the impact of resource extraction in the world, the environmental impact during the product’s active life (more and more regulated) and, ultimately, how to minimize waste to the ecosystem. This means more recycling or reuse.

If you want to read about systems thinking at a more professional level, read this blog post from the Millennium Alliance for Humanity and the Biosphere (MAHB): Systems Thinking: A beginning conversation.

Product as a Service (PaaS)

To ensure more responsibility for the product lifecycle, one of the European Green Deal aspects is promoting Product as a Service. There is already a trend towards products as a service, and I mentioned Ken Webster’s presentation at the PLM Roadmap & PDT Fall 2021 conference: In the future, you will own nothing, and you will be happy.

Because if we can switch to such an economy, the manufacturer will have complete control over the product’s lifecycle and its environmental impact. The manufacturer will be motivated to deliver product upgrades and create repairable products, instead of dumping old or broken products because selling new ones is cheaper. PaaS brings opportunities for manufacturers, like greater customer loyalty, but also pushes manufacturers to stay away from so-called “greenwashing”. They become fully responsible for the entire lifecycle.

A different type of growth

The concept of Product as a Service is not something that typical manufacturing companies endorse, as it requires them to restructure both their business and their products.

Delivering a Product as a Service requires a fast feedback loop between the products in the field and R&D deciding on improving or adding new features.

In traditional manufacturing companies, the service department is far removed from engineering for historical reasons. However, with the digitization of our product information and connected products, we should be able to connect all stakeholders related to our products, even our customers.

A few years ago, I was working with a company that wanted to increase their service revenue by providing maintenance as a service on their products on-site. The challenge they had was that the total installation delivered at the customer site was done through projects. There was some standard equipment in their solution; however, ultimately, the project organization delivered the final result, and product information was scattered all around the company.

There was some resistance when I proposed creating an enterprise product information backbone (a PLM infrastructure) with aligned processes. It would force people to work upfront in a coordinated manner. Now, with the digitization of operations, this is no longer a point of discussion.

In this context, I will participate on December 7th in an open panel discussion, Creating a Digital Enterprise: What are the Challenges and Where to Start?, as part of the PI DX Spotlight series. I invite you to join this event if you are interested in hearing various digital enterprise viewpoints.

Doing both?

As companies cannot change overnight, the challenge is to define a transformation path. The push for transformation will surely come from governments and investors in the coming decades. Therefore, doing nothing is not a wise strategy.

Early this year, the Boston Consultancy Group published this interesting article: The Next Generation of Climate Innovation, showing different pathways for companies.

A trend they highlighted is that shareholder returns over the past ten years are negative for the traditional Oil & Gas and Construction industries (-18 to -6%), whereas big tech and the first generation of green industries provide high shareholder returns (+30%), and the latest green champions are moving in that direction. In this way, investors will push companies to become greener.

The article talks about the known threat of disrupters coming from outside. Still, it also talks about the decisions companies can make to remain relevant. Either you try to reduce the damage, or you have to innovate. (Click on the image below on the left).

As described before, innovating your business is probably the most challenging part, in particular if you have many years of history in your industry. Processes and people are ingrained in an almost optimal manner (for now).

An example of reducing the damage could be what is happening in the steel industry. As making steel requires a lot of (cheap) energy, this industry is powered by burning coal. An innovation to reduce the environmental impact would therefore be to redesign the process with green energy, as described in this Swedish example: The first fossil-free production of steel.

On December 9th, I will discuss both strategies with Henrik Hulgaard from Configit. We will discuss how Product Lifecycle Management and Configuration Lifecycle Management can play a role in the future. Feel free to subscribe to this session and share your questions. Click on the image to see the details.

Note:  you might remember Henrik from my earlier post this year in January: PLM and Product Configuration Management (CLM)

Conclusion

Sustainability is a topic that will be more and more relevant for all of us, locally and globally. Real PLM, covering the whole product lifecycle, preferably data-driven, allows companies to transform their current business into a future sustainable business. Systems Thinking is the overarching methodology we have to learn – let’s discuss!

This week I attended the PLM Roadmap & PDT Fall 2021 with great expectations, based on my enthusiasm last year. Unfortunately, the excitement was less this time, and I will explain why in my conclusions. It was, unfortunately, again a virtual event, which makes it hard to be interactive – something I realize I miss a lot.

Over two hundred attendees connected for the two days, and you can find the agenda here. Typically, I would discuss the relevant sessions; this time, I want to group some of them by theme, as these sessions contained complementary information.

Disruption

Again, like in the spring, the theme focused on DISRUPTION. The word disruption can give you an uncomfortable feeling when you are not in power. It is more fun to disrupt than to be disrupted, as I mentioned in my spring presentation. Read The week after PLM Roadmap & PDT Spring 2021.

In his keynote speech Peter Bilello (CIMdata) kicked off with: The Critical Dozen: 12 familiar, evolving trends and enablers of digital transformation that you cannot or should not live without.

You can see them on the slide below:

I believe many of them are familiar to you, as these themes have been "in the air" for quite some time. Vendors pick them up first, and slowly companies start to investigate them when relevant. You will find many of them in my recent series: The road to model-based and connected PLM, where I explored the topics that would cross your path on that journey.

As Peter said: "For most of the topics, you cannot pick and choose, as they are all connected."

Another interesting observation was that we are moving more and more away from the concept of related structures (the digital thread) towards connected datasets (the digital web). Marc Halpern first introduced this topic at the 2020 conference, and it has become an excellent image to frame what we should imagine in a connected world.

The digital web also has to do with the rise of the graph database, mentioned by Peter Bilello as a potentially disruptive technology during the fireside chat. Relational databases can be seen as rigid, associated with PLM structures. Graph databases, on the other hand, can be associated with flexible relations between different types of data – the image of the digital web.
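To make the difference tangible, here is a minimal sketch of the digital web idea – my own illustration, not from the conference, with invented identifiers. Every artifact becomes a node, every relation a typed edge, and traversal replaces a fixed structure:

```python
# A minimal sketch (hypothetical identifiers) of the "digital web":
# heterogeneous artifacts are nodes, relations are typed edges.
import networkx as nx

web = nx.MultiDiGraph()

# Connect datasets from different domains with typed relations
web.add_edge("REQ-001", "FUNC-010", relation="satisfied_by")    # requirement -> function
web.add_edge("FUNC-010", "PART-123", relation="realized_by")    # function -> hardware part
web.add_edge("FUNC-010", "SW-2.4.1", relation="implemented_by") # function -> software release
web.add_edge("PART-123", "TEST-77", relation="verified_by")     # part -> test result

# Traversal replaces a fixed structure: everything reachable from a requirement
impacted = nx.descendants(web, "REQ-001")
print(impacted)  # e.g. {'FUNC-010', 'PART-123', 'SW-2.4.1', 'TEST-77'}

# Adding a new kind of relation later needs no schema migration,
# which is exactly where rigid relational/BOM-style models struggle
web.add_edge("PART-123", "SUPPLIER-9", relation="sourced_from")
```

The point is not the library but the data model: new relation types can be added at any time, whereas a relational schema would need redesign.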

Where Peter was mainly telling WHAT was happening, two presentations caught my attention because of the HOW.

First of all, Dr. Rodney Ewing's (Cummins) session A Balanced Strategy to Reap Continuous Business Value from Digital PLM was a great story of a transformational project. It kept continuous delivery of business value in mind while moving to the connected enterprise.

As Rodney mentioned, the contribution of TCS was crucial here, which I can imagine. It is hard for a company on its own to understand what is happening in the outside (PLM) world and apply it internally. Their transformation roadmap is an excellent example of keeping the long-term vision in mind while delivering value during the transformation.

Talking about the right partner and synergy, the second presentation I liked in this context of disruption was Ian Quest’s presentation (Quick Release): Open-source Disruption in Support of Audacious Goals. As a sponsor of the conference, they had ten minutes to pitch their area of expertise.

After Ian's presentation, focused on audacious goals (for non-English natives: "brave" goals), there was only one word that stuck in my mind: pragmatic.

Instead of discussions about the complexity, Ian gave examples of where a pragmatic data-centric approach could lead to great benefits, as you can see from one of the illustrated benefits below:

Standards

A characteristic topic of this conference is standards. Torbjörn Holm (Eurostep) gave an excellent overview of where standards have led to significant benefits. For example, the containerization of goods has dramatically improved the transportation of goods (we all benefit) while killing proprietary means of transport (specific trains, types of ships, types of unloading). See the image below:

Torbjörn rightfully expanded this story to the current situation in the construction industry and the challenges for asset operators. Unfortunately, in these practices, many content suppliers remain focused on their unique capabilities, reluctant to address the demand for interoperability across the whole value chain.

It is a topic Marc Halpern also mentioned last year as an outcome of the Gartner PLM benefits survey. Gartner's finding:

Time to Market is not improved so much by using PLM, as the inefficient interaction with suppliers remains the impediment.

Like transport before containerization, the exchange of information is neither standardized nor designed for digital exchange. Torbjörn believes that more and more companies will insist on exchange standards – like CFIHOS, an ISO 15926-derived exchange standard in the process industry. It is a user-driven standard, the best kind of standard.

In this context, the presentation from Kenny Swope (Boeing) and Jean Yves Delaunay (Airbus) The Business Value of Standards-based Information Interoperability for Aerospace & Defense illustrated this fact.

Although working for competitors, the aerospace industry understands the criticality of standards for becoming more efficient and less vendor-dependent. These themes are discussed in the Aerospace & Defense PLM Action Group, and last year's 2020 Fall sessions showed the results. You can read their publications here.

The A&D PLM action group uses the following framework when evaluating standards – as you can see on the image below:

The result – a combined exercise of many participating experts from the field – is their recommendation:

To conclude:
People often complain about standards – a framing promoted by vendors of proprietary data formats – that they lead to a rigid environment, blocking agility.

In reality, standards allow companies to be more agile, as the (proprietary) data flow is less of an issue. Remember the containerization example.

Sustainability and System Thinking

This conference has always been known for its attention to the circular economy and green thinking. In the past, these topics might have been considered disconnected from our PLM practices; now, they have become a part of everyone’s mission.

Two presentations stood out on this topic for me. First, Ken Webster's keynote speech In the future, you will own nothing and you will be happy was a significant overview of how we as consumers are currently disconnected from the circular economy. His plea, as shown below, for making manufacturers responsible for the legal ownership of the materials in the products they deliver would impact consumer behavior.

Product as a Service (PaaS) and new ways to provide a service are becoming essential. For example, buildings can serve as power stations, as they are a place to collect solar or wind energy.

His thoughts are aligned with what is happening in Europe related to the European Green Deal (not in his presentation). There is a push for a PaaS model for all products as this would be an excellent stimulant for the circular economy.  PaaS combined with a Digital Product Passport – more on that next year.

Making upgrades to your products has less impact on the environment than creating new products to sell (and creating waste from the old product). Ken Webster made an interesting statement about changing the economy: do we want to own products, or do we want to benefit from the product and leave the legal ownership to the manufacturer?

A topic I discussed at the PLM Roadmap & PDT Conference Spring 2021 – look here at slide 11.

Patrick Hillberg's presentation Rising to the challenge of engineering and optimizing . . . what? was the one closest to my heart. We have discussed Sustainability and Systems Thinking with Patrick in our PLM Global Green Alliance and are pretty aligned on this topic. Patrick started by explaining the difference between Systems Engineering and Systems Thinking. The go-to-market of a product in an organization is about more than the traditional V-model. Economic pressure and culture will push people to deviate from the ideal technological plan due to other priorities.

Expanding on this observation, Patrick stated that there are limits to growth, a topic discussed by many people involved in the sustainable economy. Infinite economic growth is impossible on a finite planet, and we have to take more dimensions into account. Patrick gave some examples of that, including issues related to the infamous Boeing 737 MAX.

For Patrick, the COVID pandemic marks the end of the old Second Industrial Revolution and a push for a new Fourth Industrial Revolution, which is not only technical, as the slide below indicates.

Like Patrick, I believe we are at a decisive moment to disrupt ourselves and reconsider many things we do and are used to doing. Even for PLM practitioners, this is a new path to take.

Data

There were two presentations related to digitization and the shift from document-based to a data-driven approach.

First, there was Greg Weaver (Gulfstream) with his presentation Indexing Content – Finding Your Needle in the Haystack. Greg explained that by indexing existing document-based information, combined with a specific dashboard, they could provide fast access to information that would otherwise have remained hidden in numerous document or even paper archives.

It was a pragmatic solution, making me feel nostalgic seeing the SmarTeam profile cards. It was an excellent example of moving to a digital enterprise, and Gulfstream has always been a front runner on this topic.

Warning: don't try this by default at home (in your company). The data in a regulated industry like aerospace is expected to be of high quality thanks to the configuration management processes in place. If your company does not have a strong CM practice, the retrieved data might be inaccurate.

Martijn Dullaart (ASML)’s presentation The Next disruption, please…..  was the next step into the future. With his statement “No CM = No Trust,” he made an essential point for data-driven environments.

There is a need for Configuration Management, and I touched on this topic in my last post: The road to model-based and connected PLM (part 9 – CM).

Martijn's presentation can also be found on his blog here, and I encourage you to read it (saving me copy & paste text). It was interesting to see that Martijn has improved his CM pyramid; as you can see, it is now more discipline- and activity-oriented instead of a system view. With Martijn and others, I will elaborate on this topic soon.

Conclusion

This has been an extremely long post; thanks for reading until the end. Many interesting topics were presented at the conference. I was less excited this time because many of these topics are triggers for a discussion. Innovation comes from meeting people with different backgrounds, and at a live conference you would meet during the breaks or the famous dinner. How can we ensure we follow up on all this interesting information?

Your thoughts? Contact me for a Corona Friday discussion.

When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.

From coordinated to connected sounds like a business change; however, it all depends on technology. And here I have to thank Marc Halpern (Gartner's Research VP, Engineering and Design Technologies) again, who came up with this brilliant scheme below:

So now it is time to address the last point from my starting post:

Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics in software changes.

Configuration management at this moment

PLM and CM are often considered overlapping. My March 2019 post PLM and Configuration Management – a happy marriage? shares some thoughts related to this point.

Does having PLM or PDM installed mean you have implemented CM? There is confusion here because revision management is often considered the same as configuration management. Read my March 2020 post What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, graph-database-based, flexible PLM solution.

To hear it from the CM side, I discussed it with Martijn Dullaart in my February 2021 post: PLM and Configuration Management. In that post, we also zoomed in on CM2 as a methodology.

Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress.

As mentioned before in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference starting this upcoming week.

In this post, I want to talk about the CM future. For understanding the current situation, you can find a broad explanation here on Wikipedia. Have a look at CM in the context of the product lifecycle, ensuring that the product As-Specified and As-Designed information matches the As-Built and As-Operated product information.

A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages. CM originated in the Aerospace and Defense industry for that reason. However, companies in other industries might have implemented CM practices too, either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.

Historically, configuration management addressed the needs of "slow-moving" products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring the consistency of all referenced datasets was often a manual process.

On purpose, I wrote "referenced datasets," as the information was most of the time not connected in a single environment. The identifier of a dataset (an item or a document) was the primary information carrier used for mentally connecting other artifacts to keep consistency.

The Institute for Process Excellence (IPX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.

As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.

As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.

Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat's EPOCH CM tool is an example of such software. On their website, you can also find excellent articles explaining the history of CM and their thoughts about its future.

The future will undoubtedly be a connected, model-based, software-driven environment. Naturally, therefore, configuration management processes will have to change. (An impressive buzzword sentence – still, I hope you get the message.)

From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.

Configuration Management – the future

The transition to a data-driven and model-based infrastructure has raised the following questions:

  • How do we deal with the granularity of data? Each dataset needs to be validated. For example, in the document-based approach, a document (a collection of datasets) needs to be validated. How to do this efficiently?
  • The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now we need a mix of hardware and software CM practices.
  • Due to the increased complexity of products (or systems) and the rapid changes driven by software versions, how do we guarantee the As-Operated product still matches the As-Designed / As-Certified definitions?

I don’t have answers to these questions. I only share observations and trends I see in my actual world.

Granularity of data

The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.

The image on the left, borrowed from Erik Herzog's presentation at the PLM Roadmap & PDT Fall 2020 conference, is a good illustration of the challenge.

At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.

You can find the agenda and Erik’s presentation here on day 2.

OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of the technical infrastructure.
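To make the linked-data idea behind OSLC slightly more concrete, here is a hedged sketch. The URIs and the vocabulary below are invented for illustration; real OSLC defines its own vocabularies and REST services. The core concept is that artifacts stay in their own tools and are connected by links (RDF triples):

```python
# A sketch of a cross-tool traceability link as linked data.
# All URIs and the "validatedBy" predicate are hypothetical.
from rdflib import Graph, Namespace, URIRef

EX = Namespace("https://example.com/vocab/")
g = Graph()

req = URIRef("https://rm-tool.example.com/requirements/REQ-001")    # lives in the RM tool
test = URIRef("https://qm-tool.example.com/testcases/TC-042")       # lives in the QM tool

g.add((req, EX.validatedBy, test))  # the link itself is the only shared artifact
print(g.serialize(format="turtle"))
```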

Still, it is about people and processes first. Therefore, I am curious to learn from readers who believe in and experiment with such a federated infrastructure.

More software

Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this theory, you treat software code as a part, with a part number and revision, and you might believe configuration management practices do not have to change. However, there are some fundamental differences explaining why we should decouple hardware and software.

First, for the same hardware solution, there might be a whole collection of valid software versions – just like your computer. How many valid software versions, even of the same application, can you run on that hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.

A computer, of course, is designed for running all kinds of software versions. However, modern products in the field, like cars, machines and electrical devices, will all have a similar type of software-driven flexibility.

For that reason, I believe that companies delivering software-driven products should design a mechanism to check whether a combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it might be crucial to ensure invalid combinations cannot exist. Click on the image to learn more.
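A minimal sketch of such a validity check – all part numbers, revisions and the baseline table below are hypothetical – could look like this:

```python
# Before accepting an As-Operated state, verify the hardware/software
# combination against released baselines (all identifiers invented).
ALLOWED_BASELINES = {
    # hardware revision -> set of software versions certified for it
    "CTRL-UNIT-A rev B": {"fw 3.2", "fw 3.3"},
    "CTRL-UNIT-A rev C": {"fw 3.3", "fw 4.0"},
}

def is_valid_configuration(hardware: str, software: str) -> bool:
    """Return True only if this hardware/software pair was released together."""
    return software in ALLOWED_BASELINES.get(hardware, set())

assert is_valid_configuration("CTRL-UNIT-A rev C", "fw 4.0")
assert not is_valid_configuration("CTRL-UNIT-A rev B", "fw 4.0")  # invalid mix
```

In practice, the baseline table would of course be generated from the released configuration data, not maintained by hand; that is exactly the territory of configuration lifecycle management tools.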

Solutions like Configit or pure::variants might lead the way here. In February 2021, I discussed the unique features of their solution with Henrik Hulgaard, the CTO of Configit, in PLM and Configuration Lifecycle Management.

I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.

Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.

Increased complexity – the digital twin?

With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.

Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.

There would be no need for on-site inspection or test-and-fix upgrades in the physical world. This is needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on demand), we can validate it virtually.

I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.

An eye-opener and an argument for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM with working in a model-based environment. See you there?

Conclusion

Finally, we have reached the methodology part of this series, particularly the part related to configuration management and traceability in a very granular, digital environment.

After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat).  What would you ask them?

In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.

My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. Also this time, Matthias Ahrens (HELLA) again shared a relevant but very academic article in this context – how to harmonize company information.

For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.

The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).

This time I want to share my thoughts about the two statements from my introductory post, i.e.:

A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.

A model-based approach with connected datasets

We discussed connected datasets in the previous post; now let's explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model – to be more precise, the 3D CAD model. However, there are many other types of models related to product development, delivery and operations.

A model can be a:

Physical Model

  • A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model

Conceptual Model

  • A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
  • A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall into this category
  • A CGI (Computer Generated Imagery) or 3D CAD model is probably the model most associated with PLM in the minds of traditional practitioners
  • Functional and Logical Models describing the services and components of a system are crucial in an MBSE approach

Operational Model

  • A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin's training performance model is such an operational model.

The list of models above is neither exhaustive nor academically defined. Moreover, some model definitions might overlap; for example, where would we classify software models or manufacturing models?

All models are a best-so-far approach to describing reality. Based on more accurate data from observations or measurements, the model comes closer to what happens in reality.

A model and its data

Never blame the model when there is a difference between what the model predicts and the observed reality – it is still a model. That's why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.

Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.

By using data analysis complementary to the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).

There are two points I would like to highlight in this paragraph (see the sketch after this list):

  • A model is never 100 % the same as reality – so don't worry about deviations. There will always be a difference between the virtual prediction and the physical measurement, most of the time because reality has many more influencing parameters.
  • The more qualified data we use in the model, the closer it gets to reality – so focus on accurate (and the right) data for your model. As it is usually impossible to model a system fully, focus on the most significant data sources.
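As a toy illustration of such a feedback loop – the cubic model and all numbers below are my own assumptions, not from a real product – consider re-calibrating a single model parameter with field measurements:

```python
# A toy feedback loop: a simple model (power = k * speed**3, a fan/pump-style
# assumption) is re-calibrated with measurements coming back from the field.
def predict(k: float, speed: float) -> float:
    return k * speed ** 3

# (speed, measured power) pairs from the physical product - invented data
observations = [(1.0, 1.1), (2.0, 8.4), (3.0, 28.1)]

k = 1.0  # initial model parameter from the design phase
for _ in range(200):  # crude gradient descent on the squared prediction error
    grad = sum(2 * (predict(k, s) - p) * s ** 3 for s, p in observations)
    k -= 1e-4 * grad

print(round(k, 3))  # k has moved towards the value the field data supports (~1.04)
```

The mechanism is trivial here, but it is the same loop a digital twin relies on: observe, compare with the prediction, and adjust the model.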

The ultimate goal: THE DIGITAL TWIN

The discussion related to data-driven approaches and the usage of models might feel abstract and complex (and that's the case). However, the term "digital twin" is well known and even used in boardrooms.

The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.

My statement and reason for this series of blog posts: digital twins do not run on documents; you need a data-driven, model-based infrastructure to benefit efficiently from digital twin concepts.

Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model. It is a puzzle every company has to solve, as there is no 100 percent blueprint at this time.

Are Low Code platforms the answer?

I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that, by combining datasets from different systems and platforms, we can provide any user with the needed information in real time. My statement from my introductory post was:

I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?

Let’s look into some of the low-code platform messages mentioned by Low-Code advocates:

You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)

This statement reminded me of the early days of SmarTeam implementations. With a Data Model Wizard, a Form Designer and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management and CAD integrations, combined with easy customizations.

The sky was the limit to satisfy end users. There was no need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.

A lot of my activities between 2003 and 2010 were related to fixing the problems caused by this flexibility – making sense (again) of customizations. I wrote about this in a 2015 post, The importance of a (PLM) data model, sharing the experiences of "fixing" issues created by flexibility.

Think first

The challenge is that an enthusiastic team creates a (low-code) solution rapidly, and immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job, right?

Documentation and broader visibility are often lacking when implementing such a solution.

For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information consumed is accurate. Perhaps the information was valid when the app was created.

However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
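One pragmatic answer is to validate the data contract every time you consume the data, instead of assuming it still holds. A minimal sketch – the field names are hypothetical:

```python
# Guard a consuming app against silently degrading input data:
# validate the expected contract instead of trusting it.
REQUIRED_FIELDS = {"part_number": str, "revision": str, "weight_kg": float}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is usable."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}: {type(record[field]).__name__}")
    return problems

print(validate_record({"part_number": "P-100", "revision": "B", "weight_kg": "2.5"}))
# ['wrong type for weight_kg: str'] - the kind of drift a new data owner introduces
```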

Easy tools have often led to spaghetti, from Clipper (the old days) and Visual Basic (the less old days) to highly customizable systems (like Aras is promoting) and future low-code platforms (and Aras is there again).

However, the strength of being highly flexible is also a weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee a reliable anchoring of the solution.

The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:

There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the "shadow IT" phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don't work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.

The fundamental difference: from coordinated to connected

For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.

Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead to the design of a modern digital enterprise – not the optimization of, and connectivity between, organizational silos.
Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.

Next, you need to analyze if these gaps are so significant that it requires a technology change. Probably it does, as historically, systems were not designed to share data horizontally in an organization.

In this context, have a look at Lionel Grealou's article for Engineering.com:
Data Readiness in the new age of digital collaboration.

Conclusion

We discussed the crucial relation between models and data. Models only have value if they acquire the right, accurate data (exercise 1).

Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple in these transformational times.

The next and final post in this series will focus on configuration management – a new approach is needed. I don't have the answers, but I will share some thoughts.

A recommended event with an exciting agenda – a good place to validate and share your thoughts.

I will be there and look forward to meeting you at this conference (unfortunately still virtual).

This week I attended the SCAF conference in Jönköping. SCAF is the Swedish CATIA User Group. First of all, I was happy to be there, as it was a "physical" conference, giving me the opportunity to discuss topics with the attendees outside the presentation time slots.

That is crucial for me, as I have no technical message. Instead, I am trying to make sense of the future through dialogues. What is sure is that the future will be based on new digital concepts, completely different from the traditional approach we currently practice.

My presentation, which you can find here on SlideShare, was again zooming in on the difference between a coordinated approach (current) and a connected approach (the future).

The presentation explains the concept of datasets, which I discussed in my previous blog post. This time, I focused on how this concept can be discovered in the Dassault Systemes 3DExperience platform, combined with the must-go path for all companies towards more systems thinking and sustainable products.

It was interesting to learn that the concept of connected datasets, like the spider's web in the image, reflected the future for many of the attendees.

One of the demos during the conference illustrated that it is no longer about managing the product lifecycle through structures (EBOM/MBOM/SBOM).

Instead, it is based on a collection of connected datasets – the path in the spider's web.

It was interesting to talk with the companies present about their roadmaps. How to become a digital enterprise is strongly influenced by their legacy culture and ways of working. Where to start to become connected is the main challenge for all.

A final positive remark: SCAF has renamed itself SCAF (3DX), showing that even CATIA practices can no longer be considered a niche – the future of business is to be connected.

Now back to the thread I am following in the series The road to model-based. Perhaps I should change the title to "The road to connected datasets, using models". The statement to discuss this week is:

Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world

Reliable data

If you have been following my story related to the PLM transition from a coordinated to a connected infrastructure, you might have seen the image below:

The challenge of a connected enterprise is that you want to connect different datasets, defined in various platforms, to support any type of context. We call this a digital thread or, perhaps even better framed, a digital web.

This is new for most organizations because each discipline has mostly been working in its own silo, producing human-readable information in neutral files – PDF drawings and documents. In cases where a discipline needs to deliver datasets, as in a PDM-ERP integration, we see IT energy levels rising, as integrations are an IT thing, right?

Too much focus on IT

In particular, SAP has always played the IT card (and is still playing it through their Siemens partnership). Historically, SAP claimed that all parts/items should be in their system, so there was no need for a PDM interface – neglecting that the interface moment had now shifted to the designer in CAD. And by using the name Material for what the engineering world considers a Part, they illustrated their lack of understanding of the actual engineering world.

There is more to "blame" SAP for when it comes to the PLM domain, or you can state that PLM vendors did not yet understand what enterprise data means. Historically, ERP systems were the first enterprise systems introduced in a company; they have been leading in a transactional "digital" world. The world of product development has never been a transactional process.

SAP introduced the Master Data Management for their customers to manage data in heterogeneous environments. As you can imagine, the focus of SAP MDM was more on the transactional side of the product (also PIM) than on the engineering characteristics of a product.

I have no problem with each vendor wanting to see their solution as the center of the world; this is expected behavior. However, when it comes to a single-system approach, there is a considerable danger of vendor lock-in and a lack of freedom to optimize your business.

In a modern digital enterprise (to be), the business processes and value streams should be driving the requirements for which systems to use. I was tempted to write “not the IT capabilities”; however, that would be a mistake. We need systems or platforms that are open and able to connect to other systems or platforms. The technology should be there, and more and more, we realize the future is based on connectivity between cloud solutions.

In one of my first posts (part 2), I referred to five potential platforms for a connected enterprise.  Each platform will have its own data model based on its legacy design, allowing it to service its core users in an optimized environment.

When it comes to interactions between two or more platforms, for example, between PLM and ERP, between PLM and IoT, but also between IoT and ERP or IoT and CRM, these interactions should first be based on identified business processes and value streams.

The need for Master Data Management

Defining horizontal business processes and value streams independent of the existing IT systems is the biggest challenge in many enterprises. Historically, we have been thinking in terms of a coordinated way of working, meaning people shifting pieces of information between systems – either as files or through interfaces.

In the digital enterprise, the flow should be leading based on the stakeholders involved. Once people agree on the ideal flow, the implementation process can start.

Which systems are involved, and where do we need a connection between them? Is the relationship bidirectional, or is it a push?

The interfaces in a digital enterprise need to be data-driven; we do not want human interference here, slowing down or modifying the flow. This is the moment Master Data Management and Data Governance come in.

When exchanging data, we need to trust the data in its context, and we should be able to use the data in another context. But, unfortunately, trust is hard to gain.

I can share an example of trust from implementing a PDM system linked to a Microsoft-friendly ERP system. Both systems were able to use Excel as an interface medium – the Excel columns took care of the data mapping between the two systems.

In the first year, engineers produced the Excel with BOM information, and manufacturing engineering imported the Excel into their ERP system. After a year, the manufacturing engineers proposed to upload the Excel automatically, as they had discovered the exchange process no longer needed their attention – they had learned to trust the data.

How often have you seen similar cases in your company where we insist on a readable exchange format?

When you trust the process(es), you can trust the data. In a digital enterprise, you must assume that specific datasets are used or consumed in different systems. Therefore, a single data mapping, as in the Excel example, won't be sufficient.
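For illustration, the Excel-style mapping from the example could look like the sketch below (the column names are invented). The limitation is visible immediately: each additional consumer needs its own mapping, which is exactly where master data management comes in.

```python
# A sketch of the column-based PDM-to-ERP mapping described above
# (hypothetical column names, using pandas).
import pandas as pd

# BOM export as it comes from the PDM system
bom = pd.DataFrame({
    "Part Number": ["P-100", "P-200"],
    "Rev": ["A", "B"],
    "Qty": [4, 1],
})

# One mapping per consuming system; a second consumer needs its own mapping
PDM_TO_ERP = {"Part Number": "Material", "Rev": "Revision Level", "Qty": "Quantity"}

erp_import = bom.rename(columns=PDM_TO_ERP)
print(erp_import)
# In the real exchange this would be written out, e.g. erp_import.to_excel(...)
```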

Master Data Management and standards?

Some traditional standards, like ISO 15926 or ISO 10303, have been designed to exchange process and engineering data – they are domain-specific. Therefore, they could simplify your master data management approach if your digitalization efforts are in that domain.

To connect other types of data, it is hard to find a global standard that encompasses different kinds of data and consumers. Think about the GS1 standard, which focuses more on the consumer side of data management. When PLM meets PIM, this standard and Master Data Management will be relevant.

Therefore I want to point to these two articles in this context:

How enterprise architects need to evolve to survive in a digital world, focusing on the transition from a coordinated enterprise towards a connected enterprise from the IT point of view, and a recent LinkedIn post, Web Ontology Language as a common standard language for Engineering Networks? by Matthias Ahrens, exploring the concepts I have been discussing in this post.

To me, it seems that standards are helpful when working in a coordinated environment. In a connected environment, however, we have to rely on master data management and data governance processes, potentially based on a clever IT infrastructure using graph databases to connect anything meaningful, and possibly artificial intelligence to provide quality monitoring.

Conclusion

Standards have great value in exchange processes, which happen in a coordinated business environment. To benefit from a connected business environment, we need an open and flexible IT infrastructure supported by algorithms (AI) to guarantee quality. Before installing the IT infrastructure, we should first have defined the value streams it should support.

What are your experiences with this transition?
