In my previous posts dedicated to PLM education, I shared my PLM bookshelf, spoke with Peter Bilello from CIMdata about their education program and talked with Helena Gutierrez from SharePLM about their education mission.
In that last post, I promised that this post would be dedicated to PLM education before s**t hits the fan. This statement came from my conversation with John Stark when we discussed where proper PLM education should start (before it hits the fan).
John is a well-known author of many books. You might have read my post about his book: Products2019: A project to map and blueprint the flow and management of products across the product lifecycle: Ideation; Definition; Realisation; Support of Use; Retirement and Recycling. A book with a very long title reflecting the complexity of a PLM environment.
John is also a long-time PLM consultant, known in the early PLM community for his 2PLM e-zine. The 2PLM e-zine was a newsletter he published between 1998 and 2017, in the days before blogging and social media, updating everyone in the PLM community with the latest news. If you are my age, you were probably subscribed to it.
So, let’s learn something more from John Stark.
John Stark
John, first of all, thanks for this conversation. We have known each other for a long time. First of all, can you briefly introduce yourself and explain where your passion for PLM comes from?
The starting point for my PLM journey was that I was involved in developing a CAD system. But by the 1990s, I had moved on to being a consultant. I worked with companies in different industry sectors, with very different products.
I worked on application and business process issues at different product lifecycle stages – Ideation; Definition; Realization; Support of Use; Retirement and Recycling.
However, there was no name for the field I was working in at that time. So, I decided to call it Product Lifecycle Management and came up with the following definition:
‘PLM is the business activity of managing, in the most effective way, a company’s products all the way across their lifecycles; from the very first idea for a product, all the way through until it is retired and disposed of.
PLM is the management system for a company’s products. It doesn’t just manage one of its products. It manages all of its parts and products and the product portfolio in an integrated way.’
I put that definition at the beginning of a book, ‘Product Lifecycle Management: Paradigm for 21st Century Product Realization’, published in 2004, which has since become the most cited book about PLM. I included my view of the five phases of the product lifecycle
and created the PLM Grid to show how everything (products, applications, product data, processes, people, etc.) fits together in PLM.
From about 2012, I started giving a blended course, The Basics of PLM, with the PLM Institute in Geneva.
As for the passion, I see PLM as important for Mankind. The planet’s 7 billion inhabitants all rely on products of various types, and the great majority would benefit from faster, easier access to better products. So PLM is a win-win for us all.
That’s interesting. I also had a nice definition picture I used in my early days.

PI London 2011
and I had my view of the (disconnected) lifecycle.

PI Apparel London 2014
The education journey
John, as you have been active in PLM education for more than twenty years, do you feel that PLM Education and Training has changed?
PLM has only existed for about twenty years. Initially, it was so new that there was just one approach to PLM education and training, but that’s changed a lot.
Now there are specific programs for each of the different types of people interested or involved with PLM. So, for example, now there are specific courses for students, PLM application vendor personnel, PLM Managers, PLM users, PLM system integrators, and so on. Each of these groups has a different need for knowledge and skills, so they need different courses.
Another big change has been in the technologies used to support PLM Education and Training. Twenty years ago, the course was usually a deck of PowerPoint slides and an overhead projector. The students were in the same room as the instructor.
These days, courses are often online and use various educational apps to help course participants learn.
Who should be educated?
Having read several of your books, I find them very structured and academic. Therefore, they will never be read by people at the C-level of an organization. Who are you targeting with your books, and why?
Initially, I wasn’t targeting anybody. I was just making my knowledge available. But as time went by, I found that my books were mainly used in further education and ongoing education courses.
So now, I focus on a readership of students in such organizations. For example, I’ve adapted some books to have 15 chapters to fit within a 15-week course.
Students make up a good readership because they want to learn to pass their exams. In addition, it’s a worldwide market: the books are used in courses in more than twenty countries. Also, these courses are sufficiently long, maybe 150 hours, for the students to learn in-depth about PLM. That’s not possible with the type of very short PLM training courses that many companies provide for their employees.
PLM education
Looking at publicly available PLM education, what do you think we can do better to get PLM out of the framing of an engineering solution and become a point of discussion at the C-level?
Even today, PLM is discussed at C-level in some companies. But in general, the answer is to provide more education about PLM. Unfortunately, that will take time, as PLM remains very low profile for most people.
For example, I’m not aware of a university with a Chair of Product Lifecycle Management. But then, PLM is only 20 years old; that’s very young.
It often takes two generations for new approaches and technologies to become widely accepted in the industry.
So another possibility would be for leading vendors of PLM applications to make the courses they offer about PLM available to a wider audience.
A career with PLM?
Educating students is a must, and like you, many institutions offer specialized PLM courses. However, I also noticed that a PLM expert at C-level in an organization is an exception; most of the time, people with a financial background get promoted. So, is PLM bad for your career?
No, people can have a good career in PLM, especially if they keep learning. There are many good master’s courses if they want to learn more outside the PLM area. I’ve seen people with a PLM background become a CIO or a CEO of a company with thousands of employees. And others who start their own companies, for example, PLM consulting or PLM training. And others become PLM Coaches.
PLM and Digital Transformation
A question I ask in every discussion. What is the impact of digital transformation on your area of expertise? In this case, how do you see PLM Education and Training looking in 2042, twenty years in the future?
I don’t see digital transformation really changing the concept of PLM over the next twenty years. In 2042, PLM will still be the business activity of managing a company’s products all the way across their lifecycles.
So, PLM isn’t going to disappear because of digital transformation.
On the other hand, the technologies and techniques of PLM Education and Training are likely to change – just as they have over the last twenty years. And I would expect to see some Chairs of Product Lifecycle Management in universities, with more students learning about PLM. And better PLM training courses available in companies.
I see digital transformation making it possible to have an entire connected lifecycle without a lot of overhead.
Want to learn more?
My default closing question is always about giving the readers pointers to more relevant information. Maybe overkill, looking at your oeuvre as a writer. Still, the question is: where can readers of this blog learn more?
Three suggestions:
- Roger Tempest’s PLMIG
- The IFIP International Conference on Product Lifecycle Management
- Business Value of PLM Curriculum Webinar
What I learned
Talking with John and hearing his opinion, I see an academic approach that aims to define PLM more scientifically, creating a space for the PLM professional.
In the past (2017), we had some blog/LinkedIn interaction related to the question: Should PLM become a Profession?
When I search on LinkedIn, I find 87,000 people with the “PLM Consultant” tag. From those I know in my direct network, I am aware of the great variety of skills these PLM consultants have. Therefore, I believe it is too late to establish a PLM Professional role definition.
John’s focus is on providing students and others interested in PLM with a broad fundamental knowledge before they get into business. In their day-to-day jobs, these people will benefit from knowing the bigger context and understanding the complexity of PLM.
This is also illustrated in Products2019, where the focus is on the experience – company culture and politics.
Due to the diversity of PLM, we will never be able to define the PLM Professional job role as precisely as that of the Configuration Manager. Still, both disciplines are crucial and needed for a sustainable, profitable enterprise.
Conclusion
In this post, we explored a third dimension of PLM Education, focusing on a foundational approach, targeting in particular students to get educated on all aspects of PLM. John is not the only publisher of educational books. I have several others in my network who have described PLM in their own wording and often in their own language. Unfortunately, there is no central point of reference, and I believe we are too late for that due to the tremendous variety in PLM.
Next week I will talk with a Learning & Development leader from a company providing PLM consultancy – let’s learn how they enable their employees to support their customers.
In my previous posts dedicated to PLM education, I shared my PLM bookshelf and spoke with Peter Bilello from CIMdata about their education program. This time I am talking with Helena Gutierrez, one of the founders of Share PLM.
They are a young and energetic company with a mission to make PLM implementations successful, not through technology or customization, but through education and training.
Let’s discover their mission.
Share PLM
Helena, let me start with the brilliant name you have chosen for the company: Share PLM. Sharing (information) is the fundamental concept of PLM; if you don’t aim to share from the start, you won’t be able to fix it later. Can you tell us more about Share PLM’s mission and where you fit in the PLM ecosystem?
Jos, first of all, thank you for the invitation to your blog! That’s a great question. In my previous job, as a young PLM director at the former Outotec, nowadays Metso Outotec, I realized how much I could learn from sharing experiences with other professionals.
I thought that by bringing people together from different companies with different backgrounds, PLM professionals could learn and get prepared for some of their projects.
In the beginning, I envisioned some kind of a marketplace, where people could also sell their own resources. A resource I often missed was some kind of POC template for a new deployment, these kinds of things.
I still remember the face of my boss at that time, Sami Grönstrand, when I told him that I wanted to sell templates. [laugh]
A lot has happened since then and we have evolved into a small niche where we can offer a lot of value.
Software vendors keep their PLM systems generic. And almost every company needs to adapt their systems to their company reality: their processes, their system architecture, and their people.
The key questions are: How can I map my company’s processes and the way we work to the new system? How can I make sense of the new systems and help people understand the big picture behind the system clicks?
That’s where we come in.
Education or Training
With Peter Bilello, we discussed the difference between education and training. Where would you position Share PLM?
This is an interesting differentiation – I must say I hadn’t heard of it before, but it makes sense.
I think we are in the middle of the two: theory and practice. You see, many consulting companies focus on the “WHY”, the business needs. But they don’t touch the systems, so they don’t go into Teamcenter or OpenBOM – they want to stay at a theoretical level.
Some system integrators get into the system details, but they don’t connect the clicks to the “WHY”, the big picture.
The connection between the “WHY” and the “HOW” is really important to get the context, to understand how things work.
So that’s where we are very strong. We help companies connect the “WHY” and the “HOW”. And that’s powerful.
The success of training
We are both promoting the importance of adequate training as part of a PLM implementation. Can you share with us a best-in-class example where training really made a difference? Can we talk about ROI in the context of training?
Jos, I think when I look at our success stories, most good examples share some of the following characteristics:
- All start with “WHY”, and they have a story.
In today’s world, people want to understand the “WHY”. So in practical terms, we work with customers to prepare a storyline that helps understand the “WHY” in a practical and entertaining manner.
- All have a clear, top-down visualization of the process and related use cases.
This is simple, but it’s a game-changer. When people see the big picture, something “clicks,” and they feel “safe” at first sight. They know there is a blueprint for how things work and how they connect.
- All have quick, online answers to their questions.
A digital knowledge base where people can find quick answers and educate themselves.
This is one example of a knowledge base from one of our customers, OpenBOM. As you can see from the link below, they have documented how the system should be used in their knowledge base. In addition, they have a set of online eLearning courses that users can take to get started.
- All involve people in the training and build a “movement”.
People want to be heard and be a part of something. Engaging people in user communities is a great way to both learn from your users and make them a part of your program. Bringing people together and putting them at the center of your training is, I think, key to success.
Training for all types of companies?
Do you see a difference between large enterprises and small and medium enterprises regarding training? Where would your approach fit best?
Yes, absolutely. And I think the most important difference is speed.
A big company can afford to work on all the elements I described before at the same time because they have the “horse-power” to drive different tracks. They can involve different project managers, and they can finance the effort.
Small companies start small and build their training environment slowly. Some might do some parts by themselves and use our services to guide them through the process.
I enjoy both worlds – the big corporations have big budgets, and you can do cool stuff.
But the small startups have big brains, and they often are very passionate about what they are doing. I enjoy working with startups because they dare to try new things and they are very creative.
Where Is Share PLM Training Different?
I see all system integrators selling PLM training. In my SmarTeam days, I also built some “Express” training. Where are you different?
When I started Share PLM, we participated in a startup accelerator. When I was explaining our business model, they asked me the question: “Aren’t the software vendors or the system integrators doing exactly what you do?”
And the answer, incredibly, is that system integrators are often not interested in training and documentation, and they just don’t do it well as they have no didactical background.
Sometimes it’s even the same person who configured the system who gets the task to create the training. Those people produce boring “technical” manuals, using thousands of PowerPoint slides with no soul – who wants to read that?
No wonder PLM training has a bad reputation!
We are laser-focused on digital training, and our training is very high quality. We are good at connecting pieces of information and making sense of complex stuff. We also are strong at aesthetics, and our training looks good. The content is nicely presented when you open our courses, and people look forward to reading it!
Digital Transformation and PLM
I always ask when talking with peers in the PLM domain: How do you see the digital transformation happening at your customers, and how can you help them?
An interesting question. I see that boundaries between systems are getting thinner. For example, some time ago, you would have a program to deploy a PLM system.
Now I see a lot of “outcome-based” programs, where you focus on the business value and use adequate systems to get there.
For example, a program to speed up product deliveries or improve quality. That type of program involves many different systems and teams. It relates to your “connected enterprise” concept.
This transformation is happening, and I think we are well-positioned to help companies make sense of the connection between different systems and how they digitize their processes.
Want to learn more?
Thanks, Helena, for sharing your insights. Are there any specific links you want to provide to learn more? Perhaps some books to read or conferences to visit?
Thanks so much, Jos, for allowing me to discuss this with you today. Yes, I always recommend reading blogs and books to stay up-to-date.
- We both have good blogging and reading lists on our websites. See on our blog the post The 12 Best PLM Blogs To Follow or the recommendations on your PLM Bookshelf
- Conferences are also great for connecting with other people. In general, I think it’s very helpful to see examples from other companies to get inspired.
- And we have our podcast – to my knowledge, the only one you will find when you search for PLM, because this form of interaction is still new.
I’m happy to provide some customer references for people who want to learn more about how good training looks practically. Just get in touch with me on LinkedIn or through our website.
What I learned
I have known the founders of Share PLM since they were active at Outotec, eager to discuss and learn new PLM concepts. It is impressive to see how they made the next step to launch their company, Share PLM, and found a niche that somehow I try to cover too in a similar manner.
When I started my blog virtualdutchman.com in 2008, I wanted to share PLM experiences and knowledge.
Read my 2008 opening post here. It was one-way sharing – modern at that time – probably getting outdated in the coming years.
However, Helena and the Share PLM team have picked up my mission in a modern manner. They are making PLM accessible and understandable in your company, using a didactical and modern approach to training.
Share PLM perhaps does not focus on the overall business strategy for PLM yet, as their focus is on the execution level, with a refreshing and modern approach – focussing on the end-user, didactics and attractiveness. I expect that ten years from now, with the experience and the professional team, they will pick up this part too, allowing me to retire.
Conclusion
This was the second post around PLM and Education, mainly focussing on what is happening in the field. Where I see CIMdata’s focus on education at the business strategy level, I see Share PLM’s focus on the execution level, making sure the PLM implementation is fun for the end-user and therefore beneficial for the company. The next post will be again about PLM Education, this time before the s**t hits the fan. Stay tuned.
After two quiet weeks of spending time with my family in slow motion, it is time to start the year.
First of all, I wish you all a happy, healthy, and positive outcome for 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk and only leaving the relevant things for work on the desk.
Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:
The Innovator’s Dilemma
A must-read book by Clayton Christensen, explaining how new technologies can overthrow established big companies within a very short period. The term Disruptive Innovation comes up here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recently, where big established brands disappeared or diminished in a short period.
In his book, he wrote about DEC (Digital Equipment Corporation), market leader in minicomputers, not having seen the threat of the PC. Or later Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging) or, as a double example, NOKIA (from paper to market leader in mobile phones, later killed by the smartphone).
The book always inspired me to be alert for new technologies, however simple they might look, as simplicity is often the answer in the end. I wrote about it in 2012 in The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies are now, most of the time, integrated by the major vendors, whose businesses were not really disrupted. Newcomers still have a hard time conquering market space.
In 2015 I wrote again about this book in The Innovator’s dilemma and Generation change – image above. At that time, I understood that disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.
The future ways of working also demand new skills. You need to become a digital native, as COVID-19 pushed many organizations to do. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.
Sapiens
The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains why the human race became so dominant because we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.
The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.
Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.
My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive to justify the investment.
That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.
If you have no time to read the book, watch the above 2015 TED talk to grasp the concept and use it with a PLM-twisted mind.
Re-use your CAD
In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.
Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.
At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example, in Europe – with no success.
The Model-Based Enterprise is becoming more and more the apparent future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year for my post: PLM and Model-Based Definition.
As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.
I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.
Instead, these companies stay at their customers’ lowest common denominator: the 2D drawing. For me, Model-Based Definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.
The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.
It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.
Products2019
This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.
Although it is not directly a PLM book, it illustrates the complexity of PLM. It is about people and culture, and about many different processes, often disconnected. Everyone puts their particular discipline at the center of importance. If you believe PLM is all about the best technology only, read this book and learn how many other aspects are also relevant.
If you want to read more details, see my 2020 post about the book: Products2019 – a must-read if you are new to PLM. An important point to pick up from this book is that it is not about PLM but about doing business.
PLM is not a magical product. Instead, it is a strategy to support and improve your business.
System Lifecycle Management
Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.
A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.
I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.
I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the era when mechatronics became important, and next the era when systems (hardware and software) became important.
We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked into their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.
Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.
It is an interesting book if you want to catch up with what has happened in the past 20 years.
More Books …..
There are more books on my desk that have helped me understand the past or shape the future. As this is a blog post, I will not discuss more books this time, as I am reaching my 1500 words.
Still, these books are worthwhile to read – click on their images to learn more:
I discussed this book two times last year: an introduction in PLM and Modularity, and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.
A book I read this summer contributed to a better understanding of sustainability. I mentioned this book in my presentation for the Swedish CATIA Forum in October last year – slide 29 of that presentation.
System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.
Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.
Conclusion
There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend. In the upcoming posts, I will further focus on PLM education. So stay tuned and keep on learning.
This week I attended the SCAF conference in Jönköping. SCAF is an abbreviation for the Swedish CATIA User Group. First of all, I was happy to be there, as it was a “physical” conference, giving me the opportunity to discuss topics with the attendees outside the presentation time slots.
It is crucial for me as I have no technical message. Instead, I am trying to make sense of the future through dialogues. What is sure is that the future will be based on new digital concepts, completely different from the traditional approach that we currently practice.
My presentation, which you can find here on SlideShare, was again zooming in on the difference between a coordinated approach (current) and a connected approach (the future).
The presentation explains the concept of datasets, which I discussed in my previous blog post. This time, I focussed on how this concept can be discovered in the Dassault Systemes 3DExperience platform, combined with the must-go path for all companies towards more systems thinking and sustainable products.
It was interesting to learn that the concept of connected datasets, like the spider’s web in the image, reflected the future for many of the attendees.
One of the demos during the conference illustrated that it is no longer about managing the product lifecycle through structures (EBOM/MBOM/SBOM).
Instead, it is based on a collection of connected datasets – the path in the spider’s web.
It was interesting to talk with the present companies about their roadmap. How to become a digital enterprise is strongly influenced by their legacy culture and ways of working. Where to start to be connected is the main challenge for all.
A final positive remark: SCAF has renamed itself to SCAF (3DX), showing that even CATIA practices can no longer be considered a niche – the future of business is to be connected.
Now back to the thread that I am following on the series The road to model-based. Perhaps I should change the title to “The road to connected datasets, using models”. The statement for this week to discuss is:
Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
Reliable data
If you have been following my story related to the PLM transition from a coordinated to a connected infrastructure, you might have seen the image below:
The challenge of a connected enterprise is that you want to connect different datasets, defined in various platforms, to support any type of context. We called this a digital thread or perhaps even better framed a digital web.
This is new for most organizations because each discipline has been working most of the time in its own silo. They are producing readable information in neutral files – pdf drawings/documents. In cases where a discipline needs to deliver datasets, like in a PDM-ERP integration, we see IT-energy levels rising as integrations are an IT thing, right?
Too much focus on IT
In particular, SAP has always played the IT card (and is still playing it through their Siemens partnership). Historically, SAP claimed that all parts/items should be in their system. Thus, there was no need for a PDM interface, neglecting that the interface moment was now shifted to the designer in CAD. And by using the name Material for what is considered a Part in the engineering world, they illustrated their lack of understanding of the actual engineering world.
There is more to “blame” on SAP when it comes to the PLM domain, or you could state that PLM vendors did not yet understand what enterprise data means. Historically, ERP systems were the first enterprise systems introduced in a company; they have been leading in a transactional “digital” world. The world of product development has never been a transactional process.
SAP introduced Master Data Management for their customers to manage data in heterogeneous environments. As you can imagine, the focus of SAP MDM was more on the transactional side of the product (also PIM) than on the engineering characteristics of a product.
I have no problem that each vendor wants to see their solution as the center of the world. This is expected behavior. However, when it comes to a single-system approach, there is a considerable danger of vendor lock-in and a lack of freedom to optimize your business.
In a modern digital enterprise (to be), the business processes and value streams should be driving the requirements for which systems to use. I was tempted to write “not the IT capabilities”; however, that would be a mistake. We need systems or platforms that are open and able to connect to other systems or platforms. The technology should be there, and more and more, we realize the future is based on connectivity between cloud solutions.
In one of my first posts (part 2), I referred to five potential platforms for a connected enterprise. Each platform will have its own data model based on its legacy design, allowing it to service its core users in an optimized environment.
When it comes to interactions between two or more platforms, for example, between PLM and ERP, between PLM and IoT, but also between IoT and ERP or IoT and CRM, these interactions should first be based on identified business processes and value streams.
The need for Master Data Management
Defining horizontal business processes and value streams independent of the existing IT systems is the biggest challenge in many enterprises. Historically, we have been thinking around a coordinated way of working, meaning people shifting pieces of information between systems – either as files or through interfaces.
In the digital enterprise, the flow should be leading based on the stakeholders involved. Once people agree on the ideal flow, the implementation process can start.
Which systems are involved, and where do we need a connection between them? Is the relationship bidirectional, or is it a push?

The interfaces in a digital enterprise need to be data-driven; we do not want human interference here, slowing down or modifying the flow. This is the moment Master Data Management and Data Governance come in.
When exchanging data, we need to trust the data in its context, and we should be able to use the data in another context. But, unfortunately, trust is hard to gain.
I can share an example of trust from implementing a PDM system linked to a Microsoft-friendly ERP system. Both systems were able to use Excel as an interface medium – the Excel columns took care of the data mapping between these two systems.
In the first year, engineers produced the Excel with BOM information and manufacturing engineering imported the Excel into their ERP system. After a year, the manufacturing engineers proposed to automatically upload the Excel as they discovered the exchange process did not need their attention anymore – they learned to trust the data.
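To make this exchange tangible, here is a minimal sketch of such a column mapping in Python. All column names are illustrative assumptions, not taken from a specific PDM or ERP system:

```python
import csv
import io

# Hypothetical mapping between PDM export columns and ERP import fields.
# Every real implementation will differ - these names are illustrative only.
PDM_TO_ERP = {
    "PartNumber": "Material",
    "Revision": "Version",
    "Description": "MaterialDescription",
    "Quantity": "ComponentQty",
}

def map_bom_rows(pdm_csv_text):
    """Translate PDM BOM rows into ERP-ready dictionaries, rejecting incomplete rows."""
    reader = csv.DictReader(io.StringIO(pdm_csv_text))
    erp_rows = []
    for row in reader:
        if not row.get("PartNumber"):  # a minimal trust check before automating the upload
            raise ValueError("BOM row without a part number")
        erp_rows.append({erp: row[pdm] for pdm, erp in PDM_TO_ERP.items()})
    return erp_rows

pdm_export = """PartNumber,Revision,Description,Quantity
P-1001,A,Valve body,2
P-1002,B,Gasket,4
"""
print(map_bom_rows(pdm_export)[0]["Material"])  # → P-1001
```

The point of the example is the trust check: once the validation proves reliable over time, the human review step can be removed, exactly as the manufacturing engineers concluded.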
How often have you seen similar cases in your company where we insist on a readable exchange format?
When you trust the process(es), you can trust the data. In a digital enterprise, you must assume that specific datasets are used or consumed in different systems. Therefore, a single data mapping as in the Excel example won’t be sufficient.
Master Data Management and standards?
Some traditional standards, like the ISO 15926 or ISO 10303, have been designed to exchange process and engineering data – they are domain-specific. Therefore, they could simplify your master data management approach if your digitalization efforts are in that domain.
To connect other types of data, it is hard to find a global standard that also encompasses different kinds of data or consumers. Think about the GS1 standard, which has more of a focus on the consumer-side of data management. When PLM meets PIM, this standard and Master Data Management will be relevant.
Therefore I want to point to these two articles in this context:
How enterprise architects need to evolve to survive in a digital world focusing on the transition of a coordinated enterprise towards a connected enterprise from the IT point of view. And a recent LinkedIn post, Web Ontology Language as a common standard language for Engineering Networks? by Matthias Ahrens exploring the concepts I have been discussing in this post.
To me, it seems that standards are helpful when working in a coordinated environment. However, in a connected environment, we have to rely on master data management and data governance processes, potentially based on a clever IT infrastructure using graph databases to connect anything meaningful, and possibly artificial intelligence to provide quality monitoring.
Conclusion
Standards have great value in exchange processes, which happen in a coordinated business environment. To benefit from a connected business environment, we need an open and flexible IT infrastructure supported by algorithms (AI) to guarantee quality. Before installing the IT infrastructure, we should first have defined the value streams it should support.
What are your experiences with this transition?
In my last post in this series, The road to model-based and connected PLM, I mentioned that perhaps it is time to talk about SLM instead of PLM when discussing popular TLAs for our domain of expertise. There were not so many encouraging statements for SLM so far.
For me, SLM could mean Solution Lifecycle Management, considering that a company’s offering is more and more a mix of products and services. Or SLM could mean System Lifecycle Management, in that case pushing the idea that more and more products interact with the outside world and therefore could be considered systems. Products are (almost) dead.
In addition, I mentioned that the typical product lifecycle and related configuration management concepts need to change, as in the SLM domain there are hardware and software with different lifecycles and change processes.
It is a topic I want to explore further. I am curious to learn more from Martijn Dullaart, who will be lecturing at the PLM Road map and PDT 2021 fall conference in November. I hope my expectations are not too high, knowing it is a topic of interest for Martijn. Feel free to join this discussion
In this post, it is time to follow up on my third statement related to what data-driven implies:
Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
On this topic, I have a list of points to consider; let’s go through them.
The dataset
In this post, I will often use the term dataset (you are also allowed to write “data set”, I understood).
A dataset means a predefined number of attributes and values that belong logically to each other. Datasets should be defined based on the purpose and, if possible, designated for a single goal. In this way, they can be stored in a database.
Combined with other datasets, they can result in relevant business information. Note that a dataset is not only transactional data; a dataset could also describe geometry.
Identify the dataset
In the document-based world, a lot of information could be stored in a single file. In a data-driven world, we should define a dataset that contains a specific piece of information, logically belonging together. If we are more precise, a part would have various related datasets that make up the definition of a part. These definitions could be:
- Core identification attributes like ID, Name, Type and Status
- The Type could define a set of linked information. For example, a valve would have different characteristics than a resistor. Through classification, we can link datasets to the core definition of a part.
- The part can have engineering-specific data (CAD and metadata), manufacturing-specific data, supplier-specific data, and service-specific data. Each of these datasets needs to be defined as a unique element in a data-driven environment
- CAD is a particular case as most current CAD systems don’t treat geometry as a single dataset. In a file-based world, many other datasets are stored in the file (e.g., engineering or manufacturing details). In a data-driven environment, we want the CAD definition to be treated like a dataset. Dassault Systèmes with their CATIA V6 and 3DEXPERIENCE platform or PTC with OnShape are examples of this approach. Having CAD as separate datasets makes sharing and collaboration so much easier, as we can see from these solutions. The concept of CAD stored in a database is not new, and this approach has been used in various disciplines. Mechanical CAD was always a challenge.
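As a thought experiment, the part definitions above could be sketched as a core dataset with linked, discipline-specific datasets. The attribute and dataset names below are my own illustrative assumptions, not a specific vendor’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class CoreDataset:
    """Core identification attributes: ID, Name, Type and Status."""
    part_id: str
    name: str
    part_type: str  # through classification, the type drives which linked datasets apply
    status: str

@dataclass
class Part:
    core: CoreDataset
    # each discipline-specific definition is its own dataset, linked to the core
    datasets: dict = field(default_factory=dict)

# A valve defined as a collection of connected datasets instead of one document
valve = Part(core=CoreDataset("P-1001", "Inlet valve", "Valve", "Released"))
valve.datasets["engineering"] = {"material": "Brass", "mass_kg": 0.4}
valve.datasets["supplier"] = {"vendor": "ACME", "lead_time_days": 21}

print(sorted(valve.datasets))  # → ['engineering', 'supplier']
```

Each entry in `datasets` could live in a different system or platform; only the core identification needs to be shared to connect them.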
Thanks to Moore’s Law (approximately every 2 years, processor power doubles – click on the image for the details) and higher network connection speeds, it starts to make sense to have mechanical CAD also stored in a database instead of a file.
An important point to consider is a kind of standardization of datasets. In theory, there should be a kind of minimum agreed collection of datasets. Industry standards provide these collections in their dictionary. Whenever you optimize your data model for a connected enterprise, make sure you look first into the standards that apply to your industry.
They might not be perfect or complete, but inventing your own new standard is a guarantee for legacy issues in the future. This remark is also valid for the software vendors in this domain. A proprietary data model might give you a competitive advantage.
Still, in the long term, there is always the need to connect with outside stakeholders.
Identify the RACI
To ensure a dataset is complete and well maintained, the concept of RACI could be used. RACI is the abbreviation for Responsible, Accountable, Consulted and Informed, and a simplification of the RASCI model – see also: responsibility assignment matrix.
In a data-driven environment, there is no data ownership anymore like you have for documents. The main reason data ownership can no longer be used is that datasets can be consumed by anyone in the ecosystem – no longer only by your department, or by the manufacturing or service department.
Datasets in a data-driven environment bring value when connected with other datasets in applications or dashboards.
A dataset describing the specification attributes of a part could be used in a spare part app and a service app. Of course, the dataset will be used in a different context – still, we need to ensure we can trust the data.
Therefore, each identified dataset should be governed by a kind of RACI concept. The RACI concept is a way to break the siloes in an organization.
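A minimal sketch of what RACI per dataset could look like in practice (the role names and dataset keys are illustrative assumptions, not a prescribed model):

```python
# Governance per dataset: not one owner, but a RACI assignment.
# Role and dataset names below are made up for the illustration.
raci = {
    "part.engineering": {
        "R": "Design Engineer",          # maintains the dataset
        "A": "Engineering Manager",      # accountable for its trustworthiness
        "C": ["Manufacturing Engineer"], # consulted on changes
        "I": ["Service", "Purchasing"],  # informed consumers in other contexts
    },
    "part.supplier": {
        "R": "Buyer",
        "A": "Procurement Lead",
        "C": ["Design Engineer"],
        "I": ["Service"],
    },
}

def accountable_for(dataset_key):
    """Who is accountable (A) for keeping this dataset trustworthy?"""
    return raci[dataset_key]["A"]

print(accountable_for("part.engineering"))  # → Engineering Manager
```

The I (Informed) list is also where the inside/outside distinction discussed below can be anchored: consumers outside the company would only ever appear as Informed on shareable datasets.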
Identify Inside / outside
There is a lot of fear that a connected, data-driven environment will expose Intellectual Property (IP). It came up in recent discussions. If you like storytelling and technology, read my old SmarTeam colleague Alex Bruskin’s post: The Bilbo Baggins Threat to PLM Assets. Alex has written some “poetry” with a deep technical message behind it.
It is true that if your dataset is too big, you have the challenge of exposing IP when connecting this dataset with others. Therefore, when building a data model, you should make it possible to have datasets purely for internal usage and datasets for sharing.
When you use the concept of RACI, the difference should be defined by the I(informed) – is it PLM-data or PIM-data for example?
Tracking relations
Suppose we follow up on the concept of datasets. In that case, it becomes clear that the relations between the datasets are as crucial as the datasets themselves. In traditional PLM applications, these relations are often predefined as part of the core data model.
For example, the EBOM parts have relationships between themselves and specification data – see image.
The MBOM parts have links with the supplier data or the manufacturing process.
The prepared relations in a PLM system allow people to implement the system relatively quickly to map their approaches to this taxonomy.
However, traditional PLM systems are based on a document-based (or file-based) taxonomy combined with related metadata. In a model-based and connected environment, we have to get rid of the document-based type of data.
Therefore, the datasets will be more granular, and there is a need to manage exponentially more relations between datasets.
This is why you see the graph database coming up as a needed infrastructure for modern connected applications. If you haven’t heard of a graph database yet, you are probably far from technology hypes. To understand the principles of a graph database you can read this article from neo4j: Graph Databases for Beginners: Why graph technology is the future
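To make the graph idea tangible without a graph database at hand, here is a toy illustration in plain Python: relations are first-class, and any path through the “digital web” can be traversed. The node names are made up for the example, not a real PLM data model:

```python
from collections import deque

# Datasets as nodes, relations as edges - a miniature "digital web".
# All names are illustrative assumptions.
edges = {
    "EBOM-part": ["CAD-model", "Specification", "MBOM-part"],
    "MBOM-part": ["Supplier-data", "Work-instruction"],
    "CAD-model": ["Simulation-result"],
    "Specification": [],
    "Supplier-data": [],
    "Work-instruction": [],
    "Simulation-result": [],
}

def path(start, goal):
    """Breadth-first search returning one path of connected datasets, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        route = queue.popleft()
        if route[-1] == goal:
            return route
        for nxt in edges.get(route[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(route + [nxt])
    return None

print(path("EBOM-part", "Supplier-data"))  # → ['EBOM-part', 'MBOM-part', 'Supplier-data']
```

A graph database such as neo4j does essentially this at scale, with the relations stored and indexed natively instead of rediscovered by traversal code.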
As you can see from the 2020 Gartner Hype Cycle for Artificial Intelligence, this technology is at the top of the hype and conceptually the way to manage a connected enterprise. The discussion in this post also demonstrates that, besides technology, a lot of additional conceptual thinking is needed before it can be implemented.
Although software vendors might handle the relations and datasets within their platform, the ultimate challenge will be sharing datasets with other platforms to get a connected ecosystem.
For example, the digital web picture shown above and introduced by Marc Halpern at the 2018 PDT conference shows this concept. Recently CIMdata discussed this topic in a similar manner: The Digital Thread is Really a Web, with the Engineering Bill of Materials at Its Center
(Note I am not sure if CIMdata has published a recording of this webinar – if so I will update the link)
Anyway, these are signs that we started to find the right visuals to imagine new concepts. The traditional digital thread pictures, like the one below, are, for me, impressions of the past as they are too rigid and focusing on some particular value streams.
From a distance, it looks like a connected enterprise should work like our brain. We store information on different abstraction levels. We keep incredibly many relations between information elements. As the brain is a biological organ, connections degrade or get lost. Or the opposite: other relationships become so strong that we cannot change them anymore. (“I know I am always right”)
Interestingly, the brain does not use the “single source of truth”-concept – there can be various “truths” inside a brain. This makes us human beings with all the good and the harmful effects of that.
As long as we realize there is no single source of truth.
In business and our technological world, we sometimes need the undisputed truth. Blockchain could be the basis for securing the right connections between datasets to guarantee the result is valid. I am curious if blockchain can scale to complex connected situations, although Moore’s Law might ultimately help us here too (if still valid).
The topic is not new – in 2014, I wrote a post with the title PLM is doomed unless …, where I introduced the topic of owning and sharing in the context of the human brain. In the post, I refer to the book On Intelligence by Jeff Hawkins, who tries to analyze what human-based intelligence is and how we could apply it to our technology concepts. Still a fascinating book, worth reading if you have the time and opportunity.
Conclusion
A data-driven approach requires a more granular definition of information, leading to the concepts of datasets and managing relations between datasets. This is a fundamental difference compared to the past, where we were operating systems with information. Now we are heading towards connected platforms that provide a filtered set of real-time data to act upon.
I am curious to learn more about how people have solved the connected challenges and in what kind of granularity. Let us know!
After a short summer break with almost no mention of the word PLM, it is time to continue this series of posts exploring the future of “connected” PLM. For those who also started with a cleaned-up memory, here is a short recap:
In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development, the item-centric approach.
In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified are the ERP and CRM platform (the oldest domains).
Next come the MES and PIP platforms (modern domains to support manufacturing and product innovation in more detail) and the IoT platform (needed to support connected products and customers).
In part 3, I explained what data-driven means and how it is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. In that post, I ended up with seven topics related to data-driven, which I will discuss in upcoming posts.
Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.
Topic 1 for this post:
Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
Platform or a collection of systems?
One of the first (marketing) hurdles to take is understanding the difference between a data platform and a collection of systems that work together, sold as a platform.
CIMdata published in 2017 an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata’s definition is extensive and covers the full scope of product innovation. Of course, you can find a platform that starts from a more focused process.
For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).
The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid these capabilities being limited, a platform should be open and able to connect with other data sources that might be either locally or centrally available.
From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, still allowing local data to be connected and shielded for performance or IP reasons.
The picture below describes the business benefits of a Product Innovation Platform as imagined by Accenture in 2014
Link to CIMdata’s 2014 commentary of Digital PLM HERE
Sometimes vendors sell their suite of systems as a platform. This is a marketing trick because when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems – not really a scalable environment.
In addition, sometimes, the collaboration between systems in such a marketing platform is managed through proprietary exchange (file) formats.
This is a practice we saw in the construction industry before cloud connectivity became available. However, a so-called end-to-end solution that works in PowerPoint requires, when implemented in real life, a lot of human intervention.
Not a single environment
There has always been the debate:
“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”
In the system approach, the focus was most of the time on the best-in-class tools where PLM-systems provide the data governance. A typical example is the item-centric approach. It reflects the current working culture, people working in their optimized siloes, exchanging information between disciplines through (neutral) files.
The platform approach makes it possible to deliver an optimized user interface for the end-user through a dedicated app, assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.
It might be tempting for a platform provider to add as many imaginable data elements to their platform infrastructure as possible. The challenge with this approach is whether all data should be stored in a central data environment (preferably cloud) or federated. And what about filtering IP?
In my post PLM and Supply Chain Collaboration, I described the concept of having an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, however carefully filtered which data is shared in the hub.
It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant for this part of the business. This does not imply there might be no other platforms in the future. Platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.
Will we ever reach a perfectly connected enterprise or society? Probably not. Not because of technology but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all of humanity? Predicting the future is impossible, as Yuval Harari described in his book: 21 Lessons for the 21st Century. Worth reading, still a collection of ideas.
Proprietary data model or standards?
So far, when you are a software vendor developing a system, there is no restriction in how you internally manage your data. In the domain of PLM, this meant that every vendor has its own proprietary data model and behavior.
I have learned from my 25+ years of experience with systems that the original design of a product combined with the vendor’s culture defines the future roadmap. So even if a PLM vendor would rewrite all their software to become data-driven, the ways of working, the assumptions will be based on past experiences.
This makes it hard to come to unified data models and methodology valid for our PLM domain. However, large enterprises like Airbus and Boeing and the major Automotive suppliers have always pushed for standards as they will benefit the most from standardization.
The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.
You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.
It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible to existing standards. It probably won’t happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected ecosystems.
I do not believe in the toolkit approach where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced, not matching the past.
In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.
The benefit of using standards is that they contain the learnings from many people involved.
Conclusion
I did not like writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy
Matthias Ahrens shared: Integrated Product Lifecycle Management (Google translated from German)
Oleg Shilovitsky wrote numerous articles related to technology –
in this context:
3 Challenges of Unified Platforms and System Locking and
SaaS PLM Acceleration Trends
For a year now, we have been used to virtual events. PI PLMx 2020 in London was my last real event where I met people. Rereading my post about this event (the weekend after PI PLMx), I noticed that it was not a technology festival. Many presentations were about business change and how to engage people in an organization.
The networking discussions during the event and evenings were the most valuable parts of the conference.
And then came COVID-19. ☹
Shortly after, in April 2020, I participated in the TECHNIA Innovation Forum, the first virtual event I attended with a setup like a real conference: a main stage with live sessions, virtual booths, and many prerecorded sessions related to various PLM topics.
You can read my experience related to the conference in two posts: the weekend after PLMIF and My four picks from PLMIF. A lot of content available for 30 days. However, I was missing the social interaction, the people.
My favourite conference for 2020 was the CIMdata PLM Roadmap / PDT Fall 2020 conference in November. The PLM Roadmap / PDT conferences are not conferences for a novice audience; you have to be skilled in the domain of PLM. There is most of the time a strong presence from Aerospace and Defense companies.
The Fall 2020 theme: “Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality” might sound like a marketing term.
We hear so many times the words Digital Thread and Digital Twin. However, this conference was with speakers, active practitioners, from the field. I wrote about this conference in two posts: The weekend after PLM Roadmap / PDT 2020 – Part 1 and Part 2. I enjoyed the conference; however, I was missing social interaction.
The Digital Twin
Beyond the marketing hype, there is still a lot to learn and discuss from each other. First of all, it is not about realizing a digital twin; a business need should be the driver to investigate the possibility of a digital twin.
I am preparing a longer blog post on this topic to share learnings from people in the field. For example, in November 2020, I participated in the Netherlands in a Digital Twin Conference, focusing on real-life cases.
Companies shared their vision and successes. It was clear that we are all learning to solve pieces of the big puzzle; there are small successes. However, without marketing language, this type of event becomes extremely helpful for further discussion and follow-up.
Recently, I enjoyed the panel discussions during the PI DX Spotlight session: Digital Twin-Driven Design. The PI DX Spotlight sessions are a collection of deep dives in various themes – have a look for the upcoming schedule here.
In the Digital Twin-Driven Design session, I enjoyed the session: What does a Digital Twin mean to your Business and Defining Requirements?
The discussion was moderated by Peter Bilello, with three interesting panellists with different industrial backgrounds. (Click on the image for the details). I have to re-watch some of the Spotlight sessions (the beauty of a virtual event) to see how they fit in the planned Digital Twin post.
The Cenit/Keonys Innovation day
On March 23rd (this Tuesday), Cenit & Keonys launch their virtual Innovation Day, another event that, before COVID-19, would have been a real people event. I am mentioning this event in particular, as I was allowed to interview fifteen of their customers about their day-to-day work, PLM-related plans, and activities.
All these interviews have been recorded and processed in such a manner that within 5 to 8 minutes, you get an understanding of what people are doing.
To prepare for these interviews, I spoke with each of them before the interview. I wanted to understand the passion for their work and where our interests overlap.
I will not mention the individual interviews in this post, as I do not want to spoil the event. I talked with various startups (do they need PLM?) and established companies that started a PLM journey. I spoke with simulation experts (the future) and dimensional management experts (listen to these interviews to understand what it means). And ultimately, I interviewed a traditional porcelain family brand using 3D printing and 3D design, and at the other end, the German CIO of the year from 2020
(if you Google a little, you will easily find the companies involved here)
The most common topics discussed were:
- What was the business value of your PLM-related activity?
- Did COVID-19 impact your business?
- What about a cloud-based solution, and how do people align?
- If relevant, what are your experiences with a Model-Based Definition?
- What about sustainability?
I hope you will take the opportunity to register and watch these interviews as, for me, they were an excellent opportunity to be in touch with the reality in the field. As always, we keep on learning.
The Modular Way
Talking about learning. This week, I finished the book The Modular Way, written by Bjorn Eriksson & Daniel Strandhammar. During the lockdown last year, Bjorn & Daniel, founders of the Brick Strategy, decided to write down their experiences with mainly Scandinavian companies into a coherent framework to achieve modularization.
Modularity is a popular topic in many board meetings. How often have you heard: “We want to move from Engineering To Order to more Configure To Order”? Or another related incentive: “We need to be cleverer with our product offering and reduce the number of different parts”.
Next, the company buys a product that supports modularity, and management believes the work has been done. Of course not. Modularity requires a thoughtful strategy.
The book can be a catalyst for such companies that want to invest in modularity but do not know where and how to start. The book is not written academically. It is more a story taking you along the steps needed to define, implement, and maintain modularity. Every step has been illustrated by actual cases and their business motivation and achieved benefits where possible. I plan to come back with Bjorn and Daniel in a dedicated post related to PLM and Modularity.
Conclusion
Virtual Events are probably part of our new future. A significant advantage is the global reach of such events. Everyone can join from anywhere connected around the world. Besides the larger events, I look forward to discovering more small and targeted discussion events like PI DX Spotlights. The main challenge for all – keep it interactive and social.
Let us know your favourite virtual event!
After "The Doctor is IN," now again a written post in the category of PLM and complementary practices/domains. In January, I discussed with Henrik Hulgaard from Configit the complementary value of PLM and CLM (Configuration Lifecycle Management). For me, CLM is a synonym for Product Configuration Management.
As expected, readers were asking the question:
“What is the difference between CLM (Configuration Lifecycle Management) and CM (Configuration Management)?”
Good question.
As the complementary role of CM is also a part of the topics to discuss, I am happy to share this blog today with Martijn Dullaart. You probably know Martijn if you are actively following topics on PLM and CM.
Martijn has his own blog, mdux.net, and you might have seen him recently in Jenifer Moore's PLM TV episode: Why CM2 for Faster Change and Better Documentation. Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress. Let us start.
Configuration Management and CM2
Martijn, first of all, can you bring some clarity to the terminology? When discussing Configuration Management, what is the pure definition, what is CM2 as a practice, what is IpX's role, and where do you fit into this picture?
Classical CM focuses mainly on the product, the product definition, and actual configurations of the product, like as-built and as-maintained. CM2 extends the focus to the entire enterprise, e.g., the processes and procedures (ways of working) of a company, including IT and facilities, to support the company's value stream.
CM2 expands the scope to all information that could impact safety, security, quality, schedule, cost, profit, the environment, corporate reputation, or brand recognition.
Basically, CM2 shifts the focus to Integrated Process Excellence and promotes continual improvement.
Next to this, CM2 provides the WHAT and the HOW, something most standards lack. My main focus is still on the product and on promoting the use of CM outside the product domain.
For all CM-related documentation, we are already doing this.
Configuration Management and PLM
People claim that if you implement PLM as an enterprise backbone, not as an engineering tool, you can do Configuration Management with your PLM environment.
What is your opinion?
Yes, I think this is possible, provided that the PLM tool has the right capabilities. Though the question should be: is this the best way to go about it? For instance, some parts of Configuration Management are more transaction-oriented, e.g., registering the parts you build in or out of a product.
Other parts of CM are more iterative in nature, e.g., doing impact analysis and making an implementation plan. I am not saying this cannot be done in a PLM tool serving as an enterprise backbone. Still, the nature of most PLM tools is to support iterative work rather than transactional work.
I think you need some kind of enterprise backbone that manages the configuration as an As-Planned/As-Released baseline. A baseline that shows not only the released information but also all planned changes to the configuration.
Because the source information in such a baseline comes from different tools, you need an overarching tool to connect everything. Given the current state of their enterprise applications, most companies therefore require an overarching system.
Preferably, I would use the data directly from the sources, but connectivity and performance are not yet at a level where we can do this. Cloud and modern application and database architectures are very promising in this respect.
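The As-Planned/As-Released baseline Martijn describes can be pictured with a small data sketch. This is a hypothetical illustration with invented part numbers, not the data model of any specific PLM or CM tool: the same baseline answers both "what is released today?" and "what will the configuration look like once the planned changes land?".

```python
from dataclasses import dataclass

@dataclass
class Item:
    part_id: str
    revision: str
    status: str  # "Released" or "Planned"

class Baseline:
    """Hypothetical As-Planned/As-Released baseline: the released
    configuration plus all planned (approved, not yet released) changes."""

    def __init__(self, items):
        self.items = items  # assumed to be listed in chronological order

    def as_released(self):
        """The configuration as it is released today."""
        return {i.part_id: i.revision
                for i in self.items if i.status == "Released"}

    def as_planned(self):
        """The future configuration: planned changes supersede
        released revisions of the same part."""
        future = {}
        for i in self.items:
            future[i.part_id] = i.revision
        return future

baseline = Baseline([
    Item("PUMP-100", "A", "Released"),
    Item("SEAL-200", "A", "Released"),
    Item("SEAL-200", "B", "Planned"),  # approved change, not yet released
])
```

Here `baseline.as_released()` still shows SEAL-200 at revision A, while `baseline.as_planned()` already shows revision B: the baseline exposes "not only the released information but also all planned changes", which is exactly what the overarching system has to aggregate from the different source tools.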
Configuration Management for Everybody?
I can imagine companies in the Aerospace industry need to have proper configuration management for safety reasons. Also, I can imagine that proper configuration management can be relevant for other industries. Do they need to be regulated, or are there other reasons for a company to start implementing CM processes?
I will focus the first part of my answer within the context of CM for products only.
Basically, all products are regulated to some degree. Aerospace & Defense, Medical Devices, and Pharma are highly regulated for obvious reasons. Other industries are also regulated, for example through environmental regulations like REACH, RoHS and WEEE, or safety-related regulations like CE or FCC marking.
Customers can also be an essential driver of the need for CM. If, as a customer, you buy expensive equipment, you expect the supplier of that equipment to deliver per commitment and to maintain and upgrade the equipment efficiently, with as few disruptions to your operations as possible.
Not just customers but also consumers are critical when it comes to the traceability of a product and all its components.
Even if you are sitting on a rollercoaster, you presume the product is well designed and maintained. In other words, there is often a case to be made to apply proper configuration management in any company. Still, the extent to which you need to implement it may vary based on your needs.
The need for Enterprise Configuration Management is even more significant because one of the hardest things is to change the way an organization works and operates.
Often there are different ways of doing the same thing. There is a lot of tribal knowledge, and ways of working are not documented in a way that people can easily find them, let alone structured and linked so that you can do an impact analysis when you want to introduce a change in your organization.
CM and Digital Transformation
One of the topics that we both try to understand better is how CM will evolve in the future when moving to a more model-based approach. In the CM-terminology, we still talk about documents as information objects to be managed. What is your idea of CM and a model-based future?
It is indeed a topic where new or changed methodology is probably required, and I have already started describing CM topics in several posts on my MDUX blog. Some of the relevant posts in this context are:
- HELP!!! Parts, Documents, Data & Revisions
- Where does the deliverable begin, and where does it end?
- A Glimpse into the Future of CM: Part 1, Part 2, and Part 3
First, let me say that model-based is the future, although, at the same time, the CM aspects are often overlooked.
When managing changes, too much detail makes estimating the cost and effort for a business case more challenging, and planning information that is too granular is not desirable. Therefore, CM2 looks at datasets. Datasets should be as small as possible, but not smaller. A dataset is a set of information that needs to be released as a whole, but can be released independently from other datasets. For example, in a bill of materials, a single BOM line item is not a dataset, but the complete set of BOM line items that makes up the BOM of an assembly is. I can release a BOM independently of a test plan.
Data models need to facilitate this. However, in many of today's PLM systems, the BOM and the metadata of a part share the same revision. This means that to change the metadata, I need to revise the BOM, even though the BOM itself might not change. Some changes to metadata might not be relevant for a supplier, and communicating such changes to your supplier could create confusion.
I know some people think this is about document- versus model-centric, but it is not. A part is identified in the 'physical world' by its part ID. Even if you allow revisions in the supply chain, by including the revision in the part ID you create a new identifier, and every new revision will end up in a different stock location. Is that what we want?
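The dataset idea above can be made concrete with a small sketch (hypothetical names and a deliberately naive revision scheme, not CM2's or any PLM system's actual data model): each dataset carries its own revision, so releasing a change to a part's metadata does not force a new revision of the BOM or of the test plan.

```python
class Dataset:
    """A set of information that is released as a whole,
    but revisioned independently of other datasets."""

    def __init__(self, name, content):
        self.name = name
        self.content = content
        self.revision = "A"

    def release_change(self, new_content):
        """Release a new version of this dataset only."""
        self.content = new_content
        self.revision = chr(ord(self.revision) + 1)  # A -> B -> C ...

# The complete BOM of an assembly is one dataset; a single line is not.
bom = Dataset("BOM ASSY-1", [("PART-10", 2), ("PART-20", 1)])
test_plan = Dataset("Test plan ASSY-1", "Leak test at 5 bar")
metadata = Dataset("PART-10 metadata", {"weight_kg": 1.20})

# Changing the part's metadata revises only its own dataset;
# the BOM and the test plan keep their revisions.
metadata.release_change({"weight_kg": 1.25})
```

After the change, only the metadata dataset is at revision B; the BOM and test plan remain at revision A, so nothing irrelevant has to be communicated to a supplier.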
In any case, we are still in the early days, and the thinking about this topic has just begun and needs to take shape in the coming year(s).
CM and/or CLM?
As in my shared blog post with Henrik Hulgaard related to CLM, can you make a clear differentiation between the two domains for the readers?
Configuration Lifecycle Management (CLM) is mainly positioned towards Configurable Products and the configurable level of the product.
Why do I think this? Configit's CLM declaration states that "Configuration Lifecycle Management (CLM) is the management of all product configuration definitions and configurations across all involved business processes applied throughout the lifecycle of a product."
However, it also states:
- “CLM differs from other Enterprise Business Disciplines because it focuses on cross-functional use of configurable products.”
- “Provides a Single Source of Truth for Configurable Data“
- “handles the ever-increasing complexity of Configurable Products“.
I find that Configuration Lifecycle Management is a core Configuration Management practice that you need to have in place for configurable products. The dependencies you need to manage are enormously complex: software parameters that depend on specific hardware, hardware-to-hardware dependencies, commercial variants, and options.
Want to learn more?
In this post, we have just touched the surface of PLM and Configuration Management. Where can an interested reader find more information on CM for their company?
To get trained in CM2, people can reach out to the Institute for Process Excellence (IpX), a company that focuses on consultancy and methodology for many aspects of a modern, digital enterprise, including Configuration Management.
And there is more out there, e.g.:
- Engineering Documentation Control Handbook, Configuration Management, Second Edition, by Frank B. Watts
- Decision tree to assess interchangeability, by Jörg Eisenträger (a good starting point)
- Configuration Management: Theory and Application for Engineers, Managers, and Practitioners, by Jon Quigley and Kim Robertson
- CM Insights Blog by CMStat
- Configuration Management Standard EIA649C by SAE
- MIL-HDBK-61A by product-lifecycle-management.com
Conclusion
Thanks, Martijn, for your clear explanations. People working seriously in the PLM domain, managing the full product lifecycle, should also learn about and consider Configuration Management best practices. I look forward to a future discussion on how to perform Configuration Management in a model-based environment.
As promised in my blog post PLM 2021 – My plans – your goals?, I planned to experiment with a new format, which I labeled: The PLM Doctor is IN.
The idea behind this format is that anyone interested can ask a question – anonymously or through a video recording – and I will answer that single question.
As you can see from the survey results, many of the respondents (approximately 30% of those who did not skip the question) had a question. Enough for the upcoming year to experiment – if the experiment works for you. As it is an experiment, I am also looking forward to your feedback to optimize this format.
Today the first episode: PLM and ROI
Relevant links discussed in this video
CIMdata webinar: PLM Benefits, Metrics & ROI with John MacKrell
VirtualDutchman: The PLM ROI Myth
Conclusion
What do you think? Does this format help you to understand and ask PLM-related questions? Or should I not waste my time, as there is already so much content out there? Let me know what you think in the comments.
Added February 10th
As the PLM Doctor sometimes talks like an oracle, it was great to see the summary written by SharePLM Learning Expert Helena Gutierrez.
Click on the image to see the full post.
First of all, thank you for the overwhelming response to the survey that I promoted last week: PLM 2021 – your goals? It gave me enough inspiration and content to fill the upcoming months.
The first question of the survey dealt with complementary practices or systems related to a traditional PLM infrastructure.
As you can see, most of you are curious about Digital Twin management – 68% (it is a hype topic). Second best are Configuration Management, Product Configuration Management and Supplier Collaboration Management, each with 58% of the votes. Click on the image to see the details. Note: you could vote for more than one topic.
Product Configuration Management
Therefore, I am happy to share this blog space with Configit's CTO, Henrik Hulgaard. Configit is a company specializing in Product Configuration Management, or as they call it, Configuration Lifecycle Management (CLM).
Recently, Henrik wrote an interesting article on LinkedIn, How to achieve End-To-End Configuration, about a question I have heard several times from my clients: how do you align the selling and delivery of configurable products across sales, engineering and manufacturing?
Configit – the company / the mission
Henrik, thanks for helping me explain the complementary value of end-to-end Product Configuration Management to traditional PLM systems. First of all, can you give a short introduction to Configit as a company and the unique value you offer to your clients?
Hi Jos, thank you for having me. Configit has worked on configuration challenges for the last 20 years. We are approximately 200 people, have offices in Denmark, Germany, India, and the US (Atlanta and Detroit), and work with some of the world's largest manufacturing companies.
Our company is founded on patented technology called Virtual Tabulation. The YouTube video below explains the term.
Virtual Tabulation compiles EVERY possible configuration scenario and then compresses that data into a very small file so that it can be used by everyone in your team.
Virtual Tabulation enables important capabilities such as:
- Consolidation of all configuration data, both Engineering- and Sales-related, into a single source of truth.
- Effortless maintenance of complicated rule data.
- A fast and error-free configuration engine that provides perfect guidance to the customer across multiple platforms and channels.
Configit is the only vendor providing a configuration platform that fully supports end-to-end configuration processes, from early design and engineering, through sales and manufacturing, to support and service of configurable products.
This is what we understand by Configuration Lifecycle Management (CLM).
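As an intuition for what a compiled configuration space makes possible, here is a naive brute-force sketch with invented features and rules. To be clear, this is not Configit's patented Virtual Tabulation algorithm, which scales far beyond enumeration; it only illustrates the idea of pre-computing all valid combinations once and then answering guidance queries instantly.

```python
from itertools import product

# Hypothetical feature options and rules, purely illustrative
features = {
    "engine": ["petrol", "electric"],
    "gearbox": ["manual", "automatic"],
    "towbar": ["yes", "no"],
}
rules = [
    # an electric engine requires an automatic gearbox
    lambda c: not (c["engine"] == "electric" and c["gearbox"] == "manual"),
    # an electric engine cannot have a towbar
    lambda c: not (c["engine"] == "electric" and c["towbar"] == "yes"),
]

# "Compile" every valid configuration up front
names = list(features)
valid = [
    dict(zip(names, combo))
    for combo in product(*features.values())
    if all(rule(dict(zip(names, combo))) for rule in rules)
]

def guidance(partial):
    """Which remaining options are still valid, given choices made so far?"""
    matches = [c for c in valid
               if all(c[k] == v for k, v in partial.items())]
    return {f: sorted({c[f] for c in matches})
            for f in names if f not in partial}
```

With the table compiled once, `guidance({"engine": "electric"})` immediately tells a sales channel that only an automatic gearbox without a towbar remains valid, without re-evaluating any rules at quote time.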
Why Configuration Lifecycle Management?
You have introduced the term Configuration Lifecycle Management – another TLA (Three-Letter Acronym), and an easy one to pronounce. However, why would a company be interested in implementing Configuration Lifecycle Management (CLM)?
CLM is a way to break down the siloed systems traditionally found in manufacturing companies, where products are defined in a PLM system, sold using a CRM/CPQ system, manufactured using an ERP system, and serviced by typically ad-hoc and home-grown systems. A CLM system feeds these existing systems with an aligned and consistent view of which variants of a configurable product are available.
Organizations obtain several benefits when they align across functions on which product variants they offer:
- Engineering: faster time-to-market, optimized variability, and the assurance to only engineer products that are sold
- Sales: reducing errors, making sure that what gets quoted is accurate, and reducing the time to close the deal. The configurator provides up-to-date and accurate information.
- Manufacturing: reducing errors and production stoppages due to mis-builds
- Service: accurate information about the product’s configuration. The service technician knows precisely what capabilities to expect on the particular product to be serviced.
For example, one of our customers experienced a 95% reduction in the time it took them to create the configuration models needed to build and sell their products – from a year to two weeks. This meant a significant reduction in time to market and allowed additional product lines to be introduced.
CLM for everybody?
I can imagine that companies with products organized for mass production still want the mindset of being as flexible as possible on the sales side. What type of companies would benefit most from a CLM approach?
Any company that offers customized or configurable products or services will need to ensure that what is engineered is aligned with what is sold and serviced. Our customers typically have relatively high complexity with hundreds to thousands of configuration parameters.
CLM is not just for automotive companies that have high volume and high complexity. Many of our customers are in industrial components and machinery, offering complex systems and services. A couple of examples:
Philips Healthcare sells advanced scanners to hospitals and uses CLM to ensure that what is sold is aligned with what can be offered. They would also like to move to selling scanners as a service, where the hospital pays per MR scan.
Thyssenkrupp Elevator sells elevators that are highly customizable based on the customer's needs and environment. The engineering rules start in the CAD environment and are combined with commercial rules to guide the customer towards valid options.
CLM and Digital Transformation
For me, CLM is an excellent example of what modern, digital enterprises need to do: having product data available along the whole lifecycle to make real-time decisions. CLM is a connecting layer that allows companies to break the silos between marketing, sales, engineering and operations. The C-level gets excited by this idea, as they can see the business value.
Now, what would you recommend to realize this idea?
- The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities.
This requires that an organization establishes a common feature “language” (sometimes this is called a master feature dictionary) that is shared across the different functions.
As the feature codes are essential in the communication between the functions, the creation and updating of the feature language must be carefully managed by putting people and processes in place to manage them.
- The next step is typically to make information about valid configurations available in a central place, sometimes referred to as the single source of truth for configuration.
We offer services to expose this information and integrate it into existing enterprise systems such as PLM, ERP and CRM/CPQ. The configuration models may still be maintained in legacy systems, but they are imported and brought together in the CLM system.
Once consuming systems all share a single configuration engine, the organization may move on to improve on the rule authoring and replace the existing legacy rule authoring applications found in PLM and ERP systems with more modern applications such as Configit Ace.
As can be seen from the above, these steps all cut across the functional silos. Thus, it is essential that the CLM journey has top-level management support, typically from the CIO.
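The first step Henrik describes – a common feature language, or master feature dictionary, shared across functions – might be sketched like this. The feature codes and per-function vocabularies are invented for illustration only:

```python
# Hypothetical master feature dictionary: one shared feature code per
# capability, with each function's local vocabulary mapped onto it,
# so all functions talk about the same feature.
master_features = {
    "F-AWD": {
        "description": "All-wheel drive",
        "sales": "AWD package",
        "engineering": "drivetrain=awd",
        "manufacturing": "line option: transfer case",
    },
    "F-TOW": {
        "description": "Towing preparation",
        "sales": "Tow bar option",
        "engineering": "chassis=towing_reinforced",
        "manufacturing": "line option: tow bar bracket",
    },
}

def translate(feature_code, function):
    """Look up how a given function refers to a shared feature code."""
    return master_features[feature_code][function]
```

Because every function resolves its local term through the same dictionary, changes to feature definitions become the single, carefully governed step Henrik mentions, rather than drifting vocabularies in each silo.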
COVID-19?
In relation to COVID-19, I believe companies have realized that they must reconsider their supply chains to limit dependencies on critical suppliers. Is this an area where Configit can contribute, too?
The digital transformation that many manufacturing companies have been working on for years has clearly been accelerated by the COVID-19 situation, and indeed they might now start to encode information about critical suppliers in the rules.
We saw this happen in 2011 with the tsunami in Japan, when suddenly suppliers could no longer provide certain parts. The organization then has to quickly adapt the rules so that the options requiring those parts are no longer available to order.
Therefore, the CLM vision also includes suppliers, as configuration knowledge has to be shared across organizations to ensure that what is ordered can also be delivered.
Learning more?
It is clear that CLM is a complementary layer to standard PLM infrastructures, and complementary to CRM and ERP – a great example of what is possible in a modern, digital enterprise. Where can readers find more information?
Configit offers several resources on Configuration Lifecycle Management on our website, including our blog, webinars and YouTube videos, e.g., the Tech Chat on Manufacturing and Configuration Lifecycle Management (CLM).
Besides these continuously growing resources, the whitepaper "Accelerating Digital Transformation in Manufacturing with Configuration Lifecycle Management (CLM)" is available here, among other whitepapers.
What I have learned
- Configuration Lifecycle Management is relevant for companies that want to streamline their business functions, i.e., sales, engineering, manufacturing, and service. CLM will reduce the number of iterations in the process, reduce costly fixes when trying to align with customer demands, and ultimately create more service offerings by knowing customers' existing configurations.
- The technology to implement CLM is there. Configit has shown in various industries that it is possible. It is an example of adding value on top of a digital information infrastructure (CRM, PLM, and ERP).
- The challenge will be getting the different functions to agree on one standard configuration authority. Therefore, responsibility should lie at the top level of the organization, likely with the modern CIO or CDO.
- I was glad to learn that Henrik stated:
“The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities”.
A topic I will discuss soon when talking about Product & Portfolio Management with PLM.
Conclusion
It was a pleasure to work with Configit, and in particular Henrik Hulgaard, learning more about Configuration Lifecycle Management, or whatever you may call it. More importantly, I hope you find this post insightful for understanding if and where it applies to your business.
Always feel free to ask more questions related to the complementary value of PLM and Product Configuration Management (CLM).