
(Image: http://www.mdux.net)
As promised in my early November post – The road to model-based and connected PLM (part 9 – CM), I come back with more thoughts and ideas related to the future of configuration management. Moving from document-driven ways of working to a data-driven and model-based approach fundamentally changes how you can communicate and work efficiently.
Let’s be clear: configuration management is first of all about risk management, ensuring your company’s business remains sustainable, efficient, and profitable.
By providing the appropriate change processes and guidance, configuration management avoids costly mistakes and iterations during all phases of the product lifecycle, and it guarantees the quality of the product and its information where safety is at stake.
Companies that have not implemented CM practices have probably either not noticed these issues or not realized that their root cause is a lack of CM.
As with PLM in smaller companies, CM is often seen as overhead, as employees believe they thoroughly understand their products. In addition, CM is seen as a hurdle to innovation because it standardizes practices. So yes, they think it is normal that there are sometimes problems. That’s life.
I already wrote about this topic in 2010 in PLM, CM and ALM – not sexy 😦 – where ALM stands for Asset Lifecycle Management, my focus at that time.
Hear it from the experts
To shape the discussion related to the future of Configuration Management, I had a vivid discussion with three thought leaders in this field: Lisa Fenwick, Martijn Dullaart and Maxime Gravel. A short introduction of the three of them:
Lisa Fenwick, VP Product Development at CMstat, a leading company in Configuration Management and Data Management software solutions and consulting services for aviation, aerospace & defense, marine, and other high-tech industries. She has over 25 years of experience with CM and Deliverables Management, including both government and commercial environments.
Ms. Fenwick has achieved CMPIC SME, CMPIC CM Assessor, and CMII-C certifications. Her experience includes implementing CM software products, CM-related consulting and training, and participation in the SAE and IEEE standards development groups.
Martijn Dullaart is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IpX) Congress. Martijn has his own blog, mdux.net, and you might have seen him recently during the PLM Roadmap & PDT Fall conference in November – his thoughts about the future of CM can be found on his blog here
Maxime Gravel, Manager Model-Based Engineering at Moog Inc., a worldwide designer, manufacturer, and integrator of advanced motion control products. Max has been the director of the model-based enterprise at the Institute for Process Excellence (IpX) and Head of Configuration and Change Management at Gulfstream Aerospace, which certified the first aircraft in a 3D model-based environment.
What we discussed:
We had an almost one-hour discussion related to the following points:
- The need for Enterprise Configuration Management – why and how
- The needed change from document-driven to model-based – the impact on methodology and tools
- The “neural network” of data – connecting CM to all other business domains, a view similar to the one from the PLM domain
What I kept from our discussion is the importance of planning – as seen in the CMstat image on the left: plan which data you need to manage and how you will manage it. How often do you do this in your company’s projects?
Next, all participants stressed the importance of education and training on this topic: get educated. Configuration Management is not a topic that is taught at schools. Early next year, I will come back to education, as its benefits are often underestimated. Not everything can be learned by “googling.”
Conclusion
The journey towards a model-based and data-driven future is not a quick one to be realized by new technologies. However, it is interesting to learn that the future of connected data (the “neural network”) allows organizations to implement both CM and PLM in a similar manner, using graph databases and automation. When executed at the enterprise level, the result will be that CM and PLM become natural practices instead of separate, siloed, system-related disciplines.
Most of the methodology is there; making the implementation smooth and embedded in organizations is what we still need to learn. Join us in discussing and learning!
When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.
From coordinated to connected sounds like a business change; however, it all depends on technology. And here I have to thank Marc Halpern (Gartner’s Research VP, Engineering and Design Technologies) again, who came up with the brilliant scheme below:
So now it is time to address the last point from my starting post:
Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes.
Configuration management at this moment
PLM and CM are often considered overlapping. My March 2019 post: PLM and Configuration Management – a happy marriage? shares some thoughts related to this point.
Does having PLM or PDM installed mean you have implemented CM? There is confusion because revision management is often considered the same as configuration management. Read my March 2020 post: What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, flexible, graph-database-based PLM solution.
To hear it from the CM side, I discussed it with Martijn Dullaart in my February 2021 post: PLM and Configuration Management. In that post, we also zoomed in on CM2 as a methodology.
Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IpX) Congress.
As mentioned in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference, which starts this upcoming week.
In this post, I want to talk about the future of CM. To understand the current situation, you can find a broad explanation here on Wikipedia. Have a look at CM in the context of the product lifecycle: ensuring that the As-Specified and As-Designed product information matches the As-Built and As-Operated product information.
A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages. For that reason, CM originated in the Aerospace and Defense industry. However, companies in other industries have implemented CM practices too, either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.
Historically, configuration management addressed the needs of “slow-moving” products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring the consistency of all referenced datasets was often a manual process.
On purpose, I wrote “referenced datasets,” as most of the time the information was not connected in a single environment. The identifier of a dataset (an item or a document) was the primary information carrier, used to mentally connect other artifacts and keep them consistent.
The Institute for Process Excellence (IpX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.
As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.
As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.
Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat’s EPOCH CM tool is an example of such software. In addition, on their website, you can find excellent articles explaining the history and their future thoughts related to CM.
The future will undoubtedly be a connected, model-based, software-driven environment. Naturally, therefore, configuration management processes will have to change. (An impressive buzzword sentence; still, I hope you get the message.)
From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.
Configuration Management – the future
The transition to a data-driven and model-based infrastructure has raised the following questions:
- How do we deal with the granularity of data? Each dataset needs to be validated. In the document-driven approach, a document (a collection of datasets) is validated as a whole. How do we do this efficiently at the dataset level?
- The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now we need a mix of hardware and software CM practices.
- Due to the increased complexity of products (or systems) and the rapid changes driven by software versions, how do we guarantee that the As-Operated product still matches the As-Designed / As-Certified definitions?
I don’t have answers to these questions. I only share observations and trends I see in my actual world.
Granularity of data
The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.
The image on the left, borrowed from Erik Herzog’s presentation at the PLM Roadmap & PDT Fall conference in 2020, is a good illustration of the challenge.
At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.
You can find the agenda and Erik’s presentation here on day 2.
OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of the technical infrastructure.
Still, it is about people and processes first. Therefore, I am curious to learn from my readers who believe in and experiment with such a federated infrastructure.
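To make the idea tangible: in a linked-data world, traceability is just typed links between artifact identifiers that live in different tools. Below is a minimal sketch in Python using rdflib; the artifact URIs and the link vocabulary are my own illustrative assumptions, not the actual OSLC vocabularies.

```python
# Minimal sketch of cross-tool traceability as linked data, in the spirit of
# OSLC. The artifact URIs and link types are illustrative assumptions, not an
# authoritative OSLC implementation.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DCTERMS

EX = Namespace("https://example.com/artifacts/")  # hypothetical artifact store
LINK = Namespace("https://example.com/links/")    # hypothetical link vocabulary

g = Graph()

requirement = EX["REQ-042"]   # lives in the requirements tool
design = EX["CAD-1138"]       # lives in the CAD/PDM tool
test_report = EX["TST-007"]   # lives in the test management tool

g.add((requirement, DCTERMS.title, Literal("Max operating temperature 85 C")))
g.add((design, DCTERMS.title, Literal("Cooling assembly, revision B")))
g.add((test_report, DCTERMS.title, Literal("Thermal qualification report")))

# Typed links form the traceability chain across tool boundaries.
g.add((design, LINK.satisfies, requirement))
g.add((test_report, LINK.validates, design))

# Any connected tool can now answer: which artifacts satisfy this requirement?
for artifact in g.subjects(LINK.satisfies, requirement):
    print(artifact, "satisfies", requirement)
```

The point of the sketch: as long as every artifact has a stable URI, the links can live outside the authoring tools, which is exactly the federation idea behind a digital CM backbone.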
More software
Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this view, you treat software code as a part, with a part number and revision, and you might believe configuration management practices do not have to change. However, there are some fundamental differences that explain why we should decouple hardware and software.
First, for the same hardware solution, there might be a whole collection of valid software codes. Just like your computer: how many valid software codes, even from the same application, can you run on its hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.
A computer, of course, is designed for running all kinds of software versions. However, modern products in the field, like cars, machines, electrical devices, all will have a similar type of software-driven flexibility.
For that reason, I believe that companies that deliver software-driven products should design a mechanism to check whether a combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it might be crucial to ensure that invalid combinations cannot exist.
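A minimal sketch of such a validity check is below; the part numbers, firmware names, and the compatibility table are hypothetical, and in practice this data would come from the managed configuration definitions.

```python
# Hypothetical compatibility matrix: for each hardware revision, the set of
# software releases that form a valid (validated/certified) combination.
COMPATIBILITY = {
    "CTRL-100 rev A": {"fw 1.0", "fw 1.1"},
    "CTRL-100 rev B": {"fw 1.1", "fw 2.0"},
}

def is_valid_configuration(hardware: str, software: str) -> bool:
    """Return True only if this hardware/software pair has been validated."""
    return software in COMPATIBILITY.get(hardware, set())

# A product in the field reports its as-operated configuration; we can now
# detect an invalid combination before an upgrade is pushed.
assert is_valid_configuration("CTRL-100 rev B", "fw 2.0")
assert not is_valid_configuration("CTRL-100 rev A", "fw 2.0")
```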
Solutions like Configit or pure::variants might lead to a solution. In February 2021, in PLM and Configuration Lifecycle Management, I discussed the unique features of their solution with Henrik Hulgaard, the CTO of Configit.
I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.
Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.
Increased complexity – the digital twin?
With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.
Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.
There would be no need for on-site inspection or test-and-fix upgrades in the physical world. This is needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on demand), we can validate the configuration virtually.
I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.
An eye-opener and a strong case for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM and working in a model-based environment. See you there?
Conclusion
Finally, in this series, we have reached the methodology part, particularly the part related to configuration management and traceability in a very granular, digital environment.
After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat). What would you ask them?
After “The Doctor is IN,” this is again a written post in the category of PLM and complementary practices/domains. In January, together with Henrik Hulgaard from Configit, I discussed the complementary value of PLM and CLM (Configuration Lifecycle Management). For me, CLM is a synonym for Product Configuration Management.
As expected, readers were asking the question:
“What is the difference between CLM (Configuration Lifecycle Management) and CM (Configuration Management)?”
Good question.
As the complementary role of CM is also a part of the topics to discuss, I am happy to share this blog today with Martijn Dullaart. You probably know Martijn if you are actively following topics on PLM and CM.
Martijn has his own blog, mdux.net, and you might have seen him recently in Jenifer Moore’s PLM TV episode: Why CM2 for Faster Change and Better Documentation. Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IpX) Congress. Let us start.
Configuration Management and CM2
Martijn, first of all, can you bring some clarity to the terminology? When discussing Configuration Management, what is the pure definition, what is CM2 as a practice, and what is IpX‘s role? And please explain where you fit in this picture.
Classical CM focuses mainly on the product, the product definition, and actual configurations of the product, like as-built and as-maintained. CM2 extends the focus to the entire enterprise, e.g., the processes and procedures (ways of working) of a company, including the IT and facilities, to support the company’s value stream.
CM2 expands the scope to all information that could impact safety, security, quality, schedule, cost, profit, the environment, corporate reputation, or brand recognition.
Basically, CM2 shifts the focus to Integrated Process Excellence and promotes continual improvement.
Next to this, CM2 provides the WHAT and the HOW, something most standards lack. My main focus is still on the product and on promoting the use of CM outside the product domain.
For all CM-related documentation, we are already doing this.
Configuration Management and PLM
People claim that if you implement PLM as an enterprise backbone, not as an engineering tool, you can do Configuration Management with your PLM environment.
What is your opinion?
Yes, I think this is possible, provided that the PLM tool has the right capabilities. Though the question should be: is this the best way to go about it? For instance, some parts of Configuration Management are more transactional in nature, e.g., registering the parts you build into or out of a product.
Other parts of CM are more iterative in nature, e.g., doing impact analysis and making an implementation plan. I am not saying this cannot be done in a PLM tool as an enterprise backbone. Still, the nature of most PLM tools is to support iterative work rather than transactional work.
I think you need some kind of enterprise backbone that manages the configuration as an As-Planned/As-Released baseline. A baseline that shows not only the released information but also all planned changes to the configuration.
Because the source of information in such a baseline comes from different tools, you need an overarching tool to connect everything. For most companies, given the current state of their enterprise applications, this means they require an overarching system.
Preferably, I would like to use the data directly from the sources. Still, connectivity and performance are not yet at a level where we can do this. Cloud and modern application and database architectures are very promising in this respect.
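To illustrate Martijn’s point, here is a minimal sketch of what an As-Planned/As-Released baseline entry could look like: one view that shows the released state of each item plus all planned changes. All identifiers and fields are my hypothetical assumptions.

```python
# Sketch of an As-Planned/As-Released baseline: each entry carries the
# released revision plus the planned changes. Identifiers are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PlannedChange:
    change_id: str          # e.g., an ECO/ECR number
    description: str
    target_revision: str

@dataclass
class BaselineEntry:
    item_id: str
    released_revision: str
    planned_changes: list[PlannedChange] = field(default_factory=list)

baseline = [
    BaselineEntry(
        item_id="PUMP-330",
        released_revision="C",
        planned_changes=[PlannedChange("ECO-1204", "New seal material", "D")],
    ),
    BaselineEntry(item_id="VALVE-117", released_revision="A"),
]

# One view answers both questions: what is released, and what is coming?
for entry in baseline:
    pending = ", ".join(c.change_id for c in entry.planned_changes) or "none"
    print(f"{entry.item_id}: released rev {entry.released_revision}, "
          f"pending changes: {pending}")
```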
Configuration Management for Everybody?
I can imagine companies in the Aerospace industry need to have proper configuration management for safety reasons. Also, I can imagine that proper configuration management can be relevant for other industries. Do they need to be regulated, or are there other reasons for a company to start implementing CM processes?
I will focus the first part of my answer within the context of CM for products only.
Basically, all products are regulated to some degree. Aerospace & Defense and Medical Device and Pharma are highly regulated for obvious reasons. Other industries are also regulated, for example, through environmental regulations like REACH, RoHS, WEEE or safety-related regulations like the CE marking or FCC marking.
Customers can also be an essential driver for CM. If, as a customer, you buy expensive equipment, you expect the supplier of that equipment to deliver as committed and to maintain and upgrade the equipment efficiently, with as few disruptions to your operations as possible.
Not just customers but also consumers are critical regarding the traceability of a product and all its components.
Even when you are sitting in a rollercoaster, you assume the product is well designed and maintained. In other words, there is often a case to be made for applying proper configuration management in any company; the extent to which you need to implement it may vary based on your needs.
The need for Enterprise Configuration Management is even more significant because one of the hardest things is to change the way an organization works and operates.
Often there are different ways of doing the same thing. There is a lot of tribal knowledge, and ways of working are not documented in a way that people can easily find them, let alone structured and linked so that you can do an impact analysis when you want to introduce a change in your organization.
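This is exactly where structured, linked information pays off: once ways of working are captured as a graph, impact analysis becomes a simple traversal. A minimal sketch of this idea follows; the node names are hypothetical, and networkx is just one possible graph library.

```python
# Sketch of impact analysis over linked ways-of-working information.
# Node names are hypothetical; networkx is one possible graph library.
import networkx as nx

g = nx.DiGraph()
# An edge A -> B means "B depends on A", so a change to A may impact B.
g.add_edge("Work instruction WI-12", "Assembly procedure AP-3")
g.add_edge("Assembly procedure AP-3", "Operator training module T-9")
g.add_edge("Work instruction WI-12", "Quality checklist QC-4")

# Which documents are potentially impacted if WI-12 changes?
impacted = nx.descendants(g, "Work instruction WI-12")
print(sorted(impacted))
# ['Assembly procedure AP-3', 'Operator training module T-9',
#  'Quality checklist QC-4']
```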
CM and Digital Transformation
One of the topics that we both try to understand better is how CM will evolve in the future when moving to a more model-based approach. In CM terminology, we still talk about documents as the information objects to be managed. What is your idea of CM in a model-based future?
It is indeed a topic where new or changed methodology is probably required, and I have already started describing CM topics in several posts on my MDUX blog. Some of the relevant posts in this context are:
- HELP!!! Parts, Documents, Data & Revisions
- Where does the deliverable begin, and where does it end?
- A Glimpse into the Future of CM: Part 1, Part 2, and Part 3
First, let me say that model-based is the future, although, at the same time, the CM aspects are often overlooked.
When managing changes, too much detail makes estimating the cost and effort for a business case more challenging, and planning information that is too granular is not desirable. Therefore, CM2 looks at datasets. Datasets should be as small as possible but not smaller. Datasets are sets of information that need to be released as a whole, yet they can be released independently from other datasets. Take a bill of materials: a single BOM line item is not a dataset, but the complete set of BOM line items that make up the BOM of an assembly is considered a dataset. I can release a BOM independently from a test plan.
Data models need to facilitate this. However, today, in many PLM systems, the BOM and the metadata of a part share the same revision. This means that to change the metadata, I need a new revision of the BOM, while the BOM itself might not change. Some changes to metadata might not be relevant to a supplier, and communicating those changes to your supplier could create confusion.
I know some people think this is about document- vs. model-centric, but it is not. A part is identified in the ‘physical world’ by its part ID. Even if you allow revisions in the supply chain by including the revision in the part ID, you create a new identifier, and every new revision will end up in a different stock location. Is that what we want?
In any case, we are still in the early days, and the thinking about this topic has just begun and needs to take shape in the coming year(s).
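To sketch Martijn’s dataset idea in code (my illustration, with hypothetical identifiers): the part keeps one stable ID, while its metadata and its BOM are separate datasets, each revised and released independently.

```python
# Sketch of the dataset idea: a part's metadata and its BOM are separate
# datasets, each with its own revision, released independently.
# Identifiers and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Dataset:
    part_id: str      # the stable identifier in the "physical world"
    kind: str         # "metadata", "bom", "test_plan", ...
    revision: str
    content: dict

metadata = Dataset("PART-889", "metadata", "B", {"weight_kg": 1.2})
bom = Dataset("PART-889", "bom", "A",
              {"lines": [("SCREW-4", 8), ("PLATE-2", 1)]})

# Correcting the weight revises only the metadata dataset; the BOM dataset
# (and anything a supplier received) is untouched, so no confusing BOM bump.
metadata.content["weight_kg"] = 1.25
metadata.revision = "C"
assert bom.revision == "A"
print(f"{metadata.part_id}: metadata rev {metadata.revision}, "
      f"BOM rev {bom.revision}")
```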
CM and/or CLM?
As in my shared blog post with Henrik Hulgaard related to CLM, can you make a clear differentiation between the two domains for the readers?
Configuration Lifecycle Management (CLM) is mainly positioned towards Configurable Products and the configurable level of the product.
Why do I think this? Even though Configit’s CLM declaration states that “Configuration Lifecycle Management (CLM) is the management of all product configuration definitions and configurations across all involved business processes applied throughout the lifecycle of a product,” it also states:
- “CLM differs from other Enterprise Business Disciplines because it focuses on cross-functional use of configurable products.”
- “Provides a Single Source of Truth for Configurable Data“
- “handles the ever-increasing complexity of Configurable Products“.
I find that Configuration Lifecycle Management is a core Configuration Management practice you need to have in place for configurable products. The dependencies you need to manage are enormously complex: software parameters that depend on specific hardware, hardware-to-hardware dependencies, commercial variants, and options.
Want to learn more?
In this post, we have just scratched the surface of PLM and Configuration Management. Where can an interested reader find more information related to CM for their company?
To get trained in CM2, people can reach out to the Institute for Process Excellence, a company that focuses on consultancy and methodology for many aspects of a modern, digital enterprise, including Configuration Management.
And there is more out there, e.g.:
- Engineering Documentation Control Handbook: Configuration Management, Second Edition, by Frank B. Watts
- Decision tree to assess interchangeability, by Jörg Eisenträger (a good starting point)
- Configuration Management: Theory and Application for Engineers, Managers, and Practitioners, by Jon Quigley and Kim Robertson
- CM Insights Blog by CMstat
- Configuration Management Standard EIA649C by SAE
- MIL-HDBK-61A by product-lifecycle-management.com
Conclusion
Thanks, Martijn, for your clear explanations. People working seriously in the PLM domain, managing the full product lifecycle, should also learn about and consider Configuration Management best practices. I look forward to a future discussion on how to perform Configuration Management in a model-based environment.