This year started for me with a discussion related to federated PLM, a topic I highlighted as one of the imminent trends of 2022 and one that is relevant for PLM consultants and implementers. If you are working in a company struggling with PLM, this topic might be hard to introduce in your company.
Before going into the discussion’s topics and arguments, let’s first describe the historical context.
The traditional PLM frame.
Historically, PLM was first framed as a system for engineering to manage its product data – you could call it PDM first. After that, PLM systems were introduced to provide access to product data, upstream and downstream. The most common usage was the relation with manufacturing, leading to EBOM and MBOM discussions.
IT landscape simplification often led to an infrastructure of siloed solutions – PLM, ERP, CRM and later, MES. IT was driving the standardization of systems and defining interfaces between systems. System capabilities were leading, not the flow of information.
As many companies are still in this stage, I would call it PLM 1.0.
PLM 1.0 systems serve mainly as a System of Record for the organization, where disciplines consolidate their data in a given context, the Bills of Information. The Bill of Information is then the place to connect specification documents, e.g., CAD models, drawings and other documents, providing a Digital Thread.
The actual engineering work is done with specialized tools, MCAD/ECAD, CAE, Simulation, Planning tools and more. Therefore, each person could work in their discipline-specific environment and synchronize their data to the PLM system in a coordinated manner.
However, this interaction is not easy for some of the end-users. For example, the usability of CAD integrations with the PLM system is constantly debated.
Many of my implementation discussions with customers were in this context. For example, suppose your products are relatively simple, or your company is relatively small. In that case, the opinion is that the System of Record approach is overkill.
That’s why many small and medium enterprises do not see the value of a PLM backbone.
This could be true until recently. However, the threats to this approach are digitization and regulations.
Customers, partners, and regulators all expect faster and more accurate responses to specific issues, preferably instantly. In addition, sustainability regulations might push your company to implement a System of Record.
PLM as a business strategy
For the past fifteen years, we have discussed PLM more as a business strategy implemented with business systems and an infrastructure designed for sharing. I chose these words carefully to avoid overloading the expression: PLM as a business strategy.
The reason for this prudence is that, in reality, I have seen many PLM implementations fail due to the ambiguity of PLM as a system or strategy. Many enterprises have previously selected a preferred PLM Vendor solution as a starting point for their “PLM strategy”.

One of the most neglected best practices.
In reality, this means there was no strategy but a hope that, with this impressive set of product demos, the company would find a way to support its business needs. Instead of people, process and then tools to implement the strategy, most of the time it started with the tools, trying to implement the processes and transform the people. That is not really the definition of business transformation.
In my opinion, this is happening because, at the management level, decisions are made based on financials.
Developing a PLM-related business strategy requires management understanding and involvement at all levels of the organization.
This is often not the case; the middle management has to solve the connection between the strategy and the execution. By design, however, the middle management will not restructure the organization. By design, they will collect the inputs from the end users.
And it is clear what end users want – no disruption in their comfortable way of working.
Halfway conclusion:
Rebranding PLM as a business strategy has not really changed the way companies work. PLM systems remain a System of Record mainly for governance and traceability.
To understand the situation in your company, look at who is responsible for PLM.
- If IT is responsible, then most likely, PLM is not considered a business strategy but more an infrastructure.
- If engineering is responsible for PLM, then you are still in the early days of PLM, with engineering tools to be consulted by others upstream or downstream.
Only when PLM accountability sits at the upper management level might it be a business strategy (assuming the upper management understands the details).
Connected is the game changer
Connecting all stakeholders in an engagement has been a game changer in the world. With the introduction of platforms and the smartphone as a connected device, consumers could suddenly benefit from direct responses to desired service requests (Spotify, iTunes, Uber, Amazon, Airbnb, Booking, Netflix, …).
The business change: connecting all stakeholders in real time to deliver rapid results.
The question was: what would be the game changer in PLM? The image below describes Accenture's 2014 view of digital PLM and its potential benefits.
Is connected PLM a utopia?
Marc Halpern from Gartner shared in 2015 the slide below, which you might have seen many times before. It seems Digital Transformation is really a move from coordinated to connected technology.
The image below gives an impression of an evolution.
I had been following this concept until I was triggered by a 2017 McKinsey publication: “Toward an integrated technology operating model“.
This was the first notion for me that the future should be hybrid: traditional PLM (the System of Record) complemented with teams that work digitally connected. McKinsey called them pods, which become product-centric (a multidisciplinary team focusing on a product) instead of discipline-centric (marketing/engineering/manufacturing/service).
In 2019 I wrote the post: The PLM migration dilemma supporting the “shocking” conclusion “Don’t think about migration when moving to data-driven, connected ways of working. You need both environments.”
One of the main arguments behind this conclusion was that legacy product data and processes were not designed to ensure data accuracy and quality on such a level that it could become connected data. As a result, converting documents into reliable datasets would be a costly, impossible exercise with no real ROI.
The second argument was that the outside world, customers, regulatory bodies and other non-connected stakeholders still need documents as standardized deliverables.
The conclusion led to the image below.

Systems of Record (left) and Systems of Engagement (right)
Splitting PLM?
In 2021 these thoughts became more mature through various publications and players in the PLM domain.
We saw the emergence of Systems of Engagement – I discussed OpenBOM, Colab and potentially Configit in the post: A new PLM paradigm. These systems can be characterized as connected solutions across the enterprise and value chain, focusing on a platform experience for the stakeholders.
These are all environments addressing the needs of a specific group of users as efficiently and as friendly as possible.
A System of Engagement will not fit naturally in a traditional PLM backbone, the System of Record.
Erik Herzog of SAAB Aerospace and Yousef Houshmand, at that time with Daimler, published papers that year related to “Federated PLM” or “The end of monolithic PLM”. They acknowledged that a company needs to focus on more than a single PLM solution. The presentation from Erik Herzog at the PLM Roadmap/PDT conference was interesting because Erik talked about the Systems of Engagement and the Systems of Record. He proposed using OSLC as the standard to connect these two types of PLM.
It was a clear example of an attempt to combine the two kinds of PLM.
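To make the OSLC idea a bit more tangible, here is a minimal sketch – with hypothetical URLs and property names, not the formal OSLC vocabulary – of how an item in a System of Record could simply link to a model element living in a System of Engagement instead of duplicating its data:

```python
# Illustrative sketch only: OSLC connects resources across tools through
# stable HTTP URIs, so a System of Record item can point to a System of
# Engagement resource rather than copy its content. URLs are hypothetical.

record_item = {
    "uri": "https://plm.example.com/records/part/P-100045",   # System of Record
    "title": "Valve housing",
    "links": {
        # traceability link to a model element in a System of Engagement
        "elaboratedBy": "https://mbse.example.com/models/valve/element/4711",
    },
}

def follow_link(item: dict, relation: str) -> str:
    """Resolve a cross-platform link by relation name (sketch only)."""
    return item["links"][relation]

print(follow_link(record_item, "elaboratedBy"))
```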
And here comes my question: Do we need to split PLM?
When I look at PLM implementations in the field, almost all are implemented as a System of Record, an information backbone provided by a single PLM vendor. The various disciplines deliver their content through interfaces to the backbone (the coordinated approach).
However, there is low usability or support for multidisciplinary collaboration; the PLM backbone is not designed for that.
Due to concepts of Model-Based Systems Engineering (MBSE) and Model-Based Definition (MBD), there are now solutions on the market that allow different disciplines to work jointly on connected datasets that can be manipulated using modeling software (1D, 2D, 3D, 4D, …).
These environments, often a mix of software and hardware tools, are the Systems of Engagement and provide speedy results with high quality in the virtual world. Digital Twins run on Systems of Engagement, not on Systems of Record.
Systems of Engagement do not need to come from the same vendor, as they serve different purposes. But how do you explain this to your management, who want simplicity? I can imagine the IT organization has a better understanding of this concept, as Gartner introduced the concept of the bimodal approach at the end of 2015.
Their definition:
Mode 1 is optimized for areas that are more well-understood. It focuses on exploiting what is known. This includes renovating the legacy environment so it is fit for a digital world. Mode 2 is exploratory, potentially experimenting to solve new problems. Mode 2 is optimized for areas of uncertainty. Mode 2 often works on initiatives that begin with a hypothesis that is tested and adapted during a process involving short iterations.
No Conclusion – but a question this time:
At the management level, unfortunately, there is most of the time still a “Single PLM” mindset due to a lack of understanding of the business. Splitting your PLM clearly seems the way forward. IT could be ready for this, but will the business realize this opportunity?
What are your thoughts?
In my previous post, “My PLM Bookshelf,” on LinkedIn, I shared some of the books that influenced my thinking related to PLM. As you can see in the LinkedIn comments, other people added their recommendations for PLM-related books to get inspired or more knowledgeable.
Where reading a book is a personal activity, I now want to share with you how to get educated on PLM in a more interactive manner. In this post, I talk with Peter Bilello, President & CEO of CIMdata. If you haven’t heard about CIMdata and you are active in PLM, there is more to learn on their website HERE. Now let us focus on Education.
CIMdata
Peter, knowing CIMdata from its research relevant to the whole PLM community, I am curious to learn what kind of training CIMdata typically provides to its customers.
Jos, throughout much of CIMdata’s existence, we have delivered educational content to the global PLM industry. With a core business tenet of knowledge transfer, we began offering a rich set of PLM-related tutorials at our North American and pan-European conferences starting in the early 1990s.
Since then, we have expanded our offering to include a comprehensive set of assessment-based certificate programs in a broader PLM sense, for example, systems engineering and digital transformation-related topics. In total, we offer more than 30 half-day classes, all of which can be delivered in person as a custom configuration for a specific client, or through public virtual-live or in-person classes. We have certified more than 1,000 PLM professionals since the introduction of this PLM Leadership offering in 2009.
Based on our experience, we recommend that an organization’s professional education strategy and plans address the organization’s specific processes and enabling technologies. This will help ensure that it drives the appropriate and consistent operations of its processes and technologies.
For that purpose, we expanded our consulting offering to include a comprehensive and strategic digital skills transformation framework. This framework provides an organization with a roadmap that can define the skills an organization’s employees need to possess to ensure a successful digital transformation.
In turn, this framework can be used as an efficient tool for the organization’s HR department to define its training and job progression programs that align with its overall transformation.
The success of training
We are both promoting the importance of education to our customers. Can you share with us an example where Education really made a difference? Can we talk about ROI in the context of training?
Jos, I fully agree. Over the years, we have learned that education and training are often minimized (i.e., sub-optimized). This is unfortunate and has usually led to failed or partially successful implementations.
In our view, both education and training are needed, along with strong organizational change management (OCM) and a quality assurance program during and after the implementation.
In our terms, education deals with the “WHY” and training with the “HOW”. Why do we need to change? Why do we need to do things differently? And then “HOW” to use new tools within the new processes.
We have seen far too many failed implementations where sub-optimized decisions were made due to a lack of understanding (i.e., a clear lack of education). We have also witnessed training and education being done too early or too late.
This leads to a reduced Return on Investment (ROI).
Therefore a well-defined skills transformation framework is critical for any company that wants to grow and thrive in the digital world. Finally, a skills transformation framework needs to be tied directly to an organization’s digital implementation roadmap and structure, state of the process, and technology maturity to maximize success.
Training for every size of the company?
When CIMdata conducts PLM training, is there a difference, for example, when working with a big global enterprise or a small and medium enterprise?
You might think the complexity might be similar; however, the amount of internal knowledge might differ. So how are you dealing with that?
We basically find that the amount of training/education required mostly depends on the implementation scope, meaning the scope of the proposed digital transformation and the current maturity level of the impacted user community.
It is important to measure the current maturity and establish appropriate metrics to measure the success of the training (e.g., are people, once trained, using the tools correctly).
CIMdata has created a three-part PLM maturity model that allows an organization to understand its current PLM-related organizational, process, and technology maturity.
The PLM maturity model provides an important baseline for identifying and/or developing the appropriate courses for execution.
This also allows us, when we are supporting the definition of a digital skills transformation framework, to understand how the level of internal knowledge might differ within and between departments, sites, and disciplines. All of which help define an organization-specific action plan, no matter its size.
Where is CIMdata training different?
Most of the time, PLM implementers also offer training to their prospects or customers. So, where is CIMdata training different?
For this, it is important to differentiate between education and training. So, CIMdata provides education (the why) and training and education strategy development and planning.
We don’t provide training on how to use a specific software tool. We believe that is best left to the systems integrator or software provider.
While some implementation partners can develop training plans and educational strategies, they often fall short in helping an organization to effectively transform its user community. Here we believe training specialists are better suited.
Digital Transformation and PLM
One of my favorite topics is the impact of digitization in the area of product development. CIMdata introduced the Product Innovation Platform concept to differentiate from traditional PDM/PLM. Who needs to get educated to understand such a transformation, and what does CIMdata contribute to this understanding?
We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.
It isn’t just about product development, sales, or enabling a new service experience. It is about maximizing a company’s ROI in applying and leveraging digital as needed throughout the organization. The only way an organization can do this successfully is by taking an end-to-end approach.
The Product Innovation Platform is focused on end-to-end product lifecycle management. Therefore, it must work within the context of other enterprise processes that are focused on the business’s resources (i.e., people, facilities, and finances) and on its transactions (e.g., purchasing, paying, and hiring).
As a result, an organization must understand the interdependencies among these domains. If they don’t, they will ultimately sub-optimize their investment. It is these and other important topics that CIMdata describes and communicates in its education offering.
More than Education?
As a former teacher, I know that a one-time education, a good book or slide deck, is not enough to get educated. How does CIMdata provide a learning path or coaching path to their customers?
Jos, I fully agree. Sustainability of a change and/or improved way of working (i.e., long-term sustainability) is key to true and maximized ROI. Here I am referring to the sustainability of the transformation, which can take years.
With this, organizational change management (OCM) is required. OCM must be an integral part of a digital transformation program and be embedded into a program’s strategy, execution, and long-term usage. That means training, education, communication, and reward systems all have to be managed and executed on an ongoing basis.
For example, OCM must be executed alongside an organization’s digital skills transformation program. Our OCM services focus on strategic planning and execution support. We have found that most companies understand the importance of OCM but often don’t fully follow through on it.
A model-based future?
During the CIMdata Roadmap & PDT conferences, we have often discussed the importance of Model-Based Systems Engineering methodology as a foundation of a model-based enterprise. What do you see? Is it only the big Aerospace and Defense companies that can afford this learning journey, or should other industries also invest? And if yes, how should they start?
Jos, here I need to step back for a minute. All companies have to deal with increasing complexity for their organization, supply chain, products, and more.
So, to optimize its business, an organization must understand and employ systems thinking and system optimization concepts. Unfortunately, most people think of MBSE as an engineering discipline. This is unfortunate because engineering is only one of the systems of systems that an organization needs to optimize across its end-to-end value streams.
The reality is that all companies can benefit from MBSE, as long as they consider optimization across their specific disciplines, in the context of their products and services and where they exist within their value chain.
MBSE is not just for Aerospace and Defense companies. Still, a lot can be learned from what has already been done. Also, leading automotive companies are implementing and using MBSE to design and optimize semi- and highly automated vehicles (i.e., systems of systems).
The starting point is understanding your systems of systems environment and where bottlenecks exist.
There should be no doubt that education is needed on MBSE and how MBSE supports the organization’s Model-Based Enterprise requirements.
Published work from the CIMdata-administered A&D PLM Action Group can be helpful, as can various MBE and systems engineering maturity models, such as the one CIMdata utilizes in its consulting work.
Want to learn more?
Thanks, Peter, for sharing your insights. Are there any specific links you want to provide to get educated on the topics discussed? Perhaps some books to read or conferences to visit?
Jos, as you already mentioned:
- the CIMdata Roadmap & PDT conferences have provided a wealth of insight into this market for more than 25 years.
[Jos: Search for my blog posts starting with the text: “The weekend after ….”]
- In addition, there are several blogs, like yours, that are worth following, and websites, like CIMdata’s pages for education and other resources, which are filled with downloadable reading material.
- Additionally, there are many user conferences from PLM solution providers and third-party conferences, such as those hosted by the MarketKey organization in the UK.
These conferences have taken place in Europe and North America for several years. Information exchange and formal training and education are offered at many events. Additionally, they provide an excellent opportunity for networking and professional collaboration.
What I learned
Talking with Peter made me again aware of a few things. First, it is important to differentiate between education and training. Where education is a continuous process, training is an activity that must take place at the right time. Unfortunately, we often mix those two terms and believe that people are educated after having followed a training.
Secondly, investing in education is as crucial as investing in hard- or software. As Peter mentioned:
We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.
System Thinking is not just an engineering term; it will be a mandate for managing a company, a product and even a planet into the future.
Conclusion
This time a quote from Albert Einstein, supporting my PLM coaching intentions:
“Education is not the learning of facts
but the training of the mind to think.”
This week I attended the SCAF conference in Jönköping. SCAF is an abbreviation of the Swedish CATIA User Group. First of all, I was happy to be there as it was a “physical” conference, with the opportunity to discuss topics with the attendees outside the presentation time slots.
It is crucial for me as I have no technical message. Instead, I am trying to make sense of the future through dialogues. What is sure is that the future will be based on new digital concepts, completely different from the traditional approach that we currently practice.
My presentation, which you can find here on SlideShare, was again zooming in on the difference between a coordinated approach (current) and a connected approach (the future).
The presentation explains the concepts of datasets, which I discussed in my previous blog post. Now, I focussed on how this concept can be discovered in the Dassault Systemes 3DExperience platform, combined with the must-go path for all companies to more systems thinking and sustainable products.
It was interesting to learn that the concept of connected datasets like the spider’s web in the image reflected the future concept for many of the attendees.
One of the demos during the conference illustrated that it is no longer about managing the product lifecycle through structures (EBOM/MBOM/SBOM).
Instead, it is based on a collection of connected datasets – the path in the spider’s web.
It was interesting to talk with the present companies about their roadmap. How to become a digital enterprise is strongly influenced by their legacy culture and ways of working. Where to start to be connected is the main challenge for all.
A final positive remark: SCAF had renamed itself to SCAF (3DX), showing that even CATIA practices can no longer be considered a niche – the future of business is to be connected.
Now back to the thread that I am following on the series The road to model-based. Perhaps I should change the title to “The road to connected datasets, using models”. The statement for this week to discuss is:
Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
Reliable data
If you have been following my story related to the PLM transition from a coordinated to a connected infrastructure, you might have seen the image below:
The challenge of a connected enterprise is that you want to connect different datasets, defined in various platforms, to support any type of context. We called this a digital thread or, perhaps better framed, a digital web.
This is new for most organizations because each discipline has been working most of the time in its own silo. They are producing readable information in neutral files – pdf drawings/documents. In cases where a discipline needs to deliver datasets, like in a PDM-ERP integration, we see IT-energy levels rising as integrations are an IT thing, right?
Too much focus on IT
In particular, SAP has always played the IT card (and is still playing it through their Siemens partnership). Historically, SAP claimed that all parts/items should be in their system. Thus, there was no need for a PDM interface, neglecting that the interface moment was now shifted to the designer in CAD. And by using the name Material for what is considered a Part in the engineering world, they illustrated their lack of understanding of the actual engineering world.
There is more to “blame” SAP for when it comes to the PLM domain – or you can state that PLM vendors did not yet understand what enterprise data means. Historically, ERP systems were the first enterprise systems introduced in a company; they have been leading in a transactional “digital” world. The world of product development has never been a transactional process.
SAP introduced the Master Data Management for their customers to manage data in heterogeneous environments. As you can imagine, the focus of SAP MDM was more on the transactional side of the product (also PIM) than on the engineering characteristics of a product.
I have no problem that each vendor wants to see their solution as the center of the world. This is expected behavior. However, when it comes to a single system approach, there is a considerable danger of vendor lock-in, a lack of freedom to optimize your business.
In a modern digital enterprise (to be), the business processes and value streams should be driving the requirements for which systems to use. I was tempted to write “not the IT capabilities”; however, that would be a mistake. We need systems or platforms that are open and able to connect to other systems or platforms. The technology should be there, and more and more, we realize the future is based on connectivity between cloud solutions.
In one of my first posts (part 2), I referred to five potential platforms for a connected enterprise. Each platform will have its own data model based on its legacy design, allowing it to service its core users in an optimized environment.
When it comes to interactions between two or more platforms, for example, between PLM and ERP, between PLM and IoT, but also between IoT and ERP or IoT and CRM, these interactions should first be based on identified business processes and value streams.
The need for Master Data Management
Defining horizontal business processes and value streams independent of the existing IT systems is the biggest challenge in many enterprises. Historically, we have been thinking around a coordinated way of working, meaning people shifting pieces of information between systems – either as files or through interfaces.
In the digital enterprise, the flow should be leading based on the stakeholders involved. Once people agree on the ideal flow, the implementation process can start.
Which systems are involved, and where do we need a connection between the two systems? Is the relationship bidirectional, or is it a push?
The interfaces need to be data-driven in a digital enterprise; we do not want human interference here, slowing down or modifying the flow. This is the moment Master Data Management and Data Governance come in.
When exchanging data, we need to trust the data in its context, and we should be able to use the data in another context. But, unfortunately, trust is hard to gain.
I can share an example of trust from implementing a PDM system linked to a Microsoft-friendly ERP system. Both systems were able to use Excel as an interface medium – the Excel columns took care of the data mapping between these two systems.
In the first year, engineers produced the Excel with BOM information and manufacturing engineering imported the Excel into their ERP system. After a year, the manufacturing engineers proposed to automatically upload the Excel as they discovered the exchange process did not need their attention anymore – they learned to trust the data.
How often have you seen similar cases in your company where we insist on a readable exchange format?
When you trust the process(es), you can trust the data. In a digital enterprise, you must assume that specific datasets are used or consumed in different systems. Therefore a single data mapping as in the Excel example won’t be sufficient.
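To illustrate the Excel example above, here is a minimal sketch of such a fixed column mapping – the file name, column names and ERP field names are assumptions for illustration only:

```python
import pandas as pd

# BOM as exported from the PDM system; in the real case this came from Excel,
# e.g. bom = pd.read_excel("ebom_export.xlsx")  (hypothetical file name)
bom = pd.DataFrame([
    {"Part Number": "P-100045", "Revision": "B", "Description": "Valve housing",
     "Quantity": 1, "UoM": "PCE"},
    {"Part Number": "P-100046", "Revision": "A", "Description": "Gasket",
     "Quantity": 2, "UoM": "PCE"},
])

# Fixed, agreed mapping: PDM column -> ERP field (illustrative names)
column_map = {
    "Part Number": "MaterialNumber",
    "Revision":    "Version",
    "Description": "MaterialDescription",
    "Quantity":    "Qty",
    "UoM":         "BaseUnit",
}

erp_load = bom.rename(columns=column_map)[list(column_map.values())]

# Once both sides trust the mapping, the manual import step can be automated
erp_load.to_csv("erp_bom_upload.csv", index=False)
print(erp_load)
```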
Master Data Management and standards?
Some traditional standards, like the ISO 15926 or ISO 10303, have been designed to exchange process and engineering data – they are domain-specific. Therefore, they could simplify your master data management approach if your digitalization efforts are in that domain.
To connect other types of data, it is hard to find a global standard that also encompasses different kinds of data or consumers. Think about the GS1 standard, which has more of a focus on the consumer-side of data management. When PLM meets PIM, this standard and Master Data Management will be relevant.
Therefore I want to point to these two articles in this context:
How enterprise architects need to evolve to survive in a digital world focusing on the transition of a coordinated enterprise towards a connected enterprise from the IT point of view. And a recent LinkedIn post, Web Ontology Language as a common standard language for Engineering Networks? by Matthias Ahrens exploring the concepts I have been discussing in this post.
To me, it seems that standards are helpful when working in a coordinated environment. However, in a connected environment, we have to rely on master data management and data governance processes, potentially based on a clever IT infrastructure using graph databases to be able to connect anything meaningful and possibly artificial intelligence to provide quality monitoring.
Conclusion
Standards have great value in exchange processes, which happen in a coordinated business environment. To benefit from a connected business environment, we need an open and flexible IT infrastructure supported by algorithms (AI) to guarantee quality. Before installing the IT infrastructure, we should first have defined the value streams it should support.
What are your experiences with this transition?
In my last post in this series, The road to model-based and connected PLM, I mentioned that perhaps it is time to talk about SLM instead of PLM when discussing popular TLAs for our domain of expertise. There have not been so many encouraging statements for SLM so far.
SLM could mean, for me, Solution Lifecycle Management, considering that a company’s offering is more and more a mix of products and services. Or SLM could mean System Lifecycle Management, in that case pushing the idea that more and more products interact with the outside world and therefore could be considered systems. Products are (almost) dead.
In addition, I mentioned that the typical product lifecycle and related configuration management concepts need to change, as in the SLM domain there is hardware and software with different lifecycles and change processes.
It is a topic I want to explore further. I am curious to learn more from Martijn Dullaart, who will be lecturing at the PLM Road map and PDT 2021 fall conference in November. I hope my expectations are not too high, knowing it is a topic of interest for Martijn. Feel free to join this discussion.
In this post, it is time to follow up on my third statement related to what data-driven implies:
Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
On this topic, I have a list of points to consider; let’s go through them.
The dataset
In this post, I will often use the term dataset (you are also allowed to write data set, I understood).
A dataset means a predefined number of attributes and values that belong logically to each other. Datasets should be defined based on the purpose and, if possible, designated for a single goal. In this way, they can be stored in a database.
Combined with other datasets, a combination can result in relevant business information. Note a dataset is not only transactional data; a dataset could also describe geometry.
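As a small illustration – the names are mine, not a proposed standard – a dataset could be a small, purpose-defined group of attributes and values that logically belong together:

```python
# Minimal sketch of two purpose-defined datasets for the same part.
# Attribute names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class PartIdentification:
    """Core identification dataset of a part."""
    part_id: str
    name: str
    part_type: str
    status: str

@dataclass
class PartSpecification:
    """Engineering specification dataset - one purpose, one dataset."""
    part_id: str          # link back to the identification dataset
    mass_kg: float
    material: str

core = PartIdentification("P-100045", "Valve housing", "Valve", "Released")
spec = PartSpecification(core.part_id, mass_kg=1.2, material="Stainless steel 316")
print(core, spec, sep="\n")
```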
Identify the dataset
In the document-based world, a lot of information could be stored in a single file. In a data-driven world, we should define a dataset that contains a specific piece of information, logically belonging together. If we are more precise, a part would have various related datasets that make up the definition of a part (a small sketch follows below). These definitions could be:
- Core identification attributes like ID, Name, Type and Status
- The Type could define a set of linked information. For example, a valve would have different characteristics than a resistor. Through classification, we can link datasets to the core definition of a part.
- The part can have engineering-specific data (CAD and metadata), manufacturing-specific data, supplier-specific data, and service-specific data. Each of these datasets needs to be defined as a unique element in a data-driven environment.
- CAD is a particular case, as most current CAD systems don’t treat geometry as a single dataset. In a file-based world, many other datasets are stored in the file (e.g., engineering or manufacturing details). In a data-driven environment, we want the CAD definition to be treated as a dataset. Dassault Systèmes with their CATIA V6 and 3DEXPERIENCE platform or PTC with OnShape are examples of this approach. Having CAD as separate datasets makes sharing and collaboration so much easier, as we can see from these solutions. The concept of CAD stored in a database is not new, and this approach has been used in various disciplines. Mechanical CAD was always a challenge.
Thanks to Moore’s Law (approximately every 2 years, processor power doubles – click on the image for the details) and higher network connection speeds, it starts to make sense to have mechanical CAD also stored in a database instead of a file.
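Below is a small, hypothetical sketch of the list above: the part Type drives which characteristic datasets apply (classification), and each discipline-specific dataset hangs off the same part ID instead of living in one big file.

```python
# Illustrative only - classification and dataset names are assumptions.

# Classification: part type -> characteristic datasets expected for that type
classification = {
    "Valve":    ["flow_characteristics", "pressure_rating"],
    "Resistor": ["resistance", "tolerance", "power_rating"],
}

# Discipline-specific datasets, each stored and governed on its own
datasets = {
    ("P-100045", "engineering"):   {"cad_model": "valve_housing.prt", "mass_kg": 1.2},
    ("P-100045", "manufacturing"): {"routing": "RT-778", "work_center": "WC-12"},
    ("P-100045", "supplier"):      {"vendor": "ACME Valves", "lead_time_days": 21},
}

def datasets_for(part_id: str) -> dict:
    """Collect all discipline datasets linked to one part ID."""
    return {domain: data for (pid, domain), data in datasets.items() if pid == part_id}

print(classification["Valve"])
print(datasets_for("P-100045"))
```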
An important point to consider is a kind of standardization of datasets. In theory, there should be a kind of minimum agreed collection of datasets. Industry standards provide these collections in their dictionary. Whenever you optimize your data model for a connected enterprise, make sure you look first into the standards that apply to your industry.
They might not be perfect or complete, but inventing your own new standard is a guarantee for legacy issues in the future. This remark is also valid for the software vendors in this domain. A proprietary data model might give you a competitive advantage.
Still, in the long term, there is always the need to connect with outside stakeholders.
Identify the RACI
To ensure a dataset is complete and well maintained, the concept of RACI could be used. RACI is the abbreviation for Responsible, Accountable, Consulted and Informed, and a simplification of the RASCI model; see also: responsibility assignment matrix.
In a data-driven environment, there is no data ownership anymore like you have for documents. The main reason that data ownership can no longer be used is that datasets can be consumed by anyone in the ecosystem – no longer only your department or the manufacturing or service department.
Datasets in a data-driven environment bring value when connected with other datasets in applications or dashboards.
A dataset describing the specification attributes of a part could be used in a spare part app and a service app. Of course, the dataset will be used in a different context – still, we need to ensure we can trust the data.
Therefore, each identified dataset should be governed by a kind of RACI concept. The RACI concept is a way to break the siloes in an organization.
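A minimal sketch of what RACI per dataset could look like – the roles and dataset names are purely illustrative:

```python
# Governance per dataset instead of document ownership (illustrative names).
raci = {
    "part.identification": {"R": "Engineering", "A": "Configuration Management",
                            "C": ["Manufacturing"], "I": ["Service", "Procurement"]},
    "part.supplier_data":  {"R": "Procurement", "A": "Procurement",
                            "C": ["Engineering"], "I": ["Manufacturing"]},
}

def accountable_for(dataset: str) -> str:
    """Who is accountable (A) for the quality of a given dataset."""
    return raci[dataset]["A"]

def may_consume(dataset: str, role: str) -> bool:
    """Any role listed in the RACI entry may at least consume/be informed."""
    entry = raci[dataset]
    return role in {entry["R"], entry["A"], *entry["C"], *entry["I"]}

print(accountable_for("part.supplier_data"))
print(may_consume("part.identification", "Service"))
```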
Identify Inside / outside
There is a lot of fear that a connected, data-driven environment will expose Intellectual Property (IP). It came up in recent discussions. If you like storytelling and technology, read my old SmarTeam colleague Alex Bruskin’s post: The Bilbo Baggins Threat to PLM Assets. Alex has written some “poetry” with a deep technical message behind it.
It is true that if your dataset is too big, you have the challenge of exposing IP when connecting this dataset with others. Therefore, when building a data model, you should make it possible to have datasets purely for internal usage and datasets for sharing.
When you use the concept of RACI, the difference should be defined by the I (Informed) – is it PLM data or PIM data, for example?
Tracking relations
Suppose we follow up on the concept of datasets. In that case, it becomes clear that relations between the datasets are as crucial as the datasets themselves. In traditional PLM applications, these relations are often predefined as part of the core data model.
For example, the EBOM parts have relationships between themselves and specification data – see image.
The MBOM parts have links with the supplier data or the manufacturing process.
The prepared relations in a PLM system allow people to implement the system relatively quickly to map their approaches to this taxonomy.
However, traditional PLM systems are based on a document-based (or file-based) taxonomy combined with related metadata. In a model-based and connected environment, we have to get rid of the document-based type of data.
Therefore, the datasets will be more granular, and there is a need to manage exponentially more relations between datasets.
This is why you see the graph database coming up as a needed infrastructure for modern connected applications. If you haven’t heard of a graph database yet, you are probably far from technology hypes. To understand the principles of a graph database you can read this article from neo4j: Graph Databases for Beginners: Why graph technology is the future
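For illustration only, here is a tiny sketch of datasets and their typed relations as a graph, using the networkx library; a real implementation would more likely use a graph database such as neo4j. All node and relation names are hypothetical.

```python
import networkx as nx

g = nx.DiGraph()
# Nodes are datasets, edges are typed relations between them
g.add_edge("EBOM:P-100045", "CAD:valve_housing", relation="defined_by")
g.add_edge("EBOM:P-100045", "Requirement:REQ-88", relation="satisfies")
g.add_edge("MBOM:P-100045", "EBOM:P-100045",      relation="derived_from")
g.add_edge("MBOM:P-100045", "Supplier:ACME",      relation="sourced_from")

# Traverse the web: which datasets are reachable from the MBOM view?
print(sorted(nx.descendants(g, "MBOM:P-100045")))
```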
As you can see from the 2020 Gartner Hype Cycle for Artificial Intelligence, this technology is at the top of the hype and conceptually the way to manage a connected enterprise. The discussion in this post also demonstrates that, besides technology, there is a lot of additional conceptual thinking needed before it can be implemented.
Although software vendors might handle the relations and datasets within their platform, the ultimate challenge will be sharing datasets with other platforms to get a connected ecosystem.
For example, the digital web picture shown above and introduced by Marc Halpern at the 2018 PDT conference shows this concept. Recently CIMdata discussed this topic in a similar manner: The Digital Thread is Really a Web, with the Engineering Bill of Materials at Its Center
(Note I am not sure if CIMdata has published a recording of this webinar – if so I will update the link)
Anyway, these are signs that we have started to find the right visuals to imagine new concepts. The traditional digital thread pictures, like the one below, are, for me, impressions of the past, as they are too rigid and focused on some particular value streams.
From a distance, it looks like a connected enterprise should work like our brain. We store information on different abstraction levels. We keep incredibly many relations between information elements. As the brain is a biological organ, connections degrade or get lost. Or the opposite: other relationships become so strong that we cannot change them anymore. (“I know I am always right”)
Interestingly, the brain does not use the “single source of truth”-concept – there can be various “truths” inside a brain. This makes us human beings with all the good and the harmful effects of that.
As long as we realize there is no single source of truth.
In business and our technological world, we sometimes need the undisputed truth. Blockchain could be the basis for securing the right connections between datasets to guarantee the result is valid. I am curious if blockchain can scale to complex connected situations, although Moore’s Law might ultimately help us here too (if still valid).
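As a toy illustration of the blockchain idea – not a real distributed ledger – each dataset reference could commit to the previous one through a hash, so any tampering with a validated combination becomes detectable:

```python
import hashlib, json

def chain(datasets):
    """Return a hash per dataset, each hash also committing to the previous one."""
    hashes, prev = [], ""
    for ds in datasets:
        payload = json.dumps(ds, sort_keys=True) + prev
        prev = hashlib.sha256(payload.encode()).hexdigest()
        hashes.append(prev)
    return hashes

# Hypothetical validated combination of a hardware revision and a software version
baseline = chain([{"id": "P-100045", "rev": "B"}, {"id": "SW-22", "version": "1.4.0"}])
tampered = chain([{"id": "P-100045", "rev": "C"}, {"id": "SW-22", "version": "1.4.0"}])
print(baseline[-1] == tampered[-1])   # False - the change is detectable
```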
The topic is not new – in 2014 I wrote a post with the title PLM is doomed unless …., where I introduced the topic of owning and sharing in the context of the human brain. In the post, I refer to the book On Intelligence by Jeff Hawkins, who tries to analyze what human-based intelligence is and how we could apply it to our technology concepts. Still a fascinating book, worth reading if you have the time and opportunity.
Conclusion
A data-driven approach requires a more granular definition of information, leading to the concepts of datasets and managing relations between datasets. This is a fundamental difference compared to the past, where we were operating systems with information. Now we are heading towards connected platforms that provide a filtered set of real-time data to act upon.
I am curious to learn more about how people have solved the connected challenges and in what kind of granularity. Let us know!
In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise, drawing the conclusion that it is mission impossible to aim for a single connected environment. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value chain oriented (OEM, Supplier, Marketplace, Supply Chain hub).
In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:
Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.
System of Record and System of Engagement
In the image below, a slide from 2016, I show a simplified view when discussing the difference between the current, coordinated approach and the future, connected approach. This picture might create the wrong impression that there are two different worlds – either you are document-driven, or you are data-driven.
In the follow-up of this presentation, I explained that companies need both environments in the future. The most efficient way of working for operations will be the infrastructure on the right side, the platform-based approach using connected information.
For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers, disconnected regulatory bodies and probably crucial for configuration management.
The System of Record will probably remain as a capability in every platform or cross-section of platform information. The Systems of Engagement will be the configured real-time environment for anyone involved in active company processes – not only ERP or MES, but all execution.
Introducing SysLM and SLM
This summer, I received a copy of Martin Eigner’s System Lifecycle Management book, which I am reading in my spare moments. I always enjoyed Martin’s presentations. In many ways, we share similar ideas. Martin, from his profession, spent more time on the academic aspects of product and system lifecycle than I did. On the other hand, I have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and for sure, I will come back to the book when I have finished.
A first impression: A great and interesting book for all. Martin and I share the same history of data management. Read all about this in his second chapter: Forty Years of Product Data Management
From PDM via PLM to SysLM is a chapter that everyone should read if you haven’t lived it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin’s book will be highly complementary given the content he describes.
There is one point on which I am looking forward to feedback from the readers of this blog.
Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management (SysLM)?
In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company’s offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, would we use the term Solution Lifecycle Management as not only hardware and software might be part of the value proposition?
Martin uses consistently the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that both abbreviations are not unique or explicit enough. SysLM creates confusion with SysML (for dyslectic people or fast readers). SLM already has so many less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.
For the moment, I will use the abbreviation SLM, leaving it in the middle if it is System Lifecycle Management or Solution Lifecycle Management.
How to implement both approaches?
In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.
A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.
However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.
Software and services implementations are associated with a data-driven, model-based approach.
The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.
Depending on your company’s value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling “disconnected” hardware, there is probably no need to change your internal PLM processes that much.
However, when you are moving to a “connected” business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.
A McKinsey concept I have promoted several times illustrates a potential path – note the article was not written with a PLM mindset but with a business mindset.
What about Configuration Management?
The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry to guarantee an appropriate performance, reduced risk and cost of fixing issues.
The challenge, however, is that configuration management processes are not designed to manage systems or solutions, where dynamic updates can be applied whether or not done by the customer.
This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).
For that reason, I am curious to learn more from Martijn Dullaart’s presentation at the upcoming PLM Roadmap/PDT conference. The title of his session: The next disruption please …
In his abstract for this session, Martijn writes:
From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.
One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes extending the validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes for its hardware.
Next, the systems or solutions in the field will report (or validate) their configuration against validation rules. A topic that requires a long discussion and more than this blog post, potentially a full conference.
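A simple sketch of this idea – the rule contents and configuration fields are hypothetical – where a connected system reports its as-running configuration and a set of validation rules decides whether it is a valid combination, instead of enumerating every allowed configuration upfront:

```python
# Illustrative only: reported configuration of a connected system in the field.
reported_config = {
    "hardware": {"ecu": "rev C"},
    "software": {"infotainment": "4.2.1", "adas": "2.0.3"},
}

# Validation rules as (description, predicate) pairs; naive string comparison
# of versions is used here purely to keep the sketch short.
validation_rules = [
    ("ADAS 2.x requires ECU rev C",
     lambda c: not c["software"]["adas"].startswith("2.") or c["hardware"]["ecu"] == "rev C"),
    ("Infotainment must be at least 4.0",
     lambda c: c["software"]["infotainment"] >= "4.0"),
]

violations = [desc for desc, rule in validation_rules if not rule(reported_config)]
print(violations or "configuration valid")
```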
Therefore I am looking forward to participating in the CIMdata/PDT FALL conference and picking up the discussions towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.
Conclusion
A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company’s value proposition.
If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.
Please, share your thoughts in the comments.
After a short summer break with almost no mentioning of the word PLM, it is time to continue this series of posts exploring the future of “connected” PLM. For those who also started with a cleaned-up memory, here is a short recap:
In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development, the item-centric approach.
In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified are the ERP and CRM platforms (the oldest domains), the MES and PIP platforms (modern domains to support manufacturing and product innovation in more detail) and the IoT platform (needed to support connected products and customers).
In part 3, I explained what is data-driven and how data-driven is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. In this post, I ended up with seven topics related to data-driven, which I will discuss in upcoming posts.
Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.
Topic 1 for this post:
Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
Platform or a collection of systems?
One of the first (marketing) hurdles to take is understanding what a data platform is and what is merely a collection of systems that work together, sold as a platform.
CIMdata published in 2017 an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata’s definition is extensive and covers the full scope of product innovation. Of course, you can find a platform that starts from a more focused process.
For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).
The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid limiting these capabilities, a platform should be open and able to connect with other data sources that might be either locally or centrally available.
From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, still allowing local data to be connected and shielded for performance or IP reasons.
The picture below describes the business benefits of a Product Innovation Platform as imagined by Accenture in 2014
Link to CIMdata’s 2014 commentary of Digital PLM HERE
Sometimes vendors sell their suite of systems as a platform. This is a marketing trick because when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems, not really a scalable environment.
In addition, sometimes, the collaboration between systems in such a marketing platform is managed through proprietary exchange (file) formats.
A practice we have seen in the construction industry before cloud connectivity became available. However, a so-called end-to-end solution that only works in PowerPoint requires, when implemented in real life, a lot of human intervention.
Not a single environment
There has always been the debate:
“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”
In the system approach, the focus was most of the time on the best-in-class tools where PLM-systems provide the data governance. A typical example is the item-centric approach. It reflects the current working culture, people working in their optimized siloes, exchanging information between disciplines through (neutral) files.
The platform approach makes it possible to deliver the optimized user interface for the end-user through a dedicated app. Assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.
It might be tempting as a platform provider to add as many imaginable data elements as possible to their platform infrastructure. The challenge with this approach is whether all data should be stored in a central data environment (preferably cloud) or federated. And what about filtering IP?
In my post PLM and Supply Chain Collaboration, I described the concept of having an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, however carefully filtered which data is shared in the hub.
It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant for this part of the business. This does not imply there might be no other platforms in the future. Platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.
Will we ever reach a perfectly connected enterprise or society? Probably not. Not because of technology but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all humanity. Predicting the future is impossible, as Yuval Harari described in his book: 21 Lessons for the 21st Century. Worth reading, still a collection of ideas.
Proprietary data model or standards?
So far, when you are a software vendor developing a system, there is no restriction on how you internally manage your data. In the PLM domain, this means that every vendor has its own proprietary data model and behavior.
I have learned from my 25+ years of experience with systems that a product's original design, combined with the vendor's culture, defines the future roadmap. So even if a PLM vendor were to rewrite all their software to become data-driven, the ways of working and the assumptions would still be based on past experiences.
This makes it hard to come to unified data models and methodology valid for our PLM domain. However, large enterprises like Airbus and Boeing and the major Automotive suppliers have always pushed for standards as they will benefit the most from standardization.
The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.
You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.
It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible with existing standards. It probably won't happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected ecosystems.
I do not believe in the toolkit approach where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced, not matching the past.
In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.
The benefit of using standards is that they contain the learnings from many people involved.
Conclusion
I did not like writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy
Matthias Ahrens shared: Integrated Product Lifecycle Management (Google translated from German)
Oleg Shilovitsky wrote numerous articles related to technology – in this context: 3 Challenges of Unified Platforms and System Locking and SaaS PLM Acceleration Trends
My previous post introducing the concept of connected platforms created some positive feedback and some interesting questions. For example, the question from Maxime Gravel:
Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?
is one of the questions I am trying to understand too. You can see my short comment in the comments here. However, while discussing with other experts in the CM domain, we still have to paint the path forward. Because if we cannot solve this type of question, the value of connected platforms will be disputable.
It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 – 20 years until we are familiar with the concepts.
It takes a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster, as the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative industries at this time. Other industries or startups might lead us into the future faster.
Although I prefer discussing methodology, I believe I need to clarify some more technical points before moving into that area. My apologies for writing it in such a simple manner; this information should be accessible to the majority of readers.
What does data-driven mean?
I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, each containing a single aspect of information in a standardized manner, so it becomes accessible to outside tools.
A document is not a dataset, as it often includes a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status (a minimal sketch follows the list below).
An identifier, to be able to create connections with other datasets – traceability or, in modern words, a digital thread.
A classification, as the classification will determine the type of information the dataset contains and potentially a set of mandatory attributes.
A status, to understand if the dataset is stable or still in work.
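To make this more tangible, here is a minimal sketch (in Python, purely illustrative and not any vendor's data model) of what such a dataset could look like:

```python
# A minimal sketch of a "dataset" in a data-driven environment:
# an identifier, a classification and a status, plus classification-dependent attributes.
# Names and values are invented for illustration only.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Dataset:
    identifier: str                      # enables digital-thread relations to other datasets
    classification: str                  # determines the type of information and its mandatory attributes
    status: str                          # e.g. "in work", "released", "obsolete"
    attributes: Dict[str, str] = field(default_factory=dict)

part = Dataset(identifier="P-100045/A", classification="mechanical part",
               status="released", attributes={"material": "AlMg3"})
```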
Examples of a data-driven approach – the item
The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.
Part classification can be a topic on its own, and every industry has its taxonomy.
Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.
In a data-driven manner, a part can occur in several Bill of Materials – an example of a single definition consumed in other places.
When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.
When the part changes in a document-driven environment, the effort is much higher.
First, all documents where this part occurs need to be identified. Then the impact of the change needs to be managed across document versions, which will lead to other related changes if you want to keep the information correct.
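A hypothetical illustration of this single-definition idea: the part identifier is referenced by several BOMs, so an impact analysis becomes a simple where-used query instead of a search through documents. The item numbers and BOM names below are invented for the example:

```python
# One part dataset is consumed by several Bills of Material;
# impact analysis is a "where-used" query over connected data.
boms = {
    "BOM-PUMP-01": ["P-100045", "P-100046", "P-100099"],
    "BOM-PUMP-02": ["P-100045", "P-100101"],
}

def where_used(part_id):
    """Return every BOM that consumes the given part."""
    return [bom for bom, parts in boms.items() if part_id in parts]

print(where_used("P-100045"))   # ['BOM-PUMP-01', 'BOM-PUMP-02']
```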
Examples of a data-driven approach – the requirement
Another example illustrating the benefits of a data-driven approach is implementing requirements management, where requirements become individual datasets. Often a product specification can contain hundreds of requirements, addressing the needs of different stakeholders.
In addition, several combinations of requirements need to be handled by other disciplines, mechanical, electrical, software, quality and legal, for example.
As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.
The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).
Or requirements could be linked to test criteria and test cases, without the need to manage documents and make sure you are working with the latest version of a document.
As you can see, requirements also need an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
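A small, hedged sketch of the same idea for requirements: each requirement is a dataset with an identifier, a classification and a status, so a supplier specification or a test-case link is just a selection of datasets. All names and fields below are illustrative only:

```python
# Requirements as individual datasets instead of rows frozen in a document.
requirements = [
    {"id": "REQ-001", "classification": "mechanical", "status": "released", "text": "Housing shall be IP67"},
    {"id": "REQ-002", "classification": "software",   "status": "in work",  "text": "Firmware shall support OTA updates"},
    {"id": "REQ-003", "classification": "mechanical", "status": "dropped",  "text": "Housing shall be stainless steel"},
]

# Group the released mechanical requirements into a supplier specification
supplier_spec = [r for r in requirements
                 if r["classification"] == "mechanical" and r["status"] == "released"]

# Link a requirement to a test case through its identifier instead of a document
test_cases = {"TC-11": {"verifies": "REQ-001", "result": "passed"}}
```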
Data-driven and Models – the 3D CAD model
When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.
Although we use a 3D representation at each product lifecycle stage, most companies do not have a digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.
I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.
If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.
Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing have been discussed in the blog post PLM and Model-Based Definition.
It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.
If this is not the case, the projected game-changers will not occur, as they become too costly.
Data-driven and mathematical models
To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. Look, for example, at climate models, weather models or COVID infection models.
They all lead to discussions from so-called experts who believe a model should be 100 % correct and that any exception shows the model is wrong.
It is not that the model is wrong; the expectations are false.
For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.
For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.
There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.
The sharable parameters should then again be datasets in a data-driven environment. Using standardized diagrams, like SysML or UML, enables the objects used in the diagram to become datasets.
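As a simple illustration (not tied to any specific modeling tool), a shared parameter managed as a dataset can be referenced by several models, so an update in one place is visible wherever the parameter is consumed:

```python
# A shared parameter as a dataset, referenced by two models through its identifier.
# Identifiers and values are invented for the example.
shared_parameters = {
    "PAR-MAXTEMP": {"name": "max_operating_temperature", "value": 85, "unit": "degC", "status": "released"},
}

logical_model = {"uses_parameters": ["PAR-MAXTEMP"]}
simulation_model = {"uses_parameters": ["PAR-MAXTEMP"]}

# Updating the shared dataset once...
shared_parameters["PAR-MAXTEMP"]["value"] = 90
# ...means both models now refer to the updated value through its identifier.
```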
I will not dive further into the modeling details as I want to remain at a high level.
It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.
What does data-driven imply?
I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions.
- Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
- Data-driven does not mean we do not need any documents anymore (read: electronic files). Likely, document sets will still be the interface to non-connected entities, such as suppliers and regulatory bodies. These document sets can be considered a configuration baseline.
- Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership; it becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
- Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
- A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
- I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
- Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM
Conclusion
Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.
After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.
So far, I have been discussing PLM experiences and best practices that have changed due to introducing electronic drawings and affordable 3D CAD systems for the mainstream. From vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.
Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.
Now, thanks to digitalization, we can connect and format information in real time. We can give every stakeholder in the company's business almost real-time visibility of what is happening (if allowed). We have seen the power of platformization, where the benefits come from real-time connectivity within an ecosystem.
Apple, Amazon, Uber and Airbnb are the non-manufacturing examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM ?) with design and manufacturing (services), and with suppliers and customers. All connected through information, managed in data elements instead of documents – I call it connected PLM.
Vendors have already shared their PowerPoints, movies, and demos of how the future would look in the ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.
The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.
The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.
In the upcoming posts, I will share with you what I see, what I deduct from that and what I think would be the next step to learn.
I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.
I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (avg. 1500 words) and the ideas and contributions coming from you and me.
What is platformization?
In our day-to-day life, we are more and more used to direct interaction between resellers and services providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.
We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.
The platform is responsible for direct communication between all stakeholders.
The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.
The 5 potential platforms
The ERP platform
is mainly dedicated to the company's execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect all stakeholders as much as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).
Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform. A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.
The CRM platform
a platform that is mainly focusing on customer-related activities, and as you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.
The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.
For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.
The IoT platform
these platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.
However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (non-PLM/ERP-related vendors) are Hitachi and Microsoft.
The Product Innovation Platform (PIP)
All the above platforms would not have a reason to exist if there were not an environment where products are invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio and more.
The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all previous platforms, the PIP cannot exist without other platforms and requires connectivity with the other platforms where applicable.
Look below at the CIMdata definition of a Product Innovation Platform.
You will see that most of the historical PLM vendors are aiming to be a PIP (with their different flavors): Aras, Dassault Systèmes, PTC and Siemens.
Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.
Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.
Standard interfaces or ad-hoc connections?
Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.
However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.
This would be a PIP/ERP platform collaboration scenario.
Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.
The company might decide to build a custom app or dashboard to support such a role, combining data in real time from the relevant platforms. This can be done using standard interfaces (preferred) or using APIs, web services, REST services and microservices (for specialists), and the currently fashionable Low-Code development platforms, which allow users to combine data services from different platforms without being an expert in coding.
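As a sketch of the engineer's scenario above, the snippet below combines change data from a PIP with stock data from an ERP platform through REST calls. The endpoints and field names are assumptions made for the illustration; real platforms expose their own (ideally standards-based) APIs:

```python
# Hypothetical cross-platform dashboard logic: read a change from the PIP,
# then look up the current stock of each affected item in the ERP platform.
import requests

PIP_URL = "https://pip.example.com/api/changes"   # assumed PIP endpoint
ERP_URL = "https://erp.example.com/api/stock"     # assumed ERP endpoint

def change_impact(change_id):
    change = requests.get(f"{PIP_URL}/{change_id}").json()
    affected = change["affected_items"]           # item identifiers coming from the PIP
    stock = {item: requests.get(f"{ERP_URL}/{item}").json()["on_hand"]
             for item in affected}
    return {"change": change_id, "stock_on_hand": stock}
```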
Without going too much into technology, the topics in this paragraph require an enterprise architecture and vision. It is opportunistic to think that your existing environment will evolve smoothly into a digital highway for the future by "fixing" demands per user. Your infrastructure is much more likely to end up as congested spaghetti.
In that context, I read an interesting post last week: Low code: A promising trend or Pandora's box. Have a look and decide for yourself.
I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.
In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.
It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, and we still rely primarily on drawings. It will surely take another generation to switch from document-based engineering to model-based engineering.
Conclusion
In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.
In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.
Your opinion, Your contribution?
After “The PLM Doctor is IN #2,” now again a written post in the category of PLM and complementary practices/domains.
After PLM and Configuration Lifecycle Management (CLM) (January 2021) and PLM and Configuration Management (CM) (February 2021), now it is time to address the third interesting topic:
PLM and Supply Chain collaboration.
In this post, I am speaking with Magnus Färneland from Eurostep, a company well known in my PLM ecosystem, through their involvement in standards (STEP and PLCS), the PDT conferences, and their PLM collaboration hub, ShareAspace.
Supply Chain collaboration
The interaction between OEMs and their suppliers has been a topic of particular interest to me. As a warming-up, read my post after CIMdata/PDT Roadmap 2020: PLM and the Supply Chain. In this post, I briefly touched on the Eurostep approach – having a Supply Chain Collaboration Hub. Below an image from that post – in this case, the Collaboration Hub is positioned between two OEMs.
Recently Eurostep shared a blog post in the same context: 3 Steps to remove data silos from your supply chain addressing the dreams of many companies: moving from disconnected information silos towards a logical flow of data. This topic is well suited for all companies in the digital transformation process with their supply chain. So, let us hear it from Eurostep.
Eurostep – the company / the mission
First of all, can you give a short introduction to Eurostep as a company and the unique value you are offering to your clients?
Eurostep was founded in 1994 by several world-class experts on product data and information management. In the year 2000, we started developing ShareAspace. We took all the experience we had from working with collaboration in the extended enterprise, mixed it with our standards knowledge, and selected Microsoft as the technology for our software platform.
We now offer ShareAspace as a solution for product information collaboration in all three industry verticals where we are active: Manufacturing, Defense and AEC & Plant.
ShareAspace is based on an information standard called PLCS (ISO 10303-239). This means we have a data model covering the complete lifecycle of a product, from requirements and conceptual design to an existing installed base. We have added the things needed on top, such as consolidation and security. Our partnership with Microsoft has also resulted in ShareAspace being available in Azure as a service (our Design to Manufacturing software).
Why a supply chain collaboration hub?
Currently, most suppliers work in a disconnected manner with their clients – sending files up and down or having to work inside the OEM environment. What are the reasons to consider a supply chain collaboration hub or, as you call it, a product information collaboration solution?
The hub concept is not new per se. There are plenty of examples of file sharing hubs. Once you realize that sending files back and forth by email is a disaster for keeping control of your information being shared with suppliers, you would probably try out one of the available file-sharing alternatives.
However, after a while, you begin to realize that a file share can be quite time-consuming to keep up to date. Files are being changed. Files are being removed! Some files are enormous, and you realize that you only need a fraction of what is in the file. References from one file to another become corrupt because the other file is of a newer version. Etc. Etc.
This is about the time when you realize that you need similar control of the data you share with suppliers as you have in your internal systems. If not better.
A hub allows all partners to continue to use their internal tools and processes. It is also a more secure way of collaboration as the suppliers and partners are not let into the internal systems of the OEM.
Another significant side effect of this is that you only expose the data in the hub intended for external sharing and avoid sharing too much or exposing internal sensitive data.
A hub is also suitable for business flexibility as partners are not hardwired to the OEM. Partners can change, and IT systems in the value chain can change, without impacting more than the single system's connection to the hub.
Should every company implement a supply chain collaboration hub?
Based on your experience, what types of companies should implement a supply chain collaboration hub and what are the expected benefits?
The large OEMs and 1st tier suppliers certainly benefit from this since they can incorporate hundreds, if not thousands, of suppliers. Sharing technical data across the supply chain from a dedicated hub will remove confusion, improve control of the shared data, and build trust with their partners.
With our cloud-based offering, we now also make it possible for mid-sized companies (around 200+ employees) to use ShareAspace. They may not have a well-adopted PLM system or the issue of communicating complex specifications originating from several internal sources. Still, they need to be professional in dealing with suppliers.
The smallest client we have is a manufacturer of pool cleaners, a complex product with many suppliers. The company Weda [www.weda.se] has less than 10 employees, and they use ShareAspace as SaaS. With ShareAspace, they have improved their collaboration process with suppliers and cut costs and lowered inventory levels.
ShareAspace can really scale big. It serves as a collaboration solution for the two new Aircraft carriers in the UK, the QUEEN ELIZABETH class. The aircraft carriers were built by a consortium that was closed in early 2020.
ShareAspace is being used to hold the design data and other documentation from the consortium to be available to the multiple organizations (both inside and outside of the Ministry of Defence) that need controlled access.
What is the dependency on standards?
I always associate Eurostep with the PLCS (ISO 10303-239) standard, providing an information model for “hardware” products along the lifecycle. How important is this standard for you in the context of your ShareAspace offering?
Should everyone adapt to this standard?
We have used PLCS to define the internal data schema in ShareAspace. This is an excellent starting point for capturing information from different systems and domains and still getting it to fit together. Why invent something new?
However, we can import data in most formats, and it does not have to be according to a standard. When connecting to Teamcenter, Windchill, Enovia, SAP, Oracle, Maximo etc., it is more often in a proprietary format than according to any standards.
On the other hand, in some industries like Defense, standards-based data exchange is required and put into contracts. Sometimes it prescribes PLCS. For the plant industry, it could be CFIHOS or ISO15926.
Supply Chain Collaboration and digital transformation
As stated at the beginning of this post, digital transformation is about connecting the information siloes through a digital thread. How important is this related to the supply chain?
Many companies have come a long way in improving their internal management of product data. But still, the exchange and sharing of data with the external world has considerable potential for improvement. Just look at the chaos everyone has experienced with emails, still used a lot, in finding the latest Word document or PowerPoint file. Imagine if you collaborate on a ship, a truck, a power plant, or a piece of complex infrastructure. FTP is not the answer, and for product data, Dropbox is not doing the trick.
A Digital Thread must support versions and changes in all directions, as changes are natural with reasonably advanced products. Much of the information created about or around a product is generated within the supply chain, like production parameters, test and inspection protocols, certifications, and more. Without an intelligent way of capturing this data, companies will continue to spend a fortune on administration trying to manage this manually.
As the Digital Thread extends across the value chain, a suitable sharing tool is needed to allow for configuration management across the complete chain – ShareAspace is designed for this. The great thing about PLCS is that it provides a standard model for the Digital Thread covering several Digital Twins. PLCS adds the lifecycle component, which is essential, and there is no alternative. Therefore, with ShareAspace and PLCS, we can add lifecycle capabilities to snapshot standards like IFC, etc., that are outside the STEP series of standards.
Learning more
We discussed that a supply chain collaboration hub can have specific value to a company. Where can readers learn more?
There is a lot of information available. Of course, on our Eurostep website, you will find information under the tab Resources or on the ShareAspace website under the tab News.
Other sources are:
What I have learned
- I am surprised to see that the type of Supplier Collaboration Platform delivered by Eurostep is not a booming market. Even though Time to Market is significantly impacted by how companies work with their suppliers, most companies still rely on the exchange of data packages.
- The most advanced exchanges use a model-based definition where relevant. Traditional PLM vendors will not develop such platforms, as the platform needs to be agnostic in both directions.
- A collaboration hub can be built on a recommended data model based on PLCS or, in the case of a large OEM, on a custom data model. It is relatively easy to implement (as you do not change your own PLM) and relatively easy to scale (adding a new supplier is easy). For me, the supplier collaboration platform is a must in a modern, digitally connected enterprise.
Conclusion
A lot of marketing money is spent on “Digital Thread” or “Digital Continuity”. If you are looking at the full value chain of product development and operational support, there are still many manual hand-over processes with suppliers. A supplier collaboration hub might be the missing piece of the puzzle to realize a real digital thread or continuity.
Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having digested now most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.
This time, I want to focus on the content presented by the various Aerospace & Defense working groups who shared their findings, lessons-learned (so far) on topics like the Multi-view BOM, Supply Chain Collaboration, MBSE Data interoperability.
These sessions were nicely wrapped with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management and Jeff Plant (Boeing) sharing their Model-Based Engineering strategy.
I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Isn't there an easier way to achieve the same results?
Nicely formulated by Ilan Madjar as a comment to my first post:
Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.
This does not mean that these topics should not be studied. If your business is in a small, simple eco-system and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.
However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.
In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.
This process of standardization is crucial if you want to have a sustainable, connected enterprise. In the end, the push from these companies will lead to standards, allowing smaller companies to adhere or connect to them.
The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!
Global Collaboration – Defining a baseline for data exchange processes and standards
Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.
Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will have a misunderstanding in communication.
This happens even more in smaller businesses that sometimes just pick up (buzz) terms without a full understanding.
Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared that, in reality, they were attempting configuration management. Here the confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.
The group has been analyzing interoperability standards and standards for long-term archiving and retrieval (LOTAR), but it has also been studying the ISO 44001 standard related to Collaborative business relationship management systems.
In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.
Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.
The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.
You can find the proceedings from the Global Collaboration working group here.
Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?
I loved Alberto Ferrari's (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.
Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.
Alberto’s presentation was a kind of mind map from how I imagine the future, and it is a pity if you have not had the chance to see his session.
Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implementing digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.
Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.
More A&D Action Groups
There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.
Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.
As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving. I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?
More information about their progress can be found here.
Mark Williams (Boeing) reported from the A&D Model-Based Systems Engineering action group their first findings related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution. A topic interesting to follow as the promise of MBSE is that it is about connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.
For those interested in their progress, more information can be found here
Model-Based Engineering @ Boeing
In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.
In his introduction, Jeff mentioned that Boeing has more than 160.000 employees in over 65 countries. They are working with more than 12.000 suppliers globally. These suppliers can be manufacturing, service or technology partnerships. Therefore you can imagine, and as discussed by others during the conference, streamlined collaboration and traceability are crucial.
The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.
Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.
I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. A standard certainly to follow as it brings standardization on top of existing standards.
As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.
Jeff presented the slide below, showing their standardization internally around federated platforms.
This slide resembles a lot the future platform vision I have been sharing since 2017 when discussing PLM’s future at PLM conferences, when explaining the differences between Coordinated and Connected – see also my presentation here on Slideshare.
You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.
In reality, the flow of information through feedback loops will be there too.
The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics and vendors to invest and come to industry standards for Model-Based System Engineering practices. The time is now to act on this domain.
It reminded me again of Marc Halpern's message mentioned in my previous post (part 1): we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.
Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.
Finally, during the Q&A part, Jeff made an interesting point, explaining that Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.
The team is confident that the business case for such an investment is firm and stable. However, with a long-term investment that shows no direct results, these projects might come under pressure when the business is under pressure.
The virtual fireside chat
The conference ended with a virtual fireside chat from which I picked up an interesting point that Marc Halpern was bringing in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy and product development. They did not see so much a reduced time to market or cost reduction. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor the number of changes.
Marc believes that this topic will really show benefits in the future, with the cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2 % of the companies interviewed were investing in a digital supply chain, while the expected benefits in this area would have the most significant ROI.
The good news: there is consistency, and we know where to focus for early results.
Conclusion
It was a great conference as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneaky preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members in the CIMdata A&D action groups as they provide the groundwork for all of us – sooner or later.