
While preparing my presentation for the Dutch Model-Based Definition solutions event, I reflected on my experiences discussing Model-Based Definition, particularly in traditional industries. In the Aerospace & Defense and Automotive industries, Model-Based Definition has become the standard. However, other industries face big challenges in adopting this approach. In this post, I want to share my observations and clarify why this approach matters.

 

What is Model-Based Definition?

The Wikipedia definition of Model-Based Definition is not bad:

Model-based definition (MBD), sometimes called digital product definition (DPD), is the practice of using 3D models (such as solid models, 3D PMI and associated metadata) within 3D CAD software to define (provide specifications for) individual components and product assemblies. The types of information included are geometric dimensioning and tolerancing (GD&T), component level materials, assembly level bills of materials, engineering configurations, design intent, etc.

By contrast, other methodologies have historically required the accompanying use of 2D engineering drawings to provide such details.

When I started writing about Model-Based Definition in 2016, the concept of a connected enterprise was not yet discussed. At that time, MBD mainly enhanced data sharing between engineering, manufacturing, and suppliers: the 3D model with its PMI acts as a data package for information exchange between these stakeholders.

The main difference is that the 3D model becomes the primary information carrier, with 2D manufacturing views and other relevant data linked to it in a single package.

 

MBD – the benefits

There is no need for a whole blog post about the benefits of MBD; with some research, you will find enough reasons. The most important benefits of MBD are:

  • the information is both human-readable and machine-readable, allowing the implementation of Smart Manufacturing / Industry 4.0 concepts (see the sketch below this list)
  • the information relies on processes and data and no longer depends on human interpretation. This leads to better quality and fewer errors to fix late in the process.
  • MBD information is a building block for the digital enterprise. If you cannot master this concept, forget the benefits of MBSE and Virtual Twins. These concepts don’t run on documents.
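To make the “machine-readable” benefit tangible, here is a minimal sketch in Python of how a downstream application could consume PMI from an MBD package. The package structure, part names and values are invented for illustration; real exchanges would rely on standards such as STEP AP242, which carries PMI semantically.

```python
# Illustrative sketch only: a simplified MBD package as structured data.
# In practice, PMI travels in standard formats such as STEP AP242.

mbd_package = {
    "part": "bracket-4711",          # hypothetical part number
    "revision": "B",
    "material": "AlSi10Mg",
    "pmi": [  # annotations attached to 3D features instead of a 2D drawing
        {"feature": "hole-1", "type": "diameter", "nominal": 8.0, "tol": 0.05},
        {"feature": "face-3", "type": "flatness", "nominal": 0.0, "tol": 0.02},
    ],
}

def check_measurements(package: dict, measured: dict) -> list:
    """Compare measured values against the tolerances defined in the model."""
    issues = []
    for annotation in package["pmi"]:
        value = measured.get(annotation["feature"])
        if value is None:
            continue  # feature not measured
        if abs(value - annotation["nominal"]) > annotation["tol"]:
            issues.append(f"{annotation['feature']} out of tolerance: {value}")
    return issues

# The quality check no longer depends on a human interpreting a drawing:
print(check_measurements(mbd_package, {"hole-1": 8.06, "face-3": 0.01}))
```

Because the tolerances are data, not lines on a drawing, a CMM or NC application can consume them directly – exactly the Smart Manufacturing scenario mentioned above.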

To help you discover the benefits of MBD described by others – have a look here:

 

MBD as a stepping stone to the future

When you are able to implement model-based definition practices in your organization and connect with your ecosystem, you learn what it means to work in a connected manner. Even where the scope is limited, you will already discover that working in a connected manner is not the same as mandating everyone to work with the same systems or tools. Instead, it is about new ways of working (skills & people), combined with exchange standards (deciding which ones to follow).

As MBD is part of the bigger model-based enterprise, the same principles apply to connecting upstream information (Model-Based Systems Engineering) and downstream information (IoT-based operation and service models).

Oleg Shilovitsky addresses the same need from a data point of view in his recent blog: PLM Strategy For Post COVID Time. He makes an important point about the Digital Thread:

Digital Thread is one of my favorite topics because it is leading directly to the topic of connected data and services in global manufacturing networks.

I agree with that statement, as the digital thread is, like MBD, another stepping stone to organizing information in a connected manner, even beyond the scope of the engineering-manufacturing interaction. However, the Digital Thread is an intermediate step toward a fully data-driven and model-based enterprise.

To master all these new ways of working, it is crucial for the management of manufacturing companies, both OEMs and their suppliers, to initiate learning programs. Not as a Proof of Concept, but as a real-life, growing activity.

Why is MBD not yet a common practice?

If you look at the success of MBD in Aerospace & Defense and Automotive, one of the main reasons was the push from the OEMs to align their suppliers. They even dictated CAD systems and versions to enable smooth and efficient collaboration.

In other industries, there were not so many giant OEMs that could dictate their supply chain. Often the OEM itself was not even ready for MBD. Therefore, the excuse was often: we cannot push our suppliers to work differently, so let’s keep working as well as possible (the old way, with some automation).

Besides the technical changes, MBD also had a business impact. Where the traditional 2D-Drawing was the contractual and leading information carrier, now the annotated 3D Model has to become the contractual agreement. This is much more complex than browsing through (paper) documents; now, you need an application to open up the content and select the right view(s) or datasets.

In the interaction between engineering and manufacturing, you could hear statements like:

you can use the 3D Model for your NC programming, but be aware the 2D drawing is leading. We cannot guarantee consistency between them.

In particular, this is a business change affecting the relationship between an OEM and its suppliers. And we know business changes do not happen overnight.

Smaller suppliers might even refuse to work with Model-Based Definition, as they consider it extra overhead they do not benefit from.

This is especially true when working with various OEMs, each with their own preferred MBD package content based on their preferred usage. There are standards; however, OEMs often push their preferred proprietary format.

It is about an orchestrated change.

Implementing MBD in your company, like PLM, is challenging because people need to be aligned and trained on new ways of working. In particular, this creates resistance at the end-user level.

Similar to the introduction of mainstream CAD (AutoCAD in the eighties) and mainstream 3D CAD (Solidworks in the late nineties), it requires new processes, trained people, and matching tools.

This is not always on the agenda of C-level people, who tend to avoid technical details (because they don’t understand them) – read this great article: Technical Leadership: A Chronic Weakness in Engineering Enterprises.

I am mainly aware of learning materials coming from the US, not so much from European or Asian thought leaders. Feel free to add other relevant resources for the readers in this post’s comments. Have a look and talk with:

Action Engineering with their OSCAR initiative: Bringing MBD Within Reach. I spoke with Jennifer Herron, founder of Action Engineering, a year ago about MBD and OSCAR in my blog post: PLM and Model-Based Definition.

Another interesting company to follow is Capvidia. A good blog post to start with is: MBD model-based definition in the 21st century.

The future

What you will discover from these two companies is that they focus on the connected flow of information between companies while anticipating that each stakeholder might have their preferred (traditional) PLM environment. It is about data federation.

The future of a connected enterprise is even more complex. So I was excited to see and download Yousef Hooshmand’s paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”.

Yousef and some of his colleagues report on their PLM modernization project at Mercedes-Benz AG, aiming to transform a monolithic PLM landscape into a federated domain and data mesh.

This paper provides a lot of structured thinking related to the concepts I try to explain to my audience in everyday language. See my thoughts in the series The road to model-based and connected PLM.

This paper has much more depth and is a must-read and must-discuss for those interested – perhaps an opportunity for new startups and a threat to traditional PLM vendors.

Conclusion

Vellum drawings are almost gone now – we have electronic 2D drawings. Model-Based Definition has confirmed its benefits in improving the interaction between engineering, manufacturing & suppliers. Still, many industries struggle with this approach due to the process & people changes needed. If you are not able or willing to implement a model-based definition approach, be worried about the future. Ecosystems will only run efficiently (and survive) when their information exchange is based on data and models. Start learning now.


 

Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for, or framed by, the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities were meant to make engineering teams happy by giving them more control over their data. Unfortunately, that did not work out, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In modern terms, the digital thread results from building such an infrastructure with related artifacts. The digital thread is a rather slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the various PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.
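To illustrate what “quality of the relations” could mean in practice, below is a minimal, hypothetical sketch: each traceability link carries a relation type and a verified flag, so a digital thread query can distinguish trustworthy links from assumed ones. All artifact names and attributes are invented.

```python
# Hypothetical sketch: traceability links between artifacts carry metadata,
# so the digital thread can express the *quality* of each relation.

links = [
    # (source, target, relation type, verified by a process?)
    ("REQ-042", "EBOM-100", "satisfied-by", True),
    ("EBOM-100", "MBOM-100", "realized-by", True),
    ("MBOM-100", "SN-778", "produced-as", False),  # assumed, not verified
]

def trace(artifact: str, links: list) -> list:
    """Follow links downstream, reporting relations that are not verified."""
    results = []
    for source, target, rel, verified in links:
        if source == artifact:
            quality = "verified" if verified else "UNVERIFIED"
            results.append((target, rel, quality))
            results.extend(trace(target, links))
    return results

for target, rel, quality in trace("REQ-042", links):
    print(f"REQ-042 ~> {target} ({rel}, {quality})")
```

A traceability report that flags unverified relations is far more valuable than a diagram of boxes and arrows that all look equally reliable.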

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxime Gravel about the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. As a result of the data-driven approach, a digital PLM environment can be connected to processes outside the engineering domain. I wrote a series of posts on the potential of such a new PLM infrastructure (The road to model-based and connected PLM).

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management to service and, if relevant, decommissioning. The bigger challenge is connecting entire ecosystems to the same infrastructure, in particular suppliers & partners, but also customers. This is the new platform paradigm.

Some years ago, people stated that IoT is the new PLM (IoT is the new PLM – PTC 2017), or that MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don’t believe Digital Transformation points to a domain; it is more an ongoing process that most companies have to go through. And because the term is so commonly used, it has become too vague for the specifics of our domain. I liked Monica Schnitger‘s LinkedIn post: Digital Transformation? Let’s talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a “digitally transformed” company would look like. What would it mean for their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

While writing this post, I finished recording a podcast with Nina Dar. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a future data-driven environment becomes a System of Engagement. In this environment, you can simulate assumptions and verify and qualify trade-offs in real-time. And not only product behavior; you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are connected to the business in real-time. PLM will still have its system of record needs; however, the real value will come from real-time collaboration.

The traditional PLM consultant should transform into a business consultant, understanding technology. Historically this was the opposite, creating friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single-digit, whereas new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post, the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

An interesting read is this article from Jan Bosch that I saw this morning: Resistance to Change. Read the article, as it makes so much sense; but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in Systems Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think – should we leave the comfortable term PLM behind us for a new frame?

Yes, it is not a typo. Clayton Christensen’s famous book, written in 1997, discussed the Innovator’s Dilemma: when new technologies cause great firms to fail. This was the challenge two decades ago. Existing prominent companies could become obsolete quickly as they were bypassed by new technologies.

The examples are well known. To mention a few: DEC (Digital Equipment Corporation), Kodak, and Nokia.

Why the innovation dilemma?

This decade, the challenge has become different. All companies are forced to become more sustainable in the next ten years, either pushed by global regulations or by their customers’ demands. Besides the priority of reducing greenhouse gas emissions, there is also the need to transform our society from a linear, continuous-growth economy into a circular doughnut economy.

The circular economy makes the creation, the usage and the reuse of our products more complex as the challenge is to reduce the need for raw materials and avoid landfills.

The circular economy concept – the regular product lifecycle in the middle

The doughnut economy makes the values of an economy more complex: it is not only about money and growth; human and environmental factors must also be considered.

Doughnut Economics: Trying to stay within the green boundaries

To manage this complexity, I wrote SYSTEMS THINKING – a must-have skill in the 21st century, focusing on the logical part of the brain. In my follow-up post, Systems Thinking: a second thought, I looked at the human challenge. Our brain is not rational and wants to think fast to solve direct threats. Therefore, we have to overcome our old brains to make progress.

An interesting and thought-provoking perspective was shared by Nina Dar in this discussion, including the video below. The 17 Sustainable Development Goals (SDGs) describe what needs to be done. However, we also need the Inner Development Goals (IDGs) to connect the human side. Watch the movie:

Our society needs to change and innovate; however, we cannot. The Innovation Dilemma.

The future is data-driven and digital.

What is clear to me is that companies developing products and services have only one way to move forward: becoming data-driven and digital.

Why data-driven and digital?

Let’s look at something companies might already practice: REACH (Registration, Evaluation, Authorization and Restriction of Chemicals). This European regulation, introduced in 2007, aims to protect human health and the environment by communicating information on chemicals up and down the supply chain. This ensures that manufacturers, importers, and their customers are aware of information relating to the health and safety of the products supplied.

The regulation still suffers in execution, as most of the reporting and evaluation of chemicals is done manually. Suppliers report their chemicals in documents, companies aggregate the chemicals in their summary reports, and finally, authorities have to go through these reports.

Even where the scale of REACH is limited, the manual effort for end-to-end reporting is relatively high. In addition, skilled workers are needed to do the job because reporting is done in a document-based manner.
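As a thought experiment, here is a minimal sketch of what data-driven substance reporting could look like: supplier declarations as structured records instead of documents, checked automatically against a threshold. Part numbers and masses are invented; the 0.1% weight-by-weight threshold mirrors the REACH candidate-list (SVHC) communication duty.

```python
# Hypothetical sketch: supplier substance declarations as structured data
# instead of documents. A compliance check then becomes an automated query.

declarations = [
    {"part": "P-1", "part_mass_g": 120.0, "substance": "lead", "mass_g": 0.2},
    {"part": "P-2", "part_mass_g": 80.0, "substance": "DEHP", "mass_g": 0.02},
]

def svhc_report(declarations: list, threshold: float = 0.001) -> list:
    """Flag parts where a declared substance exceeds the w/w threshold."""
    flagged = []
    for d in declarations:
        concentration = d["mass_g"] / d["part_mass_g"]
        if concentration > threshold:
            flagged.append((d["part"], d["substance"], concentration))
    return flagged

for part, substance, conc in svhc_report(declarations):
    print(f"{part}: {substance} at {conc:.2%} w/w exceeds 0.1% - inform customers")
```

With declarations as data, the roll-up across thousands of parts happens in seconds; with declarations as documents, it needs skilled people and weeks of work.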

Life Cycle Assessments (LCA)

While you might think REACH is relatively simple, the real new challenge for companies is the need to perform Life Cycle Assessments (LCA) for their products. The Wikipedia definition of LCA says:

Life cycle assessment or LCA (also known as life cycle analysis) is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product’s manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave)

This will be a shift in the way companies need to define products. Much more thinking and analysis are required in the early design phases. Before committing to a physical solution, engineers and manufacturing engineers need to simulate and calculate the impact of their design decisions in the virtual world.

This is where the digital twin of the design and the digital twin of the manufacturing process becomes relevant. And remember: Digital Twins do not run on documents – you need connected data and various types of models to calculate and estimate the environmental impact.

LCA done in a document-based manner will make your company too slow and expensive.
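To make this concrete, below is a minimal, hypothetical sketch of a data-driven cradle-to-gate roll-up over a BOM. The structure and emission factors are invented for illustration; real LCAs follow ISO 14040/14044 and rely on curated datasets.

```python
# Hypothetical sketch: a cradle-to-gate CO2e roll-up over a BOM where each
# item carries environmental attributes as data. Factors are invented.

bom = {
    "bike-frame": {"children": ["tube-set", "weld-kit"],
                   "process_kg_co2e": 4.0},          # manufacturing step itself
    "tube-set": {"children": [], "process_kg_co2e": 0.5,
                 "material": ("aluminium", 2.1)},    # (name, kg used)
    "weld-kit": {"children": [], "process_kg_co2e": 0.2,
                 "material": ("steel", 0.3)},
}

FACTORS_KG_CO2E_PER_KG = {"aluminium": 16.1, "steel": 2.0}  # illustrative

def footprint(item: str) -> float:
    """Recursively aggregate CO2e over the BOM structure."""
    node = bom[item]
    total = node["process_kg_co2e"]
    if "material" in node:
        name, mass = node["material"]
        total += FACTORS_KG_CO2E_PER_KG[name] * mass
    return total + sum(footprint(child) for child in node["children"])

print(f"bike-frame cradle-to-gate: {footprint('bike-frame'):.1f} kg CO2e")
```

Because the roll-up is just a traversal of connected data, an engineer can compare design alternatives in the early phases, exactly where the environmental impact is decided.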

I described this needed transformation in my series from last year: The road to model-based and connected PLM – nine posts exploring the technology and concept of a model-based, data-driven PLM infrastructure.

Digital Product Passport (DPP)

The European Commission has published an action plan for the circular economy, one of the most important building blocks of the European Green Deal. One of the defined measures is the gradual introduction of a Digital Product Passport (DPP). As the quality of an LCA depends on reliable and trustworthy information about products and materials, the DPP aims to ensure that circular economy metrics become reliable.

This will be a long journey. If you want to catch a glimpse of the complexity, read this Medium article: The digital product passport and its technical implementation related to the DPP for batteries.

The innovation dilemma

Suppose you agree with my conclusion that companies need to change their current product or service development to a data-driven and model-based approach. In that case, the question comes up: where to start?

Becoming data-driven and model-based is, of course, not the business driver. However, this change is needed to be able to perform Life Cycle Assessments and comply with current and future regulations while remaining competitive.

A document-driven approach is a dead-end.

Now let’s look at the real dilemmas by comparing a startup (clean sheet / no legacy) and an existing enterprise (experience with the past/legacy). Is there a winning approach?

The Startup

Having lived in Israel – the nation where almost everyone is a startup – and having worked with startups in the past 10 years, I am always inspired by the energy of the people in startup companies. Most of the time, they have a unique value proposition, and they want to be visible in the market as soon as possible.

This approach is the opposite of systems thinking. It is often a very linear process to deliver this value proposition without exploring the side effects of such an approach.

For example, take the new “green” transportation hype. Many cities have been flooded with “green” scooters and electric bikes to promote transportation as a service. The idea behind this concept is that citizens no longer need to own polluting motorbikes or cars, and means of transportation will be shared. Therefore, the city will be cleaner and greener.

However, these “green” vehicles are often designed in the traditional linear way. Is there a repair plan or a plan to recycle the batteries? A plan to reuse the materials? Most of the time, not. Please, if you have examples contradicting my observations, let me know. I would like to hear good news.

When startup companies start to scale, they need experts to help them grow the company. Often these experts are seasoned people, perhaps close to retirement. They will share their experience and what they know best from the past:  traditional linear thinking.

As a result, even though startup companies can start with a clean sheet, their focus on delivering the product or service blocks further thinking. Instead, the seasoned experts will drive the company towards ways of working they know from the past.

Out of curiosity: Do you know or work in a startup that has started with a data-driven and model-based vision from scratch?  Please add the name of this company in the comments, and let’s learn how they did it.

The Existing company

Working in an established company is like being on board a big tanker. Changing its direction takes a clear eye on the target and the navigation skills to get there. Unfortunately, these changes usually take years, as it is impossible to switch the PLM infrastructure and the people’s skills within a short time.

From the bimodal approach in 2015 to the hybrid approach for companies, inspired by the 2017 McKinsey article Toward an integrated technology operating model, I have come to see this as probably the best approach to ensure that change will happen. In this approach – see the image – the organization keeps running on its document-driven PLM infrastructure. This type of infrastructure becomes the system of record, nothing different from what PLM currently is in most companies.

In parallel, you have to start with small groups of people who focus on a new product or a new service. Using a model-based, data-driven approach, they work completely independently from the big enterprise. Their environment can be considered the future system of engagement.

The data-driven approach allows all disciplines to work in a connected, real-time manner. Mastering the new ways of working is usually the task of younger employees who are digital natives. These teams can be complemented by experienced workers who act as coaches. However, the coaches will not work in the new environment; they bring business knowledge to the team.

People cannot work in two modes, but organizations can. As you can see from the McKinsey chart, the digital teams will get bigger and more important for the core business over time. In parallel, when their data usage grows, more and more data integration will occur between the two operation modes. Therefore, the old PLM infrastructure can remain a System of Record and serve as a support backbone for the new systems of engagement.

The Innovation Dilemma conclusion

The upcoming ten years will push organizations to innovate their ways of working to become sustainable and competitive. As discussed before, they must learn to work in a data-driven, connected manner. Both startups and existing enterprises have challenges – they need to overcome the “thinking fast and acting slow” mindset. Do you see the change in your company?

 

Note: Before publishing this post, I read this interesting and complementary post from Jan Bosch: Boost your digitalization: instrumentation.

It is in the air – grab it.

 

In my previous post, “My PLM Bookshelf,” on LinkedIn, I shared some of the books that influenced my thinking related to PLM. As you can see in the LinkedIn comments, other people added their recommendations for PLM-related books to get inspired or more knowledgeable.

 

While reading a book is a personal activity, I now want to share with you how to get educated on PLM in a more interactive manner. In this post, I talk with Peter Bilello, President & CEO of CIMdata. If you haven’t heard of CIMdata and you are active in PLM, there is more to learn on their website HERE. Now let us focus on education.

CIMdata

Peter, knowing CIMdata for its research relevant to the whole PLM community, I am curious to learn what typical kind of training CIMdata provides to its customers.

Jos, throughout much of CIMdata’s existence, we have delivered educational content to the global PLM industry. With knowledge transfer as a core business tenet, we began offering a rich set of PLM-related tutorials at our North American and pan-European conferences starting in the early 1990s.

Since then, we have expanded our offering to include a comprehensive set of assessment-based certificate programs in a broader PLM sense, for example, on systems engineering and digital transformation-related topics. In total, we offer more than 30 half-day classes. All of these can be delivered in person as a custom configuration for a specific client, or through public virtual-live or in-person classes. We have certified more than 1,000 PLM professionals since the introduction of this PLM Leadership offering in 2009.

Based on our experience, we recommend that an organization’s professional education strategy and plans address the organization’s specific processes and enabling technologies. This will help ensure that it drives the appropriate and consistent operations of its processes and technologies.

For that purpose, we expanded our consulting offering to include a comprehensive and strategic digital skills transformation framework. This framework provides an organization with a roadmap that can define the skills an organization’s employees need to possess to ensure a successful digital transformation.

In turn, this framework can be used as an efficient tool for the organization’s HR department to define its training and job progression programs that align with its overall transformation.

 

The success of training

We are both promoting the importance of education to our customers. Can you share with us an example where Education really made a difference? Can we talk about ROI in the context of training?

Jos, I fully agree. Over the years, we have learned that education and training are often minimized (i.e., sub-optimized). This is unfortunate and has usually led to failed or partially successful implementations.

In our view, both education and training are needed, along with strong organizational change management (OCM) and a quality assurance program during and after the implementation.

In our terms, education deals with the “WHY” and training with the “HOW”. Why do we need to change? Why do we need to do things differently? And then “HOW” to use new tools within the new processes.

We have seen far too many failed implementations where sub-optimized decisions were made due to a lack of understanding (i.e., a clear lack of education). We have also witnessed training and education being done too early or too late.

This leads to a reduced Return on Investment (ROI).

Therefore a well-defined skills transformation framework is critical for any company that wants to grow and thrive in the digital world. Finally, a skills transformation framework needs to be tied directly to an organization’s digital implementation roadmap and structure, state of the process, and technology maturity to maximize success.

 

Training for every size of company?

When CIMdata conducts PLM training, is there a difference, for example, when working with a big global enterprise or a small and medium enterprise?

You might think the complexity might be similar; however, the amount of internal knowledge might differ. So how are you dealing with that?

We basically find that the amount of training/education required depends mostly on the implementation scope, meaning the scope of the proposed digital transformation and the current maturity level of the impacted user community.

It is important to measure the current maturity and establish appropriate metrics to measure the success of the training (e.g., are people, once trained, using the tools correctly).

CIMdata has created a three-part PLM maturity model that allows an organization to understand its current PLM-related organizational, process, and technology maturity.

The three-part PLM maturity model

The PLM maturity model provides an important baseline for identifying and/or developing the appropriate courses for execution.

This also allows us, when we are supporting the definition of a digital skills transformation framework, to understand how the level of internal knowledge might differ within and between departments, sites, and disciplines. All of which help define an organization-specific action plan, no matter its size.

 

Where is CIMdata training different?

Most of the time, PLM implementers offer training too for their prospects or customers. So, where is CIMdata training different?

 

For this, it is important to differentiate between education and training. So, CIMdata provides education (the why) and training and education strategy development and planning.

We don’t provide training on how to use a specific software tool. We believe that is best left to the systems integrator or software provider.

While some implementation partners can develop training plans and educational strategies, they often fall short in helping an organization to effectively transform its user community. Here we believe training specialists are better suited.

 

Digital Transformation and PLM

One of my favorite topics is the impact of digitization in the area of product development. CIMdata introduced the Product Innovation Platform concept to differentiate it from traditional PDM/PLM. Who needs to get educated to understand such a transformation, and what does CIMdata contribute to this understanding?

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

It isn’t just about product development, sales, or enabling a new service experience. It is about maximizing a company’s ROI in applying and leveraging digital as needed throughout the organization. The only way an organization can do this successfully is by taking an end-to-end approach.

The Product Innovation Platform is focused on end-to-end product lifecycle management. Therefore, it must work within the context of other enterprise processes that are focused on the business’s resources (i.e., people, facilities, and finances) and on its transactions (e.g., purchasing, paying, and hiring).

As a result, an organization must understand the interdependencies among these domains. If they don’t, they will ultimately sub-optimize their investment. It is these and other important topics that CIMdata describes and communicates in its education offering.

The Product Innovation Platform in a digital enterprise

More than Education?

As a former teacher, I know that a one-time education, a good book or slide deck, is not enough to get educated. How does CIMdata provide a learning or coaching path to its customers?

Jos, I fully agree. Sustainability of a change and/or improved way of working (i.e., long-term sustainability) is key to true and maximized ROI. Here I am referring to the sustainability of the transformation, which can take years.

With this, organizational change management (OCM) is required. OCM must be an integral part of a digital transformation program and be embedded into a program’s strategy, execution, and long-term usage. That means training, education, communication, and reward systems all have to be managed and executed on an ongoing basis.

For example, OCM must be executed alongside an organization’s digital skills transformation program. Our OCM services focus on strategic planning and execution support. We have found that while most companies understand the importance of OCM, they often don’t fully follow through on it.

 

A model-based future?

During the CIMdata Roadmap & PDT conferences, we have often discussed the importance of Model-Based Systems Engineering methodology as a foundation of a model-based enterprise. What do you see? Is it only the big Aerospace and Defense companies that can afford this learning journey, or should other industries also invest? And if yes, how to start.

Jos, here I need to step back for a minute. All companies have to deal with increasing complexity for their organization, supply chain, products, and more.

So, to optimize its business, an organization must understand and employ systems thinking and system optimization concepts. Unfortunately, most people think of MBSE as an engineering discipline. This is unfortunate because engineering is only one of the systems of systems that an organization needs to optimize across its end-to-end value streams.

The reality is all companies can benefit from MBSE. As long as they consider optimization across their specific disciplines, in the context of their products and services and where they exist within their value chain.

MBSE is not just for Aerospace and Defense companies. Still, a lot can be learned from what has already been done there. Also, leading automotive companies are implementing and using MBSE to design and optimize semi- and highly-automated vehicles (i.e., systems of systems).

The starting point is understanding your systems of systems environment and where bottlenecks exist.

There should be no doubt, education is needed on MBSE and how MBSE supports the organization’s Model-Based Enterprise requirements.

Published work from the CIMdata-administered A&D PLM Action Group can be helpful, as can various MBE and systems engineering maturity models, such as the one CIMdata utilizes in its consulting work.

Want to learn more?

Thanks, Peter, for sharing your insights. Are there any specific links you want to provide to get educated on the topics discussed? Perhaps some books to read or conferences to visit?

Jos, as you already mentioned:

  • the CIMdata Roadmap & PDT conferences have provided a wealth of insight into this market for more than 25 years.
    [Jos: Search for my blog posts starting with the text: “The weekend after ….”]
  • In addition, there are several blogs, like yours, that are worth following, and websites, like CIMdata’s education pages and other resources, filled with downloadable reading material.
  • Additionally, there are many user conferences from PLM solution providers and third-party conferences, such as those hosted by the MarketKey organization in the UK.

These conferences have taken place in Europe and North America for several years. Information exchange and formal training and education are offered at many events. Additionally, they provide an excellent opportunity for networking and professional collaboration.

What I learned

Talking with Peter made me aware of a few things again. First, it is important to differentiate between education and training. Where education is a continuous process, training is an activity that must take place at the right time. Unfortunately, we often mix those two terms and believe that people are educated after having followed a training.

Secondly, investing in education is as crucial as investing in hard- or software. As Peter mentioned:

We often start with describing the difference between digitalization and digitization. This is crucial to be understood by an organization’s management team. In addition, management must understand that digitalization is an enterprise initiative.

Systems Thinking is not just an engineering term; it will be a mandate for managing a company, a product and even a planet into the future.

Conclusion

This time a quote from Albert Einstein, supporting my PLM coaching intentions:

“Education is not the learning of facts
but the training of the mind to think.”

 

After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy, and positive outcome for 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk, leaving only the things relevant for work.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator’s Dilemma

A must-read book, written by Clayton Christensen, explaining how new technologies can overthrow established big companies within a very short period. The term Disruptive Innovation comes up here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many examples, even recently, where big established brands disappeared or diminished in a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), the market leader in minicomputers that did not see the threat of the PC. Later examples are Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging), or, as a double example, NOKIA (from paper to market leader in mobile phones, killed by the smartphone).

The book has always inspired me to stay alert for new technologies, however simple they might look, as simplicity is often the answer in the end. I wrote about it in 2012: The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies have mostly been integrated by the major vendors, whose businesses were not really disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote about this book again in The Innovator’s Dilemma and Generation change – image above. At that time, I understood that disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

Future ways of working demand new skills. You need to become a digital native, as COVID-19 pushed many organizations to discover. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains why the human race became so dominant: we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.

The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.

Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive to justify the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, watch the 2015 TED talk above to grasp the concept, and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others like her and Action Engineering could explain Model-Based Definition comprehensively, for example in Europe – with no success.

The Model-Based Enterprise increasingly appears to be the future for companies that want to be competitive or benefit from the various Digital Twin concepts. For that reason, I contacted Jennifer again last year for my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay at their customers’ lowest common denominator: the 2D drawing. For me, model-based definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, it illustrates the complexity of PLM. It is about people and culture, and about many different processes, often disconnected. Everyone considers their own discipline the center of importance. If you believe PLM is only about the best technology, read this book and learn how many other aspects are also relevant.

If you want to read more details, I wrote about the book in 2020: Products2019 – a must-read if you are new to PLM. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at various PLM conferences where we met. In many ways, we share similar ideas. However, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the era when mechatronics became important and, next, the era when systems (hardware and software) became important.

We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked in their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …..

There are more books on my desk that have helped me understand the past or shape the future. As this is a blog post, I will not discuss more of them this time, as I am reaching my 1500 words.

Still books worthwhile to read – click on their images to learn more:

I discussed this book twice last year: an introduction in PLM and Modularity, and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned it in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional plm. You could see it as an introduction to Systems Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will focus further on PLM education. So stay tuned and keep on learning.

As promised in my early November post – The road to model-based and connected PLM (part 9 – CM), I come back with more thoughts and ideas related to the future of configuration management. Moving from document-driven ways of working to a data-driven and model-based approach fundamentally changes how you can communicate and work efficiently.

Let’s be clear: configuration management’s target is first of all risk management – ensuring your company’s business remains sustainable, efficient, and profitable.

By providing the appropriate change processes and guidance, configuration management both avoids costly mistakes and iterations during all phases of a product lifecycle and guarantees the quality of the product and its information to ensure safety.

Companies that have not implemented CM practices probably have not observed these issues. Or they have not realized that the root cause of these issues is a lack of CM.

Similar to what is said about PLM in smaller companies, CM is often seen as overhead, as employees believe they thoroughly understand their products. In addition, CM is seen as a hurdle to innovation because of the standardization of practices. So yes, they think it is normal that there are sometimes problems. That’s life.

I already wrote about this topic in 2010: PLM, CM and ALM – not sexy 😦 – where ALM means Asset Lifecycle Management, my focus at that time.

Hear it from the experts

To shape the discussion related to the future of Configuration Management, I had a vivid discussion with three thought leaders in this field: Lisa Fenwick, Martijn Dullaart and Maxime Gravel. A short introduction of the three of them:

Lisa Fenwick, VP Product Development at CMstat, a leading company in Configuration Management and Data Management software solutions and consulting services for aviation, aerospace & defense, marine, and other high-tech industries. She has over 25 years of experience with CM and Deliverables Management, including both government and commercial environments.

Ms. Fenwick has achieved CMPIC SME, CMPIC CM Assessor, and CMII-C certifications. Her experience includes implementing CM software products, CM-related consulting and training, and participation in the SAE and IEEE standards development groups.

Martijn Dullaart is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Institute for Process Excellence (IPX) Congress. Martijn has his own blog, mdux.net, and you might have seen him recently during the PLM Roadmap & PDT Fall conference in November – his thoughts about the future of CM can be found on his blog here.

Maxime Gravel, Manager Model-Based Engineering at Moog Inc., a worldwide designer, manufacturer, and integrator of advanced motion control products. Max has been the director of the model-based enterprise at the Institute for Process Excellence (IPX) and Head of Configuration and Change Management at Gulfstream Aerospace, which certified the first aircraft in a 3D model-based environment.

What we discussed:

We had an almost one-hour discussion related to the following points:

  • The need for Enterprise Configuration Management – why and how
  • The needed change from document-driven to model-based – the impact on methodology and tools
  • The “neural network” of data – connecting CM to all other business domains, a view similar to that of the PLM domain

What I kept from our discussion is the importance of planning – as seen in the CMstat image on the left.

Plan which data you need to manage and how you will manage that data. How often do you do this in your company’s projects?

Next, all participants stressed the importance of education and training on this topic – get educated. Configuration Management is not a topic that is taught at schools. Early next year, I will come back to education, as its benefits are often underestimated. Not everything can be learned by “googling.”

Conclusion

The journey towards a model-based and data-driven future is not a quick one, realized simply by new technologies. However, it is interesting to learn that the future of connected data (the “neural network”) allows organizations to implement both CM and PLM in a similar manner, using graph databases and automation. When executed at the enterprise level, the result will be that CM and PLM become natural practices instead of siloed, system-related disciplines.
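To make the “neural network” of data tangible, here is a minimal sketch using the Python networkx library: artifacts from different domains become nodes, relations become edges, and change-impact analysis becomes a graph traversal. All node names are invented for illustration.

```python
# Minimal sketch of the "neural network" of connected data: artifacts from
# different domains as nodes, relations as edges. A change-impact analysis
# then becomes a graph traversal instead of a manual search through documents.

import networkx as nx

thread = nx.DiGraph()
thread.add_edges_from([
    ("REQ-12", "FUNC-3"),          # requirement realized by a function
    ("FUNC-3", "PART-801"),        # function implemented by a part
    ("PART-801", "MBOM-801"),      # part planned for manufacturing
    ("MBOM-801", "WORKORDER-55"),  # manufacturing execution
    ("PART-801", "SIM-9"),         # simulation model using the part
])

def impact_of_change(artifact: str) -> set:
    """Everything downstream that must be reviewed when 'artifact' changes."""
    return nx.descendants(thread, artifact)

print(sorted(impact_of_change("FUNC-3")))
# -> ['MBOM-801', 'PART-801', 'SIM-9', 'WORKORDER-55']
```

In such an environment, a change board no longer asks "which documents mention this part?" but queries the connected data directly – the same mechanism serving both CM and PLM.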

Most of the methodology is there; making the implementation smooth and embedded in organizations will be the topic to learn. Join us in discussing and learning!

 

In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.

My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. Also this time, Matthias Ahrens (HELLA) shared a relevant but very academic article in this context – how to harmonize company information.

For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.

The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).

This time I want to share my thoughts about the two statements from my introductory post, i.e.:

A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.

A model-based approach with connected datasets

We discussed connected datasets in the previous post; now, let’s explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model – to be more precise, the 3D CAD model. However, there are many other types of models used in product development, delivery and operations.

A model can be a:

Physical Model

  • A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model

Conceptual Model

  • A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
  • A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall in this category
  • A CGI (Computer Generated Imagery) or 3D CAD model is probably the most associated model in the mind of traditional PLM practitioners
  • Functional and Logical Models, describing the services and components of a system, are crucial in MBSE

Operational Model

  • A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin’s training performance model is such an operational model.

The list of all models above is not extensive nor academically defined. Moreover, some model term definitions might overlap, e.g., where would we classify software models or manufacturing models?

All models are a best-so-far approach to describing reality. Based on more accurate data from observations or measurements, the model comes closer to what happens in reality.

A model and its data

Never blame the model when there is a difference between what the model predicts and the observed reality. It is still a model.  That’s why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.

Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.

By using data analysis complementary to the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).

There are two points I would like to highlight for this paragraph:

  • A model is never 100% the same as reality – so don't worry about deviations. There will always be a difference between what is virtually predicted and what is physically measured – most of the time because reality has many more influencing parameters.
  • The more qualified data we use in the model, the closer it gets to reality – so focus on accurate (and the right) data for your model. As it is usually impossible to model a system completely, focus on the most significant data sources.
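To illustrate such a feedback loop from the physical to the virtual world, here is a minimal sketch in Python. The simple performance model, the learning rate and all numbers are invented for illustration:

    # A minimal feedback loop from the physical to the virtual world:
    # the model predicts, we compare against measurements, and we nudge
    # the model parameter to reduce the deviation. All numbers are invented.

    def predict(load: float, efficiency: float) -> float:
        """A trivially simple performance model: output = efficiency * load."""
        return efficiency * load

    efficiency = 0.90            # initial model assumption
    learning_rate = 0.05
    measurements = [(100.0, 82.0), (120.0, 99.0), (80.0, 66.0)]  # (load, measured output)

    for load, measured in measurements:
        predicted = predict(load, efficiency)
        error = measured - predicted
        # Adjust the efficiency parameter proportionally to the relative error
        efficiency += learning_rate * (error / load)
        print(f"load={load:.0f} predicted={predicted:.1f} "
              f"measured={measured:.1f} -> efficiency={efficiency:.3f}")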

The ultimate goal: THE DIGITAL TWIN

The discussion related to data-driven approaches and the usage of models might feel abstract and complex (and it is). However, the term "digital twin" is well known and even used in boardrooms.

The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.

My statement and reason for this series of blog posts: Digital Twins do not run on documents; you need a data-driven, model-based infrastructure to benefit efficiently from digital twin concepts.

Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model.
A puzzle every company has to solve, as there is no 100 percent blueprint at this time.

Are Low Code platforms the answer?

I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that by combining datasets from different systems and platforms, we can provide any user with the needed information in real time. My statement from my introductory post was:

I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?

Let’s look into some of the low-code platform messages mentioned by Low-Code advocates:

You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)

This statement reminded me of the early days of SmarTeam implementations. With a Data Model Wizard, a Form Designer, and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management, and CAD integrations, combined with easy customizations.

The sky was the limit to satisfy end users. There was no need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.

A lot of my activities between 2003 and 2010 were related to fixing the problems this flexibility created and making sense (again) of customizations. I wrote about this in a 2015 post, The importance of a (PLM) data model, sharing my experiences of "fixing" issues created by flexibility.

Think first

The challenge is that an enthusiastic team creates a (low code) solution rapidly. Immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job,  right?

Documentation and a broader visibility are often lacking when implementing such a solution.

For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information you consume is accurate. The information may have been valid when you created the app.

However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
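One pragmatic safeguard is to validate the data an app consumes against an explicit contract instead of silently trusting it. A minimal sketch in Python – the field names and rules are hypothetical – could look like this:

    # A minimal data contract check: before an app consumes product data,
    # verify the fields it depends on are present and sensible.
    # Field names and rules are hypothetical examples.

    REQUIRED_FIELDS = {"item_id": str, "revision": str, "status": str, "weight_kg": float}
    VALID_STATUSES = {"in_work", "released", "obsolete"}

    def validate_record(record: dict) -> list[str]:
        """Return a list of violations; an empty list means the record passes."""
        violations = []
        for field, expected_type in REQUIRED_FIELDS.items():
            if field not in record:
                violations.append(f"missing field: {field}")
            elif not isinstance(record[field], expected_type):
                violations.append(f"wrong type for {field}")
        if record.get("status") not in VALID_STATUSES:
            violations.append(f"unknown status: {record.get('status')}")
        return violations

    print(validate_record({"item_id": "P-1001", "revision": "B",
                           "status": "released", "weight_kg": 2.5}))  # -> []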

Easy tools have often led to spaghetti, from Clipper (the old days) and Visual Basic (the less old days) to highly customizable systems (like Aras is promoting) and future low-code platforms (and Aras is there again).

However, the strength of being highly flexible is also a weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee a reliable anchoring of the solution.

The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:

There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the "shadow IT" phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don't work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.

The fundamental difference: from coordinated to connected

For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.

Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead to the design of a modern digital enterprise, not the optimization and connectivity between organizational siloes.
Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.

Next, you need to analyze if these gaps are so significant that it requires a technology change. Probably it does, as historically, systems were not designed to share data horizontally in an organization.

In this context, have a look at Lionel Grealou's article for Engineering.com: Data Readiness in the new age of digital collaboration.

Conclusion

We discussed the crucial relation between models and data. Models only have value if they acquire the right and accurate data (exercise 1).

Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple at this moment in transformational times.  

The next and final post in this series will focus on configuration management – a new approach is needed. I don't have the answers, but I will share some thoughts.

A recommended event with an exciting agenda – a good place to validate and share your thoughts.

I will be there and look forward to meeting you at this conference (unfortunately still virtual).

In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise, concluding that it is mission impossible to aim for a single connected environment. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value-chain-oriented (OEM, Supplier, Marketplace, Supply Chain hub).

In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:

Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.

 

System of Record and System of Engagement

In the image below, a slide from 2016,  I show a simplified view when discussing the difference between the current, coordinated approach and the future, connected approach.  This picture might create the wrong impression that there are two different worlds – either you are document-driven, or you are data-driven.

In the follow-up of this presentation, I explained that companies will need both environments in the future. The most efficient way of working for operations will be the infrastructure on the right side, the platform-based approach using connected information.

For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers, disconnected regulatory bodies and probably crucial for configuration management.

The System of Record will probably remain as a capability in every platform or cross-section of platform information. The Systems of Engagement will be the configured real-time environment for anyone involved in active company processes – not only ERP or MES, but all execution.

Introducing SysLM and SLM

This summer, I received a copy of Martin Eigner's System Lifecycle Management book, which I am reading in my spare moments. I have always enjoyed Martin's presentations, and in many ways, we share similar ideas. Martin, from his profession, spent more time on the academic aspects of product and system lifecycle than I did. I, on the other hand, have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and for sure, I will come back to the book when I have finished it.

A first impression: a great and interesting book for all. Martin and I share the same history of data management. Read all about this in his second chapter: Forty Years of Product Data Management.

From PDM via PLM to SysLM is a chapter that everyone should read if you haven't lived it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin's book will be highly complementary, given the content he describes.

There is one point on which I am looking forward to feedback from the readers of this blog.

Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management (SysLM)?

In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company’s offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, would we use the term Solution Lifecycle Management as not only hardware and software might be part of the value proposition?

Martin consistently uses the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that neither abbreviation is unique or explicit enough. SysLM creates confusion with SysML (for dyslexic people or fast readers). SLM already has so many less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.

For the moment, I will use the abbreviation SLM, leaving it open whether it stands for System Lifecycle Management or Solution Lifecycle Management.

 

How to implement both approaches?

In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.

A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.

However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.

Software and services implementations are associated with a data-driven, model-based approach.

The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.

Depending on your company's value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling disconnected hardware, there is probably no need to change your internal PLM processes that much.

However, when you are moving to a connected business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.

A McKinsey concept I have been promoting several times illustrates a potential path – note that the article was written not with a PLM mindset but with a business mindset.

What about Configuration Management?

The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry, to guarantee appropriate performance and to reduce the risk and cost of fixing issues.

The challenge, however, is that configuration management processes are not designed to manage systems or solutions, where dynamic updates can be applied, whether or not initiated by the customer.

This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).

For that reason, I am curious to learn more from Martijn Dullaart's presentation at the upcoming PLM Roadmap/PDT conference. The title of his session: The next disruption please …

In his abstract for this session, Martijn writes:

From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.

One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes, extending the validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes for its hardware.

Next, the systems or solutions in the field will report (or validate) their configuration against validation rules. A topic that requires a long discussion and more than this blog post – potentially a full conference.
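To make this idea a bit more tangible, here is a minimal sketch in Python of a fielded system reporting its configuration, which is then checked against validation rules instead of against an enumerated list of approved configurations. All rules and field names are invented for illustration:

    # A sketch of rule-based configuration validation: instead of enumerating
    # every approved configuration, the fielded system reports its actual
    # configuration, which is checked against validation rules.
    # All rules and field names are hypothetical.

    reported_config = {
        "hardware_rev": "B",
        "firmware": "2.4.1",
        "feature_pack": "winter-edition",
    }

    def fw_at_least(minimum: str):
        """Rule factory: firmware version must be >= minimum (numeric compare)."""
        def rule(cfg: dict) -> bool:
            return tuple(map(int, cfg["firmware"].split("."))) >= \
                   tuple(map(int, minimum.split(".")))
        return rule

    VALIDATION_RULES = {
        "supported hardware revision": lambda cfg: cfg["hardware_rev"] in {"B", "C"},
        "minimum firmware for feature packs": fw_at_least("2.3.0"),
    }

    for name, rule in VALIDATION_RULES.items():
        print(f"{name}: {'PASS' if rule(reported_config) else 'FAIL'}")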

Therefore, I am looking forward to participating in the CIMdata/PDT Fall conference and picking up the discussions towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.

 

Conclusion

A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company's value proposition.

If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.

Please, share your thoughts in the comments.

 

My previous post introducing the concept of connected platforms created some positive feedback and some interesting questions. For example, the question from Maxime Gravel:

Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?

is one of the questions I am trying to understand too. You can see my short comment in the comments here. However, while discussing with other experts in the CM domain, we still need to paint the path forward. Because if we cannot solve this type of question, the value of connected platforms will be disputable.

It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 – 20 years until we are familiar with the concepts.

It took a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster; the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative industries at this time, and other industries or startups might lead us faster into the future.

Although I prefer discussing methodology, I believe I first need to clarify some more technical points before moving forward. My apologies for writing it in such a simple manner – this way, the information should be accessible to the majority of readers.

What does data-driven mean?

I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, each containing a single aspect of information in a standardized manner, so it becomes accessible to outside tools.

A document is not a dataset, as it often includes a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status:

  • An identifier, to be able to create connections between datasets – traceability or, in modern words, a digital thread.
  • A classification, as the classification identifier will determine the type of information the dataset contains and potentially a set of mandatory attributes.
  • A status, to understand if the dataset is stable or still in work.
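As a minimal illustration – the attribute names are an assumption for this sketch, not a standard – such a dataset could be represented in Python like this:

    # A minimal representation of a dataset as discussed above:
    # an identifier, a classification and a status, plus the payload.
    # Attribute names are illustrative, not a standard.

    from dataclasses import dataclass, field

    @dataclass
    class Dataset:
        identifier: str        # unique ID, enabling digital-thread connections
        classification: str    # determines the type of information (and mandatory attributes)
        status: str            # e.g. "in_work", "released", "obsolete"
        attributes: dict = field(default_factory=dict)

    part = Dataset(
        identifier="ITEM-10045/B",        # ID + revision
        classification="mechanical_part",
        status="released",
        attributes={"material": "AlMg3", "mass_kg": 0.42},
    )
    print(part.identifier, part.status)

The point is not the class itself, but that every dataset carries the three elements that make it connectable.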

Examples of a data-driven approach – the item

The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.

Part classification can be a topic on its own, and every industry has its taxonomy.

Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.

In a data-driven manner, a part can occur in several Bill of Materials – an example of a single definition consumed in other places.

When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.
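A minimal sketch of such a where-used analysis over connected datasets – the BOM content below is invented – could look like this in Python:

    # A where-used analysis: given connected BOM datasets, find every
    # assembly that consumes a part - trivial in a data-driven environment.
    # The BOM content below is invented for illustration.

    boms = {
        "ASSY-100": ["ITEM-10045", "ITEM-10046"],
        "ASSY-200": ["ITEM-10045", "ITEM-10099"],
        "ASSY-300": ["ITEM-10046"],
    }

    def where_used(item_id: str) -> list[str]:
        """Return all assemblies in which the item occurs."""
        return [assy for assy, items in boms.items() if item_id in items]

    print(where_used("ITEM-10045"))   # -> ['ASSY-100', 'ASSY-200']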

When the part would change in a document-driven environment, the effort is much higher.

First, all documents need to be identified where this part occurs. Then the impact of change needs to be managed in document versions, which will lead to other related changes if you want to keep the information correct.

Examples of a data-driven approach – the requirement

Another example illustrating the benefits of a data-driven approach is implementing requirements management, where requirements become individual datasets.  Often a product specification can contain hundreds of requirements, addressing the needs of different stakeholders.

In addition, several combinations of requirements need to be handled by other disciplines, mechanical, electrical, software, quality and legal, for example.

As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.

The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).

Or requirements could be linked to test criteria and test cases, without the need to manage documents and to make sure everyone works with the latest version of a document.

As you will see, requirements also need an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
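Building on the same dataset idea, here is a sketch in Python – all requirement texts and identifiers are invented – of requirements as individual datasets that can be grouped and linked to test cases:

    # Requirements as individual datasets: each has an identifier,
    # a classification (enabling grouping) and a status, and can be
    # linked to test cases without freezing a specification document.
    # All content is invented for illustration.

    requirements = [
        {"id": "REQ-001", "classification": "mechanical", "status": "released",
         "text": "Housing shall withstand a 1 m drop."},
        {"id": "REQ-002", "classification": "electrical", "status": "in_work",
         "text": "Standby power consumption shall be below 0.5 W."},
        {"id": "REQ-003", "classification": "mechanical", "status": "dropped",
         "text": "Housing shall be transparent."},
    ]

    # Link requirements to test cases via identifiers (a digital thread)
    test_links = {"REQ-001": ["TC-101", "TC-102"], "REQ-002": ["TC-201"]}

    # Group released mechanical requirements, e.g. to push to a supplier
    supplier_spec = [r for r in requirements
                     if r["classification"] == "mechanical" and r["status"] == "released"]
    for req in supplier_spec:
        print(req["id"], "->", test_links.get(req["id"], []))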

Data-driven and Models – the 3D CAD model

3D PDF Model

When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.

Although we use a 3D representation at each product lifecycle stage, most companies do not have a digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.

I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.

If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.

Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing have been discussed in the blog post PLM and Model-Based Definition.

It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.

If this is not the case, the projected game-changers will not occur as they become too costly.

Data-driven and mathematical models

To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. Look, for example, at climate models, weather models or COVID infection models.

In all these cases, we see that they lead to discussions from so-called experts who believe a model should be 100% correct and that any exception shows the model is wrong.

It is not that the model is wrong; the expectations are false.

For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.

For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.

There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.

The sharable parameters, again, should be datasets in a data-driven environment. Using standardized diagrams, like SysML or UML, enables the objects used in the diagram to become datasets.
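As a conceptual sketch – the parameter names, models and values are invented – sharing a parameter as a dataset between two models could look like this in Python:

    # Sharing parameters between models as datasets: both the thermal and
    # the electrical model reference the same parameter dataset instead of
    # keeping their own copy. Names and values are invented.

    shared_parameters = {
        "PARAM-max-current": {"value": 2.0, "unit": "A", "status": "released"},
    }

    def electrical_losses(param_id: str, resistance_ohm: float) -> float:
        """Electrical model: dissipated power P = I^2 * R, using the shared parameter."""
        current = shared_parameters[param_id]["value"]
        return current ** 2 * resistance_ohm

    def thermal_rise(param_id: str, resistance_ohm: float, k_thermal: float) -> float:
        """Thermal model: temperature rise proportional to the same dissipated power."""
        return k_thermal * electrical_losses(param_id, resistance_ohm)

    # Change the parameter once; both models stay consistent.
    shared_parameters["PARAM-max-current"]["value"] = 2.5
    print(electrical_losses("PARAM-max-current", 0.5))    # -> 3.125 (W)
    print(thermal_rise("PARAM-max-current", 0.5, 8.0))    # -> 25.0 (K)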

I will not dive further into the modeling details as I want to remain at a high level.

It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.

What does data-driven imply?

 

I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions:

  1. Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
  2. Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.
  3. Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
  4. Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world
  5. A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
  6. I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?
  7. Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them in a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM

 

Conclusion

Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.

After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.


So far, I have been discussing PLM experiences and best practices that have changed due to introducing electronic drawings and affordable 3D CAD systems for the mainstream. From vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.

Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.

Now, thanks to digitalization, we can connect and format information in real time, giving every stakeholder in the company's business almost real-time visibility on what is happening (if allowed). We have seen the benefits of platformization, where the value comes from real-time connectivity within an ecosystem.

Apple, Amazon, Uber and Airbnb are the non-manufacturing-related examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), with suppliers and customers – all connected through information, managed in data elements instead of documents. I call it connected PLM.

Vendors have already shared their PowerPoints, movies, and demos of how the future would be in the ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.

The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.

The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.

In the upcoming posts, I will share with you what I see, what I deduct from that and what I think would be the next step to learn.

I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.

I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (avg. 1500 words) and the ideas and contributions coming from you and me.

What is platformization?

In our day-to-day life, we are more and more used to direct interaction between resellers and services providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.

We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.

The platform is responsible for direct communication between all stakeholders.

The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.

Five potential business platforms

The 5 potential platforms

The ERP platform
is mainly dedicated to the company's execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services – as platforms try to connect all stakeholders as much as possible. The ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).

Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform.  A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.

The CRM platform

A platform mainly focusing on customer-related activities. As you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.

The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.

For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.

The IoT platform

These platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.

However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (non-PLM/ERP-related vendors) are Hitachi and Microsoft.

The Product Innovation Platform (PIP)

All the above platforms would not have a reason to exist if there was not an environment where products were invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio and more.

The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all previous platforms, the PIP cannot exist without other platforms and requires connectivity with the other platforms where applicable.

Look below at the CIMdata definition of a Product Innovation Platform.

You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavors): Aras, Dassault Systèmes, PTC and Siemens.

Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.

Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.

Standard interfaces or ad-hoc connections?

Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.

However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.

This would be a PIP/ERP platform collaboration scenario.

Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.

To support such a role, the company might decide to build a custom app or dashboard, combining real-time data from the relevant platforms using standard interfaces (preferred) or APIs, web services, REST services, microservices (for specialists) and, currently in fashion, Low-Code development platforms, which allow users to combine data services from different platforms without being an expert in coding.
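As a hedged sketch of what such a cross-platform query could look like – the endpoints, hosts and JSON fields below are entirely hypothetical, as each real platform has its own API – consider this Python fragment:

    # A hypothetical dashboard fragment combining data from a PIP and an
    # ERP platform over REST. All URLs and field names are invented;
    # real platform APIs will differ.

    import json
    from urllib.request import urlopen

    PIP_API = "https://pip.example.com/api/items"     # hypothetical endpoint
    ERP_API = "https://erp.example.com/api/stock"     # hypothetical endpoint

    def get_json(url: str) -> dict:
        """Fetch and decode a JSON response (no auth/error handling in this sketch)."""
        with urlopen(url) as response:
            return json.loads(response.read())

    def change_impact(item_id: str) -> dict:
        """Combine engineering data (PIP) with actual stock (ERP) for a change decision."""
        item = get_json(f"{PIP_API}/{item_id}")
        stock = get_json(f"{ERP_API}/{item_id}")
        return {
            "item": item_id,
            "revision": item.get("revision"),
            "in_stock": stock.get("quantity"),
            "advice": "use up stock first" if stock.get("quantity", 0) > 0
                      else "implement change immediately",
        }

    print(change_impact("ITEM-10045"))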

Without going too much into technology, the topics in this paragraph require an enterprise architecture and vision. It is opportunistic to think that your existing environment will evolve smoothly into a digital highway for the future by "fixing" demands per user. Your infrastructure is much more likely to end up as congested spaghetti.

In that context, I read an interesting post last week: Low code: A promising trend or Pandora's box. Have a look and decide for yourself.

I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.

In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.

It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, still primarily relying on drawings. It will surely take one generation to switch from document-based engineering to model-based engineering.

Conclusion

In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.

In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.

Your opinion, Your contribution?
