
The summer holidays are over, and with the PLM Global Green Alliance, we are glad to continue with our series: PLM and Sustainability, where we interview PLM-related software vendors, talking about their sustainability mission and offering.

We talked with SAP, Autodesk, and Dassault Systèmes. This week we spoke with Sustaira, and soon we will talk with Aras.  Sustaira, an independent Siemens partner, is the provider of a sustainability platform based on Mendix.

SUSTAIRA

The interview with Vincent de la Mar, founder and CEO of Sustaira, was quite different from the previous interviews. In the earlier interviews, we talked with people driving sustainability in their company and software portfolio. Now with Sustaira, we were talking with a relatively new company with a single focus on sustainability.

Sustaira provides an open platform targeting purely sustainability by offering relevant apps and infrastructure based on Mendix.

Listen to the interview and discover the differences and the potential for you.

Slides shown during the interview and additional company information: Sustaira Overview 2022.

What we have learned

Using the proven technology of the Mendix platform allows you to build a data-driven platform focused on sustainability for your company.

As I wrote in my post: PLM and Sustainability, there is the need to be data-driven and connected with federated data sources for accurate data.

This is a technology challenge. Sustaira, as a young company, has taken up this challenge and provides various apps related to sustainability topics on its platform. Still, they remain adaptable to your organization.

Secondly, I like the concept that although Mendix is part of the Siemens portfolio, you do not need to have Siemens PLM installed. The openness of the Sustaira platform allows you to implement it in your organization independent of your PLM infrastructure.

The final observation – the rule of people, process, and technology – is still valid. To implement Sustaira in an efficient and valuable manner, you need to be clear in your objectives and sustainability targets within the organization. And these targets should be more detailed than the corporate statement in the annual report.

 

Want to learn more?

To learn more about Sustaira and the wide variety of offerings, you can explore any of these helpful links:

 

Conclusion

It was interesting to learn about Sustaira and how they started with a proven technology platform (Mendix) to build their sustainability platform. Being sustainable involves using trusted data and calculations to understand the environmental impact at every lifecycle stage.

Again we can state that the technology is there. Now it is up to companies to act and connect the relevant data sources to underpin and improve their sustainability efforts.

 

While preparing my presentation for the Dutch Model-Based Definition solutions event, I reflected on my experiences discussing Model-Based Definition, particularly in traditional industries. In Aerospace & Defense and Automotive, Model-Based Definition has become the standard. However, other industries face big challenges in adopting this approach. In this post, I want to share my observations and clarify why this approach is important.

 

What is a Model-Based Definition?

The Wikipedia definition of Model-Based Definition is not bad:

Model-based definition (MBD), sometimes called digital product definition (DPD), is the practice of using 3D models (such as solid models, 3D PMI and associated metadata) within 3D CAD software to define (provide specifications for) individual components and product assemblies. The types of information included are geometric dimensioning and tolerancing (GD&T), component level materials, assembly level bills of materials, engineering configurations, design intent, etc.

By contrast, other methodologies have historically required the accompanying use of 2D engineering drawings to provide such details.

When I started to write about Model-Based Definition in 2016, the concept of a connected enterprise was not discussed. At that time, MBD mainly enhanced data sharing between engineering, manufacturing, and suppliers. The 3D model with PMI served as a data package for information exchange between these stakeholders.

The main difference is that the 3D Model is the main information carrier, connected to 2D manufacturing views and other relevant data, all connected in this package.
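To make the idea of the 3D model as the main information carrier more tangible, here is a minimal sketch of what such a package could contain. All class and field names are illustrative assumptions, not a standard MBD schema (standards such as ASME Y14.41 or ISO 16792 define the real practices):

```python
from dataclasses import dataclass, field

@dataclass
class PMIAnnotation:
    """One piece of Product Manufacturing Information, e.g. a GD&T callout."""
    feature: str          # the model face/feature the annotation applies to
    characteristic: str   # e.g. "flatness", "position", "surface finish"
    value: float
    unit: str

@dataclass
class MBDPackage:
    """The 3D model as the main information carrier, with connected data."""
    model_file: str                                               # authoritative 3D model
    annotations: list[PMIAnnotation] = field(default_factory=list)
    manufacturing_views: list[str] = field(default_factory=list)  # derived 2D views
    materials: dict[str, str] = field(default_factory=dict)       # part -> material

# Illustrative content - file names, features and materials are invented
package = MBDPackage(
    model_file="bracket_v3.step",
    annotations=[PMIAnnotation("hole_1", "position", 0.1, "mm")],
    manufacturing_views=["bracket_v3_top.pdf"],
    materials={"bracket": "AlSi10Mg"},
)
```

The point of the sketch: the 2D views and material data hang off the model; they are derived and connected, not independent leading documents.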

 

MBD – the benefits

There is no need to write a blog post related to the benefits of MBD. With some research, you find enough reasons. The most important benefits of MBD are:

  • the information is both human-readable and machine-readable, allowing the implementation of Smart Manufacturing / Industry 4.0 concepts
  • the information relies on processes and data and is no longer dependent on human interpretation. This leads to better quality and fewer errors to fix late in the process.
  • MBD information is a building block for the digital enterprise. If you cannot master this concept, forget the benefits of MBSE and Virtual Twins. These concepts don’t run on documents.
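To illustrate the first bullet: machine-readable means a downstream process can act on the information directly, without a human interpreting a drawing. A toy sketch, where the dictionary schema is an invented example and not a real PMI format:

```python
# A machine-readable tolerance callout that a quality process can evaluate
# automatically - no human interpretation of a 2D drawing needed.
# The schema and values are purely illustrative.
tolerance_spec = {"feature": "hole_1", "nominal": 10.0,
                  "plus": 0.05, "minus": 0.05, "unit": "mm"}

def in_tolerance(spec: dict, measured: float) -> bool:
    """Check a measured value against a machine-readable tolerance callout."""
    return spec["nominal"] - spec["minus"] <= measured <= spec["nominal"] + spec["plus"]

print(in_tolerance(tolerance_spec, 10.03))  # True
print(in_tolerance(tolerance_spec, 10.08))  # False
```

The same structured data could drive NC programming or an automated inspection plan; that is the Industry 4.0 connection.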

To help you discover the benefits of MBD described by others – have a look here:

 

MBD as a stepping stone to the future

When you are able to implement model-based definition practices in your organization and connect with your ecosystem, you are learning what it means to work in a connected manner. Even where the scope is limited, you already discover that working in a connected manner is not the same as mandating everyone to work with the same systems or tools. Instead, it is about new ways of working (skills & people), combined with exchange standards (and deciding which to follow).

Where MBD is part of the bigger model-based enterprise, the same principles apply to connecting upstream information (Model-Based Systems Engineering) and downstream information (IoT-based operation and service models).

Oleg Shilovitsky addresses the same need from a data point of view in his recent blog: PLM Strategy For Post COVID Time. He makes an important point about the Digital Thread:

Digital Thread is one of my favorite topics because it is leading directly to the topic of connected data and services in global manufacturing networks.

I agree with that statement, as the digital thread is, like MBD, another stepping stone to organizing information in a connected manner, even beyond the scope of the engineering-manufacturing interaction. However, the Digital Thread is an intermediate step toward a fully data-driven and model-based enterprise.

To master all these new ways of working, it is crucial for the management of manufacturing companies, both OEMs and their suppliers, to initiate learning programs. Not as a Proof of Concept but as a real-life, growing activity.

Why is MBD not yet a common practice?

If you look at the success of MBD in Aerospace & Defense and Automotive, one of the main reasons was the push from the OEMs to align their suppliers. They even dictated CAD systems and versions to enable smooth and efficient collaboration.

In other industries, there were not so many giant OEMs that could dictate their supply chain. Often, the OEM was not even ready for MBD itself. Therefore, the excuse was often: we cannot push our suppliers to work differently, so let's keep working as well as possible (the old way, with some automation).

Besides the technical changes, MBD also had a business impact. Where the traditional 2D-Drawing was the contractual and leading information carrier, now the annotated 3D Model has to become the contractual agreement. This is much more complex than browsing through (paper) documents; now, you need an application to open up the content and select the right view(s) or datasets.

In the interaction between engineering and manufacturing, you could hear statements like:

you can use the 3D Model for your NC programming, but be aware the 2D drawing is leading. We cannot guarantee consistency between them.

In particular, this is a business change affecting the relationship between an OEM and its suppliers. And we know business changes do not happen overnight.

Smaller suppliers might even refuse to work with a Model-Based Definition, as it is considered extra overhead they do not benefit from, in particular when working with various OEMs that each might have their own preferred MBD package content based on their preferred usage. There are standards; however, OEMs often push for their preferred proprietary format.

It is about an orchestrated change.

Implementing MBD in your company, like PLM, is challenging because people need to be aligned and trained on new ways of working. In particular, this creates resistance at the end-user level.

Similar to the introduction of mainstream CAD (AutoCAD in the eighties) and mainstream 3D CAD (Solidworks in the late nineties), it requires new processes, trained people, and matching tools.

This is not always on the agenda of C-level people, who try to avoid technical details (because they don't understand them). Read this great article: Technical Leadership: A Chronic Weakness in Engineering Enterprises.

I am aware of learning materials coming from the US, but not so much from European or Asian thought leaders. Feel free to add other relevant resources for the readers in this post's comments. Have a look and talk with:

Action Engineering with their OSCAR initiative: Bringing MBD Within Reach. I spoke with Jennifer Herron, founder of Action Engineering, a year ago about MBD and OSCAR in my blog post: PLM and Model-Based Definition.

Another interesting company to follow is Capvidia. A good blog post to start with is MBD model-based definition in the 21st century.

The future

What you will discover from these two companies is that they focus on the connected flow of information between companies while anticipating that each stakeholder might have their preferred (traditional) PLM environment. It is about data federation.

The future of a connected enterprise is even more complex. So I was excited to see and download Yousef Hooshmand's paper: "From a Monolithic PLM Landscape to a Federated Domain and Data Mesh".

Yousef and some of his colleagues report about their PLM modernization project @Mercedes-Benz AG, aiming at transforming a monolithic PLM landscape into a federated Domain and Data Mesh.

This paper provides a lot of structured thinking related to the concepts I try to explain to my audience in everyday language. See my The road to model-based and connected PLM thoughts.

This paper has much more depth and is a must-read and must-discuss writing for those interested – perhaps an opportunity for new startups and a threat to traditional PLM vendors.

Conclusion

Vellum drawings are almost gone now – we have electronic 2D drawings. Model-Based Definition has confirmed its benefits in improving the interaction between engineering, manufacturing & suppliers. Still, many industries are struggling with this approach due to the process & people changes needed. If you are not able or willing to implement a model-based definition approach, be worried about your future. Ecosystems will only run efficiently (and survive) when their information exchange is based on data and models. Start learning now.


 

Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed to the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities made engineering teams happy by giving them more access to their data. But unfortunately, that did not work, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is somehow a slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the different PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.
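Seen as traceability, a digital thread is essentially a graph of typed links between artifacts in those as-xxx structures. A minimal sketch, with invented identifiers and relation names:

```python
# A digital thread as a traceability graph: artifacts from the As-Designed /
# As-Planned / As-Manufactured structures, connected by typed relations.
# All identifiers and relation names are invented for illustration.
links = [
    ("REQ-001", "satisfied_by", "DES-PART-42"),     # requirement -> design
    ("DES-PART-42", "planned_as", "MBOM-ITEM-42"),  # design -> manufacturing plan
    ("MBOM-ITEM-42", "realized_as", "SER-42-0001"), # plan -> physical serial number
]

def trace(start: str) -> list[str]:
    """Follow links downstream from one artifact - the 'thread'."""
    chain, current = [start], start
    while True:
        nxt = next((t for s, _, t in links if s == current), None)
        if nxt is None:
            return chain
        chain.append(nxt)
        current = nxt

print(trace("REQ-001"))  # ['REQ-001', 'DES-PART-42', 'MBOM-ITEM-42', 'SER-42-0001']
```

The interesting discussion then becomes the quality of each relation, not just its existence.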

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxim Gravel when discussing the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. And as a result of the data-driven approach, a digital PLM environment could be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM)

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management to service and, if relevant, decommissioning. The bigger challenge is connecting entire ecosystems to the same infrastructure, in particular suppliers & partners but also customers. This is the new platform paradigm.

Some years ago, people stated IoT is the new PLM  (IoT is the new PLM – PTC 2017). Or MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don't believe Digital Transformation points to a domain; it is more an ongoing process that most companies have to go through. And because it is so commonly used, it has become too vague for the specifics of our domain. I liked Monica Schnitger's LinkedIn post: Digital Transformation? Let's talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a “digital transformed” company would look like. What would impact their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

I just finished recording a podcast with Nina Dar while writing this post. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a new future data-driven environment becomes like a System of Engagement. You can simulate assumptions and verify and qualify trade-offs in real-time in this environment. And not only product behavior, but you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are in real-time connected to the business. Still, PLM will have its system of record needs; however, the real value will come from the real-time collaboration.

The traditional PLM consultant should transform into a business consultant, understanding technology. Historically this was the opposite, creating friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single digit, where new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post: the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

Interesting to read is this article from Jan Bosch that I read this morning: Resistance to Change. Read the article as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in System Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has been already framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think: should we leave the comfortable term PLM behind us for a new frame?

Sustainability has already been a topic on my agenda for many years. So when Rich McFall asked me to start the PLM Global Green Alliance (PGGA) in 2018, I supported that initiative. You can read more about my PLM and Sustainability ideas in this post here.

Last year, I lectured about the relation between PLM and Sustainability. In 2018, the PGGA was a niche alliance trying to find people who would like to work on and share PLM-related practices with others for a greener and more sustainable planet.

Thanks to, or actually due to, the pandemic, climate disasters and the return of the US to supporting the Paris Climate Agreement, it became clear companies need to act. And preferably as soon as possible, which has led to sustainability activities in many companies.

Also, the main PLM vendors started to publish their support and vision for a sustainable future, the area where we believe the PGGA can contribute in spreading the practices and experiences.

For that reason, the PGGA is aiming this year to have a series of discussions with the main PLM Vendors and their sustainability programs.

SAP

This time we are happy to publish an interview with Darren West from SAP. Darren West is the product management lead for SAP’s Circular Economy solutions. His role is to work with customers, sales and pre-sales colleagues, partners, solutions teams and product owners to expand existing and build new sustainability products, particularly those impacting Circular Economy topics.

We are glad to speak with Darren, as we believe sustainability and the circular economy go hand in hand, and this requires systems thinking. We believe SAP, strong in managing materials and manufacturing processes, should be a leader in providing insights for ESG reporting, helping companies to improve the environmental impact of their products and production processes, as they have the data.

Have a look at this 34-minute interview and discussion with Darren West.

The slides shown in this recording can be found  here: Circular Economy -SAP for PLM Green Alliance

What we have learned

The interview showed that SAP is actively working on a sustainable future, both by acting themselves and, even more importantly, by helping their customers change to more sustainable designs and production methods. There is still a way to go, and we do not have much time to sit back. The power of the current SAP Responsible Design and Production module is that it allows companies to understand their environmental impact and improve where possible. In my opinion, this is step 1 toward creating sustainable products and business models.

The second, more general observation, is that we need to make our full product lifecycle management digital and connected. Data-driven is the only way to have efficient processes to estimate and calculate our environmental impact – my favorite From Coordinated to Connected topic.
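To illustrate why data-driven matters here: once product data is connected, an environmental indicator can be rolled up through the product structure instead of being estimated afterward in a spreadsheet. A toy carbon roll-up over a bill of materials, with all part names and kgCO2e figures invented:

```python
# Toy roll-up of embodied carbon through a bill of materials.
# Part names, quantities and kgCO2e figures are invented for illustration.
bom = {
    "bike": [("frame", 1), ("wheel", 2)],   # (child part, quantity)
    "wheel": [("rim", 1), ("tire", 1)],
}
part_footprint = {"frame": 12.0, "rim": 3.0, "tire": 2.5}  # kgCO2e per unit

def footprint(part: str) -> float:
    """Recursively sum the footprint of a part and all its children."""
    own = part_footprint.get(part, 0.0)
    children = sum(qty * footprint(child) for child, qty in bom.get(part, []))
    return own + children

print(footprint("bike"))  # 23.0
```

With connected data, changing one supplier's material figure immediately updates every product that uses it; with disconnected documents, someone has to notice and recalculate.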

Want to learn more?

In the context of this recording, Darren shared the following links for those of you who got inspired by the discussion (in alphabetical order):

Conclusion

This was a motivating session to see PLM-related vendors are taking action. Next time, you will learn more from the design side when we talk with Autodesk about their sustainability program.

Unfortunately, the day after this motivating session, we were shocked by the invasion of Ukraine by Russia. So I am in a mixed mood, as having friends in both countries makes me realize that one dictator can kill people and hope.

Listen to president Zelensky’s speech to the Russian people and get inspired to act against any brainwashing or dictatorship. To my friends and readers, wherever you are, stay strong, informed and human.


 

In the past four weeks, I have been discussing PLM education from different angles through interviews with Peter Bilello (CIMdata), Helena Gutierrez (Share PLM), John Stark (John Stark Associates) and Dave Slawson (Quick Release). Each of these persons brought their specialized focus on PLM.

In this post, I want to conclude and put their expertise in the context of PLM – people, processes and tools.

CIMdata

Originally, CIMdata became known for their CAD/CAM market analysis, later expanding into simulation and PLM vendor analysis. They are still a reference for everyone following the PLM market, providing market numbers and projections related to PLM. Together with ARC, they are for me the two sources to understand what is happening business-wise in the PLM market.

Thanks to the contacts with all the vendors, they have a good overview of what is happening. That makes their strategic advice and training useful for companies that want to benchmark where they are and understand the current trends, all vendor-independent.

Their PLM Roadmap conferences have been one of the few consistent vendor-independent conferences that still take place.

If you search for the term “The weekend after PLM Roadmap …..” you will find many of my reviews of these conferences.

Besides these activities, they are also facilitating industry action groups where similar companies in an industry discuss and evaluate various methodologies and how they could be implemented using various PLM systems – the most visible for me is the Aerospace & Defense PLM Action Group

Share PLM

Share PLM is still a young organization focusing on Humanizing PLM. Their focus is on the end-to-end PLM education process. Starting from an education strategy focused on people, they can organize and help you build attractive and didactic training or e-learnings related to your PLM processes and the systems in use.

Besides their core offering, they are also justifying their name; they really share PLM information. So have a look at their Our Work tab with samples. In particular, as I mentioned in my interview with them, I like their podcasts.

 

In this post, I try to find similar people or companies to those I interviewed.

When looking at Share PLM, Action Engineering in the US comes to my mind. They are the specialists dedicated to helping organizations large and small achieve their Model-Based Definition (MBD) and Model-Based Enterprise (MBE) goals.

To refresh your memory, read my post with Jennifer Herron, the founder of Action Engineering here: PLM and Model-Based Definition

 

John Stark

Although John might be known as a leading writer of PLM books, he is also active in advising companies in their PLM journeys. Somehow similar to what I do, the big difference is that John takes the time to structure the information and write it down in a book. Just have a look at his list of published PLM books here.

My blog posts are less structured and reflect my observations depending on the companies and people I meet. Writing a foundational book about PLM would be challenging, as concepts are radically changing due to globalization and digitization.

John's books are an excellent foundation for students who want to learn PLM's various aspects during their academic years. Students can sit down and take the time to study PLM concepts. Later, if you want to acquire PLM knowledge relevant to your company, you might focus on specialized training, like the ones CIMdata provides.

There are many books on PLM – have a look at this list. Which book to read probably depends a lot on your country and the university you are associated with. In my network, I have recently seen books from Martin Eigner and Uthayan Elangovan. Rosemary Astheimer's book Model-Based Definition in the Product Lifecycle is still on my to-read list.

And then, there is a lot of research done by universities worldwide. So, if you are lucky, there is good education for PLM-related practices in your country.

Quick Release

My post with Quick Release illustrated the challenges of a PLM consultancy company. It showed their efforts to enable their consultants to be valuable for their customers and create a work environment that inspires them to grow and enjoy their work.

Quick Release aims for a competitive advantage to have their consultants participate in actual work for their customers.

Not only from the conceptual point of view but also to get their hands “dirty”.

There are many other PLM consultancy firms. Having worked with Atos, Accenture, Capgemini, Deloitte and PwC, which all have their own PLM practices, you realize that these companies have their own methodologies and preferences. The challenge of their engagements is often the translation of a vision into an affordable roadmap.

Example of Accenture Digital PLM message

Consultancy firms need to be profitable, too, and sometimes they are portrayed as a virus. Once they are in, it is hard to get rid of them.

I do not agree with that statement, as companies often keep relying on consultants because they do not invest in educating their own people. It is a lack of management prioritization or understanding of the importance. Sometimes the argument is: “We are too busy” – remember the famous cartoons.

Consultants cannot change your company; in the end, you have to own the strategy and execution.

And although large consultancy firms might have many trained resources, my experience with these companies is that success often depends on one or two senior consultants. Consultancy is also a human-centric job, being able to connect to the customer in their language and culture.

Good consultants show their value by creating awareness and clarity first. Next, by helping the customer execute their strategy without big risks or hiccups. Finally, a good consultant becomes redundant once the knowledge has been transferred to and digested by the customer.

It is like growing up.

System Integrators

It is a small step from consultancy firms to system integrators, as many consultancy firms have specialists in their company that are familiar with certain vendors’ systems. And you might have discovered that the systems that require the most integration or configuration work have the largest practices globally.

So I did a "quick and dirty" search on LinkedIn, looking for people with the xxx PLM consultant role, where xxx is the name of the PLM vendor.

This was to understand how big the job market is for such specialized PLM consultants.

The image shows the result, and I let you draw your own conclusions.

System Integrators are usually the most important partners for a PLM implementation once you have made your choice. Therefore, when I support a PLM selection process, I always look at the potential implementation partner. Their experience, culture and scale are as important as selecting the best tools.

System Integrators can benefit from their past experiences and best practices. It is a myth that every company is so unique that it should be treated differently. Instead, companies are different because of historical reasons. And these deviations from best practices are sometimes inhibitors instead of advantages.

Related to education, System Integrators are often focused on technical training. Still, they might also have separate experts in training or organizational change management.

 

PLM Vendors

For me, the PLM vendors are the ones that should inspire the customers. Have a look at the “famous” CIMdata slide illustrating the relation between vision, technology and implemented practices – there is a growing gap between the leaders and the followers.

PLM Vendors often use their unique technical capabilities as a differentiator to the competition and inspiration for C-level management. Just think about the terms: Industry 4.0, Digital Twin, Digital Thread, Digital Platform, Model-Based Enterprise and more about sustainability targeted offerings.

The challenge, however, is to implement these concepts in a consistent manner, allowing people in an organization to understand why and what needs to be done.

The PLM vendors' business model is based on software sales or subscriptions. Therefore, they will focus on their benefits and on what competitors fail to do. And as they have the largest marketing budgets, they are the most visible in PLM-related media.

Of course, reality is not that dramatic – education is crucial.

You can also compare PLM vendors with populists. The aim of a populist is to create an audience by claiming they can solve your problems (easily) using simple framing sentences. However, the reality is that the world and the current digitalization in the PLM domain are not simple.

Therefore, we need education, education and education from different sources to build our own knowledge. It is not about the tool first. It is people, process and then tools/technology.

 

People, Process, Tools

Education, and the right education for each aspect of PLM, is crucial to making the right decisions. To simplify the education message, I have tried to visualize and rate each paragraph along the People, Process and Tools dimensions.

What do you think? Does this make sense related to education?

 

Conclusion

Education is crucial at every level of an organization and at every stage of your career. Take your time to read and digest the information you see, and compare and discuss it with others. Be aware of the People, Process and Tools matrix when retrieving information: where does it apply, and why?

I believe PLM is considered complex because we are dealing with people who all have different educational backgrounds and, therefore, an opinion. Invest in alignment to ensure the processes and tools will be used best.

After all my writing about The road to model-based and connected PLM, a topic that interests me significantly is the positive contribution real PLM can have to sustainability.

To clarify this statement, I have to explain two things:

  • First, for me, real PLM is a strategy that concerns the whole product lifecycle: conception, creation, usage and decommissioning.

I use the term real PLM to counter the misconception that PLM is merely an engineering infrastructure or even a single system. We discussed this topic in relation to the post 7 easy tips nobody told you about PLM adoption from my SharePLM peers.

  • Second, sustainability should not be equated with climate change, which gets most of the extreme attention.

The discussion related to climate change and carbon emissions draws most of the attention. Recently, it even seemed that the COP26 conference was only about reducing carbon emissions.

Unfortunately, reducing carbon emissions has become a political and economic discussion in many countries. As I am not a climate expert, I will follow the conclusions of the latest IPCC report.

However, I am happy to participate in science-based discussions, not in conversations about misleading statistics (lies, damned lies and statistics) or a mixture of facts and opinions.

The topic of sustainability is more extensive than climate change. It is about understanding that we live on a limited planet that cannot support the unlimited usage and destruction of its natural resources.

Enough about human beings and emotions, back to the methodology

Why PLM and Sustainability

In the section PLM and Sustainability of the PLM Global Green Alliance website, we explain the potential of this relation:

The goals and challenges of Product Lifecycle Management and Sustainability share much in common and should be considered synergistic. Where in theory, PLM is the strategy to manage a product along its whole lifecycle, sustainability is concerned not only with the product’s lifecycle but should also address sustainability of the users, industries, economies, environment and the entire planet in which the products operate.

If you read further, you will bump into the term Systems Thinking. There might be confusion here between Systems Thinking and Systems Engineering. Let’s look at the differences.

Systems Engineering

For Systems Engineering, I use the traditional V-shape to describe the process. Starting from the needs on the left side, we follow a systematic approach to reach a solution definition at the bottom. Then, going upwards on the right side, we validate step by step that the solution answers the needs.

The famous Boeing “diamond” diagram shows the same approach, complementing the V-shape with a mirrored virtual V-shape. In this way, it provides insights in all directions between the virtual world and the physical world. This understanding is essential when you want to implement a virtual twin of one of these processes/solutions.

Still, systems engineering starts from the needs of a group of stakeholders and works toward the best technical and beneficial solution, most of the time measured only in money.

Systems Thinking

The image below from the Ellen MacArthur Foundation is an example of systems thinking. As you can see, it is not only about delivering a product.

Systems Thinking is a more holistic approach to bringing products to the market. It is about how we deliver a product and what happens during its whole lifecycle. The drivers for systems thinking are therefore not only product performance at the most economical price; we also take into account the impact of resource extraction, the environmental impact during the product’s active life (more and more regulated), and ultimately how to minimize the waste to the ecosystem, which means more recycling and reuse.

If you want to read about systems thinking in a more professional manner, read this blog post from the Millennium Alliance for Humanity and the Biosphere (MAHB): Systems Thinking: A beginning conversation.

Product as a Service (PaaS)

To ensure more responsibility for the product lifecycle, one aspect of the European Green Deal is promoting Product as a Service. There is already a trend toward products as a service; I mentioned Ken Webster’s presentation at the PLM Roadmap & PDT Fall 2021 conference: In the future, you will own nothing, and you will be happy.

If we can switch to such an economy, the manufacturer will have complete control over the product’s lifecycle and its environmental impact. The manufacturer will be motivated to deliver product upgrades and create repairable products, instead of dumping old or broken products because selling new ones is cheaper. PaaS brings opportunities for manufacturers, like greater customer loyalty, but it also pushes them to stay away from so-called “greenwashing”, as they become fully responsible for the entire lifecycle.

A different type of growth

The concept of Product as a Service is not something that typical manufacturing companies endorse. It requires them to restructure their business and rethink their products.

Delivering a Product as a Service requires a fast feedback loop between the products in the field and R&D deciding on improving or adding new features.

In traditional manufacturing companies, the service department is, for historical reasons, far from engineering. However, with the digitization of product information and connected products, we should be able to connect all stakeholders related to our products, even our customers.

A few years ago, I was working with a company that wanted to increase its service revenue by providing maintenance as a service on its products on-site. Their challenge was that the total installation at the customer site was delivered through projects. There was some standard equipment in their solution; ultimately, however, the project organization delivered the final result, and product information was scattered all around the company.

There was some resistance when I proposed creating an enterprise product information backbone (a PLM infrastructure) with aligned processes, as it would force people to work upfront in a coordinated manner. Now, with the digitization of operations, this is no longer a point of discussion.

In this context, I will participate on December 7th in an open panel discussion, Creating a Digital Enterprise: What are the Challenges and Where to Start?, as part of the PI DX Spotlight series. I invite you to join this event if you are interested in hearing various digital enterprise viewpoints.

Doing both?

As companies cannot change overnight, the challenge is to define a transformation path. The push for transformation will surely come from governments and investors in the following decades. Therefore, doing nothing is not a wise strategy.

Earlier this year, the Boston Consulting Group published this interesting article: The Next Generation of Climate Innovation, showing different pathways for companies.

A trend they highlighted is that shareholder returns over the past ten years have been negative for the traditional Oil & Gas and Construction industries (-18 to -6%). Big tech and the first generation of green industries, however, provide high shareholder returns (+30%), and the latest green champions are moving in that direction. In this way, investors will push companies to become greener.

The article talks about the known threat of disrupters coming from outside. It also talks about the decisions companies can make to remain relevant: either you try to reduce the damage, or you have to innovate. (Click on the image below on the left.)

As described before, innovating your business is probably the most challenging part, in particular if you have many years of history in your industry. Processes and people are entrenched in an almost optimal manner (for now).

An example of reducing the damage is what is happening in the steel industry. As making steel requires a lot of (cheap) energy, this industry is powered by burning coal. An innovation to reduce the environmental impact is to redesign the process around green energy, as described in this Swedish example: The first fossil-free production of steel.

On December 9th, I will discuss both strategies with Henrik Hulgaard from Configit. We will discuss how Product Lifecycle Management and Configuration Lifecycle Management can play a role in the future. Feel free to subscribe to this session and share your questions. Click on the image to see the details.

Note: you might remember Henrik from my earlier post this year in January: PLM and Configuration Lifecycle Management (CLM).

Conclusion

Sustainability is a topic that will become more and more relevant for all of us, locally and globally. Real PLM, covering the whole product lifecycle, preferably data-driven, allows companies to transform their current business into a future sustainable business. Systems Thinking is the overarching methodology we have to learn – let’s discuss.

When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.

From coordinated to connected sounds like a business change; however, it all depends on technology. And here I have to thank Marc Halpern (Gartner’s Research VP, Engineering and Design Technologies) again, who came up with the brilliant scheme below:

So now it is time to address the last point from my starting post:

Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics in software changes.

Configuration management at this moment

PLM and CM are often considered overlapping. My March 2019 post, PLM and Configuration Management – a happy marriage?, shares some thoughts related to this point.

Does having PLM or PDM installed mean you have implemented CM? There is confusion because revision management is often considered the same as configuration management. Read my March 2020 post What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, graph-database-based, flexible PLM solution.

To hear it from a CM-side,  I discussed it with Martijn Dullaart in my February 2021 post: PLM and Configuration Management. We also zoomed in on CM2 in this post as a methodology.

Martijn is the Lead Architect for Enterprise Configuration Management at ASML (our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress.

As mentioned before in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference starting this upcoming week.

In this post, I want to talk about the future of CM. For an understanding of the current situation, you can find a broad explanation on Wikipedia. Have a look at CM in the context of the product lifecycle: ensuring that the As-Specified and As-Designed product information matches the As-Built and As-Operated product information.

A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages; CM originated in the Aerospace and Defense industry for that reason. However, companies in other industries have implemented CM practices too, either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.

Historically, configuration management addressed the needs of “slow-moving” products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring the consistency of all referenced datasets was often a manual process.

On purpose, I wrote “referenced datasets”, as the information was most of the time not connected in a single environment. The identifier of a dataset (an item or a document) was the primary information carrier, used for mentally connecting other artifacts to keep consistency.
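As a tiny illustration of what such identifier-based consistency checking means (all names and data below are hypothetical, not from any real PLM system), the mental exercise people performed could be sketched as a check for stale references:

```python
# Hypothetical sketch: in a document-based world, artifacts only reference
# each other by identifier, so consistency has to be checked explicitly.

released = {          # latest released revision per item identifier
    "ITEM-1001": "C",
    "ITEM-2040": "B",
}

documents = [         # documents and the item revisions they reference
    {"doc": "DRW-55", "references": {"ITEM-1001": "C"}},
    {"doc": "SPEC-12", "references": {"ITEM-1001": "B", "ITEM-2040": "B"}},
]

def find_stale_references(documents, released):
    """Return (doc, item, referenced_rev, latest_rev) for outdated references."""
    stale = []
    for d in documents:
        for item, rev in d["references"].items():
            latest = released.get(item)
            if latest is not None and rev != latest:
                stale.append((d["doc"], item, rev, latest))
    return stale

print(find_stale_references(documents, released))
# SPEC-12 still points at ITEM-1001 revision B while revision C is released
```

In a coordinated infrastructure, this kind of cross-checking happened in people’s heads or in spreadsheets; in a connected infrastructure, it can run continuously.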

The Institute of Process Excellence (IPX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.

As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.

As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.

Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat’s EPOCH CM tool is an example of such software. On their website, you can find excellent articles explaining the history of CM and their thoughts on its future.

The future will undoubtedly be a connected, model-based, software-driven environment. Therefore, configuration management processes will have to change. (An impressive buzzword sentence; still, I hope you get the message.)

From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.

Configuration Management – the future

The transition to a data-driven and model-based infrastructure has raised the following questions:

  • How do we deal with the granularity of data? Each dataset needs to be validated. In the document-based approach, a document (a collection of datasets) is validated as a whole. How do we do this efficiently at the dataset level?
  • The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now, we need a mix of hardware and software CM practices.
  • Due to the increased complexity of products (or systems) and the rapid changes caused by software versions, how do we guarantee that the As-Operated product still matches the As-Designed / As-Certified definitions?

I don’t have answers to these questions. I only share observations and trends I see in my actual world.

Granularity of data

The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.

The image on the left, borrowed from Erik Herzog’s presentation at the PLM Roadmap & PDT Fall conference in 2020, is a good illustration of the challenge.

At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.

You can find the agenda and Erik’s presentation here on day 2.

OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of this technical infrastructure.

Still, it is about people and processes first. Therefore, I am curious to learn from my readers who believe and experiment with such a federated infrastructure.

More software

Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this view, you treat software code as a part, with a part number and a revision, and configuration management practices do not have to change. However, there are fundamental reasons why we should decouple hardware and software.

First, for the same hardware, there might be a whole collection of valid software versions. Just like your computer: how many valid software versions, even of the same application, can run on that hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.

A computer, of course, is designed to run all kinds of software versions. However, modern products in the field, like cars, machines and electrical devices, will all have a similar type of software-driven flexibility.

For that reason, I believe that companies delivering software-driven products should design a mechanism to check whether a combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it might be crucial to ensure that invalid combinations cannot exist. Click on the image to learn more.
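To make this tangible, such a validity check, decoupled from the BOM, could be as simple as the sketch below. This is my own illustration with an assumed lookup table of certified combinations; real solutions in this space are far more sophisticated.

```python
# Illustrative sketch only: a minimal hardware/software validity check.
# Hardware revisions and software versions here are invented examples.

valid_combinations = {
    # hardware revision -> set of software versions certified for it
    "CTRL-A.rev2": {"1.4.0", "1.5.0", "1.5.1"},
    "CTRL-A.rev3": {"1.5.1", "2.0.0"},
}

def is_valid(hardware_rev: str, software_ver: str) -> bool:
    """True only if this software version is certified for this hardware."""
    return software_ver in valid_combinations.get(hardware_rev, set())

assert is_valid("CTRL-A.rev3", "2.0.0")      # certified pair
assert not is_valid("CTRL-A.rev2", "2.0.0")  # invalid combination is blocked
```

The point is not the table itself but that the validity rule lives outside the BOM, so software can change at its own pace.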

Tools like Configit or pure::variants might provide such a mechanism. In February 2021, I discussed the unique features of Configit’s solution with Henrik Hulgaard, their CTO, in the post PLM and Configuration Lifecycle Management.

I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.

Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.

Increased complexity – the digital twin?

With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.

Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.

No need for on-site inspection or test-and-fix upgrades in the physical world. Needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on demand), we can validate it.

I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.

An eye-opener and an advocate for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM and working in a model-based environment. See you there?

Conclusion

Finally, we have reached the methodology part of this series, particularly the part related to configuration management and traceability in a very granular, digital environment.

After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat).  What would you ask them?

In my previous post, I discovered that my header for this series is confusing. Although a future implementation of system lifecycle management (SLM/PLM) will rely on models, the most foundational change needed is a technical one to create a data-driven infrastructure for connected ways of working.

My previous article discussed the concept of the dataset, which led to interesting discussions on LinkedIn and in my personal interactions. Also, this time Matthias Ahrens (HELLA) shared again a relevant but very academic article in this context – how to harmonize company information.

For those who want to dive deeper into the concept of connected datasets, read this article: The euBusinessGraph ontology: A lightweight ontology for harmonizing basic company information.

The article illustrates that the topic is relevant for all larger enterprises (and it is not an easy topic).

This time I want to share my thoughts about the two statements from my introductory post, i.e.:

A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as they cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.

A model-based approach with connected datasets

We discussed connected datasets in the previous post; now, let’s explore why models and datasets are related. In the traditional CAD-centric PLM domain, most people will associate the word model with a CAD model, to be more precise, the 3D CAD Model. However, there are many other types of models used related to product development, delivery and operations.

A model can be a:

Physical Model

  • A smaller-scale object for the first analysis, e.g., a city or building model, an airplane model

Conceptual Model

  • A conceptual model describes the entities and their relations, e.g., a Process Flow Diagram (PFD)
  • A mathematical model describes a system concept using a mathematical language, e.g., weather or climate models. Modelica and MATLAB would fall in this category
  • A CGI (Computer Generated Imagery) or 3D CAD model is probably the most associated model in the mind of traditional PLM practitioners
  • Functional and Logical Models describing the services and components of a system are crucial in an MBSE

Operational Model

  • A model providing performance analysis based on (real-time) data coming from selected data sources. It could be an operational business model or an asset performance model; even my Garmin’s training performance model is such an operational model.

The list of models above is neither exhaustive nor academically rigorous. Moreover, some model definitions might overlap: for example, where would we classify software models or manufacturing models?

All models are a best-so-far approach to describing reality. With more accurate data from observations or measurements, the model comes closer to what happens in reality.

A model and its data

Never blame the model when there is a difference between what the model predicts and the observed reality. It is still a model.  That’s why we need feedback loops from the actual physical world to the virtual world to fine-tune the model.

Part of what we call Artificial Intelligence is nothing more than applying algorithms to a model. The more accurate data available, the more “intelligent” the artificial intelligence solution will be.

By using data analysis complementary to the model, the model may get better and better through self-learning. Like our human brain, it starts with understanding the world (our model) and collecting experiences (improving our model).

There are two points I would like to highlight for this paragraph:

  • A model is never 100% the same as reality – so don’t worry about deviations. There will always be a difference between what is virtually predicted and what is physically measured, most of the time because reality has many more influencing parameters.
  • The more qualified data we use in the model, the closer it gets to reality – so focus on accurate (and the right) data for your model. As it is usually impossible to fully model a system, focus on the most significant data sources.
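As a toy illustration of such a feedback loop (my own sketch, not a real simulation framework), a model parameter can be pulled toward observed reality step by step, correcting the model with a fraction of each observed deviation:

```python
# Toy feedback loop: each observation from the physical world
# fine-tunes the virtual model by a fraction of the residual.

def update(model_value: float, observation: float, gain: float = 0.2) -> float:
    """Correct the model with a fraction of the observed deviation."""
    return model_value + gain * (observation - model_value)

model = 0.0          # initial (poor) model estimate
reality = 10.0       # what the physical world actually measures
for _ in range(50):  # each field observation improves the model
    model = update(model, reality)

print(round(model, 3))  # the model has converged close to 10.0
```

The gain value here is arbitrary; the principle is the same one behind more sophisticated estimation techniques that tune virtual models with field data.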

The ultimate goal: THE DIGITAL TWIN

The discussion about data-driven approaches and the usage of models might feel abstract and complex (and it is). However, the term “digital twin” is well known and even used in boardrooms.

The great benefits of a digital twin for business operations and for sustainability are promoted by many software vendors and consultancy firms.

My statement and reason for this series of blog posts: Digital Twins do not run on documents; you need a data-driven, model-based infrastructure to benefit efficiently from digital twin concepts.

Unfortunately, a reliable and sustainable implementation of a digital twin requires more than software – it is a learning journey to connect the right data to the right model. It is a puzzle every company has to solve, as there is no 100-percent blueprint at this time.

Are Low Code platforms the answer?

I mentioned the importance of accurate data. Companies have different systems or even platforms managing enterprise data. The digital dream is that, by combining datasets from different systems and platforms, we can provide any user with the needed information in real time. My statement from my introductory post was:

I don’t believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might be again a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora’s Box?

Let’s look into some of the low-code platform messages mentioned by Low-Code advocates:

You will have an increasingly hard time finding developers to keep up with global app development demands (reason #1 for PEGA)

This statement reminded me of the early days of SmarTeam implementations. With a data model wizard, a form designer and a Visual Basic COM API, you could create any kind of data management application with SmarTeam, using its built-in behaviors for document lifecycle management, item lifecycle management and CAD integrations, combined with easy customizations.

The sky was the limit to satisfy end users. No need for an experienced partner or to be a skilled programmer (this was 2003+). SmarTeam was a low-code platform, the marketing department would say now.

A lot of my activities between 2003 and 2010 were related to fixing the problems this flexibility created, making sense (again) of customizations. I wrote about this in a 2015 post, The importance of a (PLM) data model, sharing the experiences of “fixing” issues created by flexibility.

Think first

The challenge is that an enthusiastic team creates a (low-code) solution rapidly. Immediate success is celebrated by the people involved. However, the future impact of this solution is often forgotten – we did the job, right?

Documentation and a broader visibility are often lacking when implementing such a solution.

For example, suppose your product data is going to be consumed by another app. In that case, you need to make sure that the information you consume is accurate. Perhaps the information was valid when you created the app.

However, if your friendly co-worker has moved on to another job and someone with different data standards becomes responsible for the data you consume, the reliability might fail. So how do you guarantee its quality?
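One pragmatic safeguard is a validation step that each consuming app runs before trusting the data. The sketch below is my own illustration, with hypothetical field names and rules:

```python
# Hedged sketch: a consuming app validating incoming product data before use.
# The required fields and their types are invented examples.

REQUIRED = {"part_number": str, "mass_kg": float, "material": str}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record is usable."""
    problems = []
    for field, expected_type in REQUIRED.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

good = {"part_number": "P-100", "mass_kg": 2.5, "material": "steel"}
bad = {"part_number": "P-101", "mass_kg": "unknown"}

assert validate(good) == []
assert validate(bad) == ["wrong type for mass_kg", "missing field: material"]
```

Such a contract between producer and consumer does not replace governance, but it makes a silent change in data standards fail loudly instead of corrupting downstream results.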

Easy tools have often led to spaghetti, from Clipper (the old days) and Visual Basic (the less old days) to highly customizable systems (like Aras promotes) and future low-code platforms (where Aras appears again).

However, the strength of being highly flexible is also a weakness if not managed and understood correctly. In particular, in a digital enterprise architecture, you need skilled people who guarantee a reliable anchoring of the solution.

The HBR article When Low-Code/No-Code Development Works — and When It Doesn’t mentions the same point:

There are great benefits from LC/NC software development, but management challenges as well. Broad use of these tools institutionalizes the “shadow IT” phenomenon, which has bedeviled IT organizations for decades — and could make the problem much worse if not appropriately governed. Citizen developers tend to create applications that don’t work or scale well, and then they try to turn them over to IT. Or the person may leave the company, and no one knows how to change or support the system they developed.

The fundamental difference: from coordinated to connected

For the moment, I remain skeptical about the low-code hype, because I have seen this kind of hype before. The most crucial point companies need to understand is that the coordinated world and the connected world are incompatible.

Using new tools based on old processes and existing data is not a digital transformation. Instead, a focus on value streams and their needed (connected) data should lead the design of a modern digital enterprise, not the optimization of and connectivity between organizational silos. Before buying a tool (a medicine) to reduce the current pains, imagine your future ways of working, discover what is possible with your existing infrastructure and identify the gaps.

Next, you need to analyze whether these gaps are so significant that they require a technology change. They probably do, as historically, systems were not designed to share data horizontally across an organization.

In this context, have a look at Lionel Grealou’s article for Engineering.com: Data Readiness in the new age of digital collaboration.

Conclusion

We discussed the crucial relation between models and data. Models only have value if they are fed with the right, accurate data (exercise 1).

Next, even the simplest development platforms, like low-code platforms, require brains and a long-term strategy (exercise 2) – nothing is simple in these transformational times.

The next and final post in this series will focus on configuration management – a new approach is needed. I don’t have the answers, but I will share some thoughts.

A recommended event with an exciting agenda and a good place to validate and share your thoughts.

I will be there and look forward to meeting you at this conference (unfortunately, still virtually).

This week I attended the SCAF conference in Jönköping. SCAF is an abbreviation of the Swedish CATIA User Group. First of all, I was happy to be there, as it was a “physical” conference, providing the opportunity to discuss topics with the attendees outside the presentation time slots.

It is crucial for me as I have no technical message. Instead, I am trying to make sense of the future through dialogues. What is sure is that the future will be based on new digital concepts, completely different from the traditional approach that we currently practice.

My presentation, which you can find here on SlideShare, was again zooming in on the difference between a coordinated approach (current) and a connected approach (the future).

The presentation explains the concept of datasets, which I discussed in my previous blog post. This time, I focused on how this concept can be discovered in the Dassault Systèmes 3DEXPERIENCE platform, combined with the must-go path for all companies toward more systems thinking and sustainable products.

It was interesting to learn that the concept of connected datasets, like the spider’s web in the image, reflected the future concept for many of the attendees.

One of the demos during the conference illustrated that it is no longer about managing the product lifecycle through structures (EBOM/MBOM/SBOM). Instead, it is based on a collection of connected datasets – the path in the spider’s web.

It was interesting to talk with the companies present about their roadmaps. How to become a digital enterprise is strongly influenced by their legacy culture and ways of working. Where to start to become connected is the main challenge for all.

A final positive remark: SCAF has renamed itself SCAF (3DX), showing that even CATIA practices can no longer be considered a niche – the future of business is to be connected.

Now, back to the thread I am following in the series The road to model-based. Perhaps I should change the title to “The road to connected datasets, using models”. The statement to discuss this week is:

Data-driven means that you need to have an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world

Reliable data

If you have been following my story related to the PLM transition from a coordinated to a connected infrastructure, you might have seen the image below:

The challenge of a connected enterprise is that you want to connect different datasets, defined in various platforms, to support any type of context. We called this a digital thread or, perhaps even better framed, a digital web.

This is new for most organizations because each discipline has been working most of the time in its own silo, producing readable information in neutral files – PDF drawings and documents. In cases where a discipline needs to deliver datasets, like in a PDM-ERP integration, we see IT energy levels rising, as integrations are an IT thing, right?

Too much focus on IT

In particular, SAP has always played the IT card (and is still playing it through their Siemens partnership). Historically, SAP claimed that all parts/items should be in their system. Thus, there was no need for a PDM interface, neglecting that the interface moment was now shifted to the designer in CAD. And by using the name Material for what is considered a Part in the engineering world, they illustrated their lack of understanding of the actual engineering world.

There is more to "blame" SAP for in the PLM domain, or you could state that PLM vendors did not yet understand what enterprise data means. Historically, ERP systems were the first enterprise systems introduced in a company; they have been leading in a transactional "digital" world. The world of product development, however, has never been a transactional process.

SAP introduced Master Data Management for its customers to manage data in heterogeneous environments. As you can imagine, the focus of SAP MDM was more on the transactional side of the product (also PIM) than on the engineering characteristics of a product.

I have no problem with each vendor wanting to see its solution as the center of the world; this is expected behavior. However, when it comes to a single-system approach, there is a considerable danger of vendor lock-in and a lack of freedom to optimize your business.

In a modern digital enterprise (to be), the business processes and value streams should be driving the requirements for which systems to use. I was tempted to write “not the IT capabilities”; however, that would be a mistake. We need systems or platforms that are open and able to connect to other systems or platforms. The technology should be there, and more and more, we realize the future is based on connectivity between cloud solutions.

In one of my first posts (part 2), I referred to five potential platforms for a connected enterprise.  Each platform will have its own data model based on its legacy design, allowing it to service its core users in an optimized environment.

When it comes to interactions between two or more platforms, for example, between PLM and ERP, between PLM and IoT, but also between IoT and ERP or IoT and CRM, these interactions should first be based on identified business processes and value streams.

The need for Master Data Management

Defining horizontal business processes and value streams independent of the existing IT systems is the biggest challenge in many enterprises. Historically, we have been thinking around a coordinated way of working, meaning people shifting pieces of information between systems – either as files or through interfaces.

In the digital enterprise, the flow should be leading based on the stakeholders involved. Once people agree on the ideal flow, the implementation process can start.

Which systems are involved, and where do we need a connection between two systems? Is the relationship bidirectional, or is it a push?

The interfaces in a digital enterprise need to be data-driven; we do not want human interference here, slowing down or modifying the flow. This is the moment Master Data Management and Data Governance come in.

When exchanging data, we need to trust the data in its context, and we should be able to use the data in another context. But, unfortunately, trust is hard to gain.

I can share an example of trust from implementing a PDM system linked to a Microsoft-friendly ERP system. Both systems were able to use Excel as an interface medium – the Excel columns took care of the data mapping between the two systems.

In the first year, engineers produced the Excel with BOM information and manufacturing engineering imported the Excel into their ERP system. After a year, the manufacturing engineers proposed to automatically upload the Excel as they discovered the exchange process did not need their attention anymore – they learned to trust the data.

How often have you seen similar cases in your company where we insist on a readable exchange format?

When you trust the process(es), you can trust the data. In a digital enterprise, you must assume that specific datasets are used or consumed in different systems. Therefore, a single data mapping as in the Excel example won't be sufficient.
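The Excel exchange described above is essentially a fixed column mapping between two data models. A minimal sketch in Python of such a mapping – all column and field names are hypothetical, and a real PDM-ERP integration would of course be richer than this:

```python
import csv
import io

# Hypothetical mapping from engineering BOM columns to ERP material fields.
# In the Excel example, the spreadsheet columns played exactly this role.
COLUMN_MAP = {
    "Part Number": "MaterialNumber",
    "Description": "MaterialDescription",
    "Quantity": "ComponentQty",
}

def map_bom_rows(bom_csv: str) -> list[dict]:
    """Translate each engineering BOM row into an ERP-ready record."""
    reader = csv.DictReader(io.StringIO(bom_csv))
    return [
        {erp_field: row[eng_col] for eng_col, erp_field in COLUMN_MAP.items()}
        for row in reader
    ]

bom = "Part Number,Description,Quantity\nP-1001,Valve housing,2\n"
print(map_bom_rows(bom))
# → [{'MaterialNumber': 'P-1001', 'MaterialDescription': 'Valve housing', 'ComponentQty': '2'}]
```

Once both sides trust this mapping, the manual "produce and import the Excel" step can disappear, as in the story above. The limitation is equally visible: a single hard-coded mapping serves exactly one consumer, which is why it won't scale to many consuming systems.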

Master Data Management and standards?

Some traditional standards, like ISO 15926 or ISO 10303, have been designed to exchange process and engineering data – they are domain-specific. Therefore, they could simplify your master data management approach if your digitalization efforts are in that domain.

To connect other types of data, it is hard to find a global standard that also encompasses different kinds of data or consumers. Think about the GS1 standard, which focuses more on the consumer side of data management. When PLM meets PIM, this standard and Master Data Management will be relevant.

Therefore I want to point to these two articles in this context:

How enterprise architects need to evolve to survive in a digital world focuses on the transition from a coordinated enterprise towards a connected enterprise from the IT point of view. A recent LinkedIn post, Web Ontology Language as a common standard language for Engineering Networks? by Matthias Ahrens, explores the concepts I have been discussing in this post.

To me, it seems that standards are helpful when working in a coordinated environment. In a connected environment, however, we have to rely on master data management and data governance processes, potentially based on a clever IT infrastructure using graph databases to connect anything meaningful, and possibly artificial intelligence to provide quality monitoring.

Conclusion

Standards have great value in exchange processes, which happen in a coordinated business environment. To benefit from a connected business environment, we need an open and flexible IT infrastructure supported by algorithms (AI) to guarantee quality. Before installing the IT infrastructure, we should first have defined the value streams it should support.

What are your experiences with this transition?

In my last post in this series, The road to model-based and connected PLM, I mentioned that perhaps it is time to talk about SLM instead of PLM when discussing popular TLAs for our domain of expertise. There have not been so many encouraging statements for SLM so far.

For me, SLM could mean Solution Lifecycle Management, considering that companies' offerings are more and more a mix of products and services. Or SLM could mean System Lifecycle Management, in that case pushing the idea that more and more products interact with the outside world and therefore could be considered systems. Products are (almost) dead.

In addition, I mentioned that the typical product lifecycle and related configuration management concepts need to change, as in the SLM domain there are hardware and software with different lifecycles and change processes.

It is a topic I want to explore further. I am curious to learn more from Martijn Dullaart, who will be lecturing at the PLM Road Map and PDT 2021 fall conference in November. I hope my expectations are not too high, knowing it is a topic of interest for Martijn. Feel free to join this discussion.

In this post, it is time to follow up on my third statement related to what data-driven implies:

Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.

On this topic, I have a list of points to consider; let’s go through them.

The dataset

In this post, I will often use the term dataset (you are also allowed to write "data set", I understood).

A dataset means a predefined number of attributes and values that belong logically to each other. Datasets should be defined based on the purpose and, if possible, designated for a single goal. In this way, they can be stored in a database.

Combined with other datasets, they can result in relevant business information. Note that a dataset is not only transactional data; a dataset could also describe geometry.

Identify the dataset

In the document-based world, a lot of information could be stored in a single file. In a data-driven world, we should define a dataset that contains a specific piece of information, logically belonging together. If we are more precise, a part would have various related datasets that make up the definition of a part. These definitions could be:

  • Core identification attributes like ID, Name, Type and Status
  • The Type could define a set of linked information. For example, a valve would have different characteristics than a resistor. Through classification, we can link datasets to the core definition of a part.
  • The part can have engineering-specific data (CAD and metadata), manufacturing-specific data, supplier-specific data, and service-specific data. Each of these datasets needs to be defined as a unique element in a data-driven environment
  • CAD is a particular case, as most current CAD systems don't treat geometry as a single dataset. In a file-based world, many other datasets are stored in the file (e.g., engineering or manufacturing details). In a data-driven environment, we want the CAD definition to be treated like a dataset. Dassault Systèmes with their CATIA V6 and 3DEXPERIENCE platform or PTC with Onshape are examples of this approach. Having CAD as separate datasets makes sharing and collaboration so much easier, as we can see from these solutions. The concept of CAD stored in a database is not new, and this approach has been used in various disciplines. Mechanical CAD was always a challenge.

Thanks to Moore's Law (approximately every two years, processor power doubles – click on the image for the details) and higher network connection speeds, it starts to make sense to have mechanical CAD also stored in a database instead of a file.
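The bullets above can be sketched in a few lines of Python – a toy data model, not any vendor's actual schema, with all attribute names chosen for illustration only:

```python
from dataclasses import dataclass, field

# A dataset: a small, purpose-defined group of attributes that logically
# belong together and can be stored and consumed independently.
@dataclass
class Dataset:
    purpose: str                 # e.g. "core", "engineering", "manufacturing"
    attributes: dict = field(default_factory=dict)

# A part is no longer one big file: it is a core identification dataset
# plus a collection of linked, discipline-specific datasets.
@dataclass
class Part:
    core: Dataset
    linked: list = field(default_factory=list)

valve = Part(
    core=Dataset("core", {"ID": "P-1001", "Name": "Valve", "Type": "Valve",
                          "Status": "Released"}),
    linked=[
        Dataset("classification", {"PressureRating": "PN16"}),   # type-driven data
        Dataset("engineering", {"CADModel": "valve_v3", "Mass": "1.2 kg"}),
        Dataset("manufacturing", {"Process": "casting"}),
    ],
)

# A consumer selects only the datasets relevant to its own context.
eng_view = [d for d in valve.linked if d.purpose == "engineering"]
print(eng_view[0].attributes["CADModel"])  # → valve_v3
```

The point of the sketch is the granularity: each discipline reads or maintains its own dataset without opening a monolithic file, which is what makes sharing and collaboration easier.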

An important point to consider is a kind of standardization of datasets. In theory, there should be a kind of minimum agreed collection of datasets. Industry standards provide these collections in their dictionary. Whenever you optimize your data model for a connected enterprise, make sure you look first into the standards that apply to your industry.

They might not be perfect or complete, but inventing your own new standard is a guarantee for legacy issues in the future. This remark is also valid for the software vendors in this domain. A proprietary data model might give you a competitive advantage.

Still, in the long term, there is always the need to connect with outside stakeholders.


Identify the RACI

To ensure a dataset is complete and well maintained, the concept of RACI could be used. RACI is the abbreviation for Responsible, Accountable, Consulted and Informed, and a simplification of the RASCI model; see also responsibility assignment matrix.

In a data-driven environment, there is no data ownership anymore like you have for documents. The main reason data ownership can no longer be used is that datasets can be consumed by anyone in the ecosystem, no longer only by your own department or by the manufacturing or service department.

Datasets in a data-driven environment bring value when connected with other datasets in applications or dashboards.

A dataset describing the specification attributes of a part could be used in a spare part app and a service app. Of course, the dataset will be used in a different context – still, we need to ensure we can trust the data.

Therefore, each identified dataset should be governed by a kind of RACI concept. The RACI concept is a way to break the silos in an organization.
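A RACI assignment per dataset type could be as simple as a lookup table. The sketch below is purely illustrative – the roles and dataset types are invented for the example, and a real implementation would live in your platform's governance layer, not in a script:

```python
# Hypothetical RACI assignment per dataset type: accountability per role
# replaces the single "owner" we used to have for documents.
RACI = {
    "engineering": {"R": "design engineer", "A": "engineering lead",
                    "C": "manufacturing engineer", "I": "service planner"},
    "manufacturing": {"R": "process planner", "A": "manufacturing lead",
                      "C": "design engineer", "I": "purchasing"},
}

def accountable_for(dataset_type: str) -> str:
    """Who is Accountable (A) for keeping this dataset trustworthy?"""
    return RACI[dataset_type]["A"]

def informed_about(dataset_type: str) -> str:
    """Who is Informed (I) - i.e., a pure consumer of this dataset?"""
    return RACI[dataset_type]["I"]

print(accountable_for("engineering"))  # → engineering lead
print(informed_about("manufacturing"))  # → purchasing
```

Note how the Consulted and Informed entries cut across departments: that is exactly where the RACI concept starts breaking the silos.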

Identify Inside / outside

There is a lot of fear that a connected, data-driven environment will expose Intellectual Property (IP). It came up in recent discussions. If you like storytelling and technology, read my old SmarTeam colleague Alex Bruskin’s post: The Bilbo Baggins Threat to PLM Assets. Alex has written some “poetry” with a deep technical message behind it.

It is true that if your dataset is too big, you have the challenge of exposing IP when connecting this dataset with others. Therefore, when building a data model, you should make it possible to have datasets purely for internal usage and datasets for sharing.

When you use the concept of RACI, the difference should be defined by the I (Informed) – is it PLM data or PIM data, for example?
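The inside/outside split can also be made concrete in a small sketch. Again, all attribute names are hypothetical; the idea is simply that the data model designates which attributes may leave the company:

```python
# One part definition, with some attributes marked (implicitly) as IP.
part_data = {
    "ID": "P-1001",
    "Name": "Valve",
    "Weight": "1.2 kg",
    "DesignRationale": "tolerance analysis v4",   # internal only - IP
    "SupplierCost": "12.40 EUR",                  # internal only - IP
}

# The set of attributes designated for sharing with outside consumers.
SHAREABLE = {"ID", "Name", "Weight"}

def external_view(data: dict) -> dict:
    """Return only the attributes an outside stakeholder may see."""
    return {k: v for k, v in data.items() if k in SHAREABLE}

print(external_view(part_data))
# → {'ID': 'P-1001', 'Name': 'Valve', 'Weight': '1.2 kg'}
```

The design choice here is that sharing is defined per dataset in the data model, not decided ad hoc at exchange time – which is what keeps IP exposure under control when datasets get connected.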

Tracking relations

Suppose we follow up on the concept of datasets. In that case, it becomes clear that the relations between the datasets are as crucial as the datasets themselves. In traditional PLM applications, these relations are often predefined as part of the core data model.

For example, the EBOM parts have relationships between themselves and specification data – see image.

The MBOM parts have links with the supplier data or the manufacturing process.

The prepared relations in a PLM system allow people to implement the system relatively quickly by mapping their approaches to this taxonomy.

However, traditional PLM systems are based on a document-based (or file-based) taxonomy combined with related metadata. In a model-based and connected environment, we have to get rid of the document-based type of data.

Therefore, the datasets will be more granular, and there is a need to manage exponentially more relations between datasets.

This is why you see the graph database coming up as a needed infrastructure for modern connected applications. If you haven't heard of a graph database yet, you are probably far from the technology hypes. To understand the principles of a graph database, you can read this article from neo4j: Graph Databases for Beginners: Why graph technology is the future.

As you can see from the 2020 Gartner Hype Cycle for Artificial Intelligence, this technology is at the top of the hype and conceptually the way to manage a connected enterprise. The discussion in this post also demonstrates that, besides technology, a lot of additional conceptual thinking is needed before it can be implemented.

Although software vendors might handle the relations and datasets within their platform, the ultimate challenge will be sharing datasets with other platforms to get a connected ecosystem.

For example, the digital web picture shown above and introduced by Marc Halpern at the 2018 PDT conference shows this concept. Recently CIMdata discussed this topic in a similar manner: The Digital Thread is Really a Web, with the Engineering Bill of Materials at Its Center
(Note I am not sure if CIMdata has published a recording of this webinar – if so I will update the link)

Anyway, these are signs that we have started to find the right visuals to imagine new concepts. The traditional digital thread pictures, like the one below, are, for me, impressions of the past, as they are too rigid and focused on particular value streams.

From a distance, it looks like a connected enterprise should work like our brain. We store information on different abstraction levels. We keep incredibly many relations between information elements. As the brain is a biological organ, connections degrade or get lost. Or, the opposite, other relationships become so strong that we cannot change them anymore. ("I know I am always right")

Interestingly, the brain does not use the “single source of truth”-concept – there can be various “truths” inside a brain. This makes us human beings with all the good and the harmful effects of that.

As long as we realize there is no single source of truth.

In business and our technological world, we sometimes need the undisputed truth. Blockchain could be the basis for securing the right connections between datasets to guarantee the result is valid. I am curious if blockchain can scale to complex connected situations, although Moore's Law might ultimately help us here too (if still valid).

The topic is not new – in 2014 I wrote a post with the title PLM is doomed unless …, where I introduced the topic of owning and sharing in the context of the human brain. In the post, I refer to the book On Intelligence by Jeff Hawkins, who tries to analyze what human intelligence is and how we could apply it to our technology concepts. It is still a fascinating book, worth reading if you have the time and opportunity.


Conclusion

A data-driven approach requires a more granular definition of information, leading to the concepts of datasets and managing relations between datasets. This is a fundamental difference compared to the past, where we operated systems filled with information. Now we are heading towards connected platforms that provide a filtered set of real-time data to act upon.

I am curious to learn more about how people have solved the connected challenges and in what kind of granularity. Let us know!
