After “The Doctor is IN,” here is again a written post in the category of PLM and complementary practices/domains. In January, I discussed the complementary value of PLM and CLM (Configuration Lifecycle Management) together with Henrik Hulgaard from Configit. For me, CLM is a synonym for Product Configuration Management.

PLM and Complementary Practices (feedback)

As expected, readers were asking the question:

“What is the difference between CLM (Configuration Lifecycle Management) and CM (Configuration Management)?”

Good question.

As the complementary role of CM is also a part of the topics to discuss, I am happy to share this blog today with Martijn Dullaart. You probably know Martijn if you are actively following topics on PLM and CM.

Martijn has his own blog mdux.net, and you might have seen him recently in Jenifer Moore’s PLM TV-episode: Why CM2 for Faster Change and Better Documentation. Martijn is the Lead Architect for Enterprise Configuration Management at ASML (Our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress. Let us start.

Configuration Management and CM2

Martijn, first of all, can you bring some clarity to the terminology? When discussing Configuration Management, what is the pure definition, what is CM2 as a practice, and what is IpX’s role? And please explain where you fit into this picture.

Classical CM focuses mainly on the product, the product definition, and the actual configurations of the product, such as as-built and as-maintained. CM2 extends the focus to the entire enterprise, e.g., the processes and procedures (ways of working) of a company, including the IT and facilities, to support the company’s value stream.

CM2 expands the scope to all information that could impact safety, security, quality, schedule, cost, profit, the environment, corporate reputation, or brand recognition.

Basically, CM2 shifts the focus to Integrated Process Excellence and promotes continual improvement.

Next to this, CM2 provides the WHAT and the HOW, something most standards lack. My main focus is still around the product and promoting the use of CM outside the product domain.

For all CM related documentation, we are already doing this.

Configuration Management and PLM

People claim that if you implement PLM as an enterprise backbone, not as an engineering tool, you can do Configuration Management with your PLM environment.

What is your opinion?

Yes, I think that this is possible, provided that the PLM tool has the right capabilities. Though the real question should be: is this the best way to go about it? For instance, some parts of Configuration Management are more transaction-oriented, e.g., registering the parts you build in or out of a product.

Other parts of CM are more iterative in nature, e.g., doing impact analysis and making an implementation plan. I am not saying this cannot be done in a PLM tool as an enterprise backbone. Still, the nature of most PLM tools is to support iterative types of work rather than a transactional type of work.

I think you need some kind of enterprise backbone that manages the configuration as an As-Planned/As-Released baseline. A baseline that shows not only the released information but also all planned changes to the configuration.
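To make the idea of an As-Planned/As-Released baseline a bit more tangible, here is a minimal sketch in Python. It is purely illustrative; the class and field names are my own assumptions, not the data model of any specific PLM or CM tool.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ConfigurationItem:
    item_id: str
    revision: str
    status: str          # e.g., "Released"

@dataclass
class PlannedChange:
    change_id: str
    item_id: str
    target_revision: str
    planned_date: str    # ISO date, kept simple for the sketch

def as_planned_as_released_baseline(released: List[ConfigurationItem],
                                    planned: List[PlannedChange]) -> dict:
    """Combine the released state with all planned changes per item."""
    baseline = {item.item_id: {"released": item.revision, "planned": []}
                for item in released}
    for change in planned:
        baseline.setdefault(change.item_id, {"released": None, "planned": []})
        baseline[change.item_id]["planned"].append(
            (change.change_id, change.target_revision, change.planned_date))
    return baseline

# Example: part P-100 is released at revision B, with a planned change to revision C.
released = [ConfigurationItem("P-100", "B", "Released")]
planned = [PlannedChange("ECN-042", "P-100", "C", "2021-06-01")]
print(as_planned_as_released_baseline(released, planned))
```

The point of the sketch is only that the baseline view shows both the released revision and all pending changes against it, which is exactly the information the overarching backbone needs to collect from the different source tools.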

Because the source of information in such a baseline comes from different tools, you need an overarching tool to connect everything. For most companies, given their current state of enterprise applications, this means that they require an overarching system.

Preferably, I would like to use the data directly from the sources. Still, connectivity and performance are not yet at a level where we can do this. Cloud and modern application and database architectures are very promising to this end.

 

Configuration Management for Everybody?

I can imagine companies in the Aerospace industry need to have proper configuration management for safety reasons. Also, I can imagine that proper configuration management can be relevant for other industries. Do they need to be regulated, or are there other reasons for a company to start implementing CM processes?

I will focus the first part of my answer within the context of CM for products only.

Basically, all products are regulated to some degree. Aerospace & Defense and Medical Device and Pharma are highly regulated for obvious reasons. Other industries are also regulated, for example, through environmental regulations like REACH, RoHS, WEEE or safety-related regulations like the CE marking or FCC marking.

Customers can also be an essential driver for the need for CM. If, as a customer, you buy expensive equipment, you expect that the supplier of that equipment can deliver per commitment and can maintain and upgrade the equipment efficiently, with as few disruptions to your operations as possible.

Not just customers but also consumers are critical about the traceability of the product and all its components.

Even if you are sitting on a rollercoaster, you presume the product is well designed and maintained. In other words, there is often a case to be made to apply proper configuration management in any company. Still, the extent to which you need to implement it may vary based on your needs.

 

The need for Enterprise Configuration Management is even more significant because one of the hardest things is to change the way an organization works and operates.

Often there are different ways of doing the same thing. There is a lot of tribal knowledge, and ways of working are not documented so that people can easily find them, let alone structured and linked so that you can do an impact analysis when you want to introduce a change in your organization.

 

CM and Digital Transformation

One of the topics that we both try to understand better is how CM will evolve in the future when moving to a more model-based approach. In the CM-terminology, we still talk about documents as information objects to be managed. What is your idea of CM and a model-based future?

It is indeed a topic where new or changed methodology is probably required, and I have already started describing CM topics in several relevant posts on my MDUX blog.

First, let me say that model-based is the future, although, at the same time, the CM aspects are often overlooked.

When managing changes, too much detail makes estimating cost and effort for a business case more challenging, and planning information that is too granular is not desirable. Therefore, CM2 looks at datasets. Datasets should be as small as possible but not smaller. Datasets are sets of information that need to be released as a whole, yet they can be released independently from other datasets. For example, a single BOM line item is not a dataset, but the complete set of BOM line items that makes up the BOM of an assembly is considered a dataset. I can release a BOM independently from a test plan.

Data models need to facilitate this. However, today, in many PLM systems, the BOM and the metadata of a part use the same revision. This means that to change the metadata, I need to revise the BOM, while the BOM itself might not change. Some changes to metadata might not be relevant for a supplier, and communicating those changes to your supplier could create confusion.
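As a thought experiment, the sketch below (Python, with hypothetical names, not any specific PLM data model) shows what it could look like when the metadata of a part and the BOM of that assembly are treated as separate datasets, each with its own revision, so that a metadata change does not force a BOM revision.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    """A set of information that is released as a whole, with its own revision."""
    name: str
    revision: str
    content: dict

@dataclass
class Part:
    part_id: str
    metadata: Dataset   # part attributes, revised independently
    bom: Dataset        # the complete set of BOM lines of this assembly

part = Part(
    part_id="P-100",
    metadata=Dataset("P-100 metadata", "A",
                     {"description": "Bracket", "material": "Al 6061"}),
    bom=Dataset("P-100 BOM", "A",
                {"lines": [("P-200", 2), ("P-300", 4)]}),
)

# Changing the description only revises the metadata dataset;
# the BOM dataset (and what the supplier sees) stays at revision A.
part.metadata = Dataset("P-100 metadata", "B",
                        {"description": "Bracket, anodized", "material": "Al 6061"})

print(part.metadata.revision, part.bom.revision)   # -> B A
```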

I know some people think this is about document- vs. model-centric, but it is not. A part is identified in the ‘physical world’ by its part ID. Even if you allow revisions in the supply chain, once you include the revision in the part ID, you create a new identifier, and every new revision will end up in a different stock location. Is that what we want?

In any case, we are still in the early days, and the thinking about this topic has just begun and needs to take shape in the coming year(s).

 

CM and/or CLM?

As in my shared blog post with Henrik Hulgaard related to CLM, can you make a clear differentiation between the two domains for the readers?

 

Configuration Lifecycle Management (CLM)  is mainly positioned towards Configurable Products and the configurable level of the product.

 

Why do I think this? Even though Configit’s CLM declaration states that “Configuration Lifecycle Management (CLM) is the management of all product configuration definitions and configurations across all involved business processes applied throughout the lifecycle of a product,” it also states:

  • “CLM differs from other Enterprise Business Disciplines because it focuses on cross-functional use of configurable products.”
  • “Provides a Single Source of Truth for Configurable Data.”
  • “Handles the ever-increasing complexity of Configurable Products.”

I find that Configuration Lifecycle Management is a core Configuration Management practice you need to have in place for configurable products. The dependencies you need to manage are enormously complex: software parameters that depend on specific hardware, hardware-to-hardware dependencies, commercial variants, and options.

Want to learn more?

In this post, we just touched the surface of PLM and Configuration Management. Where can an interested reader find more information related to CM for their company?

 

To become trained in CM2, people can reach out to the Institute for Process Excellence (IpX), a company that focuses on consultancy and methodology for many aspects of a modern, digital enterprise, including Configuration Management.

And there is more out there.

Conclusion

Thanks, Martijn, for your clear explanations. People working seriously in the PLM domain managing the full product lifecycle should also learn and consider Configuration Management best practices. I look forward to a future discussion on how to perform Configuration Management in a model-based environment.

PLM, CLM, and CM – mind the overlap

As promised in my blog post PLM 2021 – My plans – your goals?, I was planning to experiment with a format, which I labeled: The PLM Doctor is IN.

The idea behind this format is that anyone interested can ask a question – anonymously or through a video recording – and I will answer this single question.

As you can see from the survey results, many of the respondents (approx. 30% of those who did not skip the question) had a question. That is enough to experiment with in the upcoming year – if the experiment works for you. As it is an experiment, I am also looking forward to your feedback to optimize this format.

Today the first episode: PLM and ROI

 

Relevant links discussed in this video

CIMdata webinar: PLM Benefits, Metrics & ROI with John MacKrell

VirtualDutchman: The PLM ROI Myth

 

Conclusion

What do you think? Does this format help you to understand and ask PLM-related questions? Or should I not waste my time, as there is already so much content out there? Let me know what you think in the comments.

Added February 10th

 

As the PLM Doctor sometimes talks like an oracle, it was great to see the summary written by SharePLM Learning Expert Helena Gutierrez.

Click on the image to see the full post.

First of all, thank you for the overwhelming response to the survey that I promoted last week: PLM 2021– your goals? It gave me enough inspiration and content to fill the upcoming months.

The first question of the survey was dealing with complementary practices or systems related to a traditional PLM-infrastructure.

As you can see, most of you are curious about Digital Twin management, with 68% of the votes (it is hype). Second best are Configuration Management, Product Configuration Management, and Supplier Collaboration Management, all with 58% of the votes. Click on the image to see the details. Note: you could vote for more than one topic.

Product Configuration Management

Therefore, I am happy to share this blog space with Configit’s CTO, Henrik Hulgaard. Configit is a company specialized in Product Configuration Management, or as they call it, Configuration Lifecycle Management (CLM).

Recently, Henrik wrote an interesting article on LinkedIn: How to achieve End-To-End Configuration. It addresses a question that I have heard several times from my clients: how to align the selling and delivery of configurable products across sales, engineering, and manufacturing?

Configit – the company / the mission

Henrik, thanks for helping me explain the complementary value of end-to-end Product Configuration Management to traditional PLM systems. First of all, can you give a short introduction to Configit as a company and the unique value you are offering to your clients?

Hi Jos, thank you for having me. Configit has worked with configuration challenges for the last 20 years. We are approximately 200 people and have offices in Denmark, Germany, India, and in the US (Atlanta and Detroit) and work with some of the world’s largest manufacturing companies.

We are founded on patented technology, called Virtual Tabulation. The YouTube movie below explains the term Virtual Tabulation.

Virtual Tabulation compiles EVERY possible configuration scenario and then compresses that data into a very small file so that it can be used by everyone in your team.

Virtual Tabulation enables important capabilities such as:

  • Consolidation of all configuration data, both Engineering and Sales related, into a single source of truth.
  • Effortless maintenance of complicated rule data.
  • A fast and error-free configuration engine that provides perfect guidance to the customer across multiple platforms and channels.
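To give a feeling for what “compiling every possible configuration scenario” means, here is a deliberately naive Python sketch. It enumerates the valid combinations of a few features up front so that any channel can answer validity questions with a simple lookup. The feature names and rules are invented for illustration; this says nothing about how Configit’s patented Virtual Tabulation actually works internally.

```python
from itertools import product

# Hypothetical feature families and their options
features = {
    "engine":  ["petrol", "electric"],
    "gearbox": ["manual", "automatic"],
    "towbar":  ["yes", "no"],
}

# Hypothetical engineering/commercial rules
def is_valid(cfg: dict) -> bool:
    if cfg["engine"] == "electric" and cfg["gearbox"] == "manual":
        return False        # no manual gearbox with an electric engine
    if cfg["engine"] == "electric" and cfg["towbar"] == "yes":
        return False        # towbar not offered on the electric variant
    return True

# "Compile" the complete space of valid configurations once...
keys = list(features)
valid_configs = {
    combo for combo in product(*features.values())
    if is_valid(dict(zip(keys, combo)))
}

# ...so that every channel can validate a choice with a cheap lookup.
print(("petrol", "manual", "yes") in valid_configs)    # True
print(("electric", "manual", "no") in valid_configs)   # False
```

In a real product the configuration space is far too large to enumerate naively like this, which is exactly why a compressed, precompiled representation is needed.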

As the only vendor, Configit provides a configuration platform that fully supports end-to-end configuration processes, from early design and engineering, through sales and manufacturing, to the support and service of configurable products.

This is what we understand by Configuration Lifecycle Management (CLM).

Why Configuration Lifecycle Management?

You have introduced the term Configuration Lifecycle Management – another TLA (Three-Letter Acronym) that is easy to pronounce. However, why would a company be interested in implementing Configuration Lifecycle Management (CLM)?

CLM is a way to break down the siloed systems traditionally found in manufacturing companies, where products are defined in a PLM system, sold using a CRM/CPQ system, manufactured using an ERP system, and serviced by typically ad-hoc and home-grown systems. A CLM system feeds these existing systems with an aligned and consistent view of which variants of a configurable product are available.

Organizations obtain several benefits when they align across functions on which product variants they offer:

  • Engineering: faster time-to-market, optimized variability, and the assurance to only engineer products that are sold
  • Sales: reducing errors, making sure that what gets quoted is accurate, and reducing the time to close the deal. The configurator provides current, up-to-date, and accurate information.
  • Manufacturing: reducing errors and production stoppages due to mis-builds
  • Service: accurate information about the product’s configuration. The service technician knows precisely what capabilities to expect on the particular product to be serviced.

For example, one of our customers experienced a 95% reduction – from a year to two weeks – in the time it took them to create the configuration models needed to build and sell their products. This reduction meant a significantly shorter time to market and allowed additional product lines to be introduced.

CLM for everybody?

I can imagine that companies with products that are organized for mass production still want to have the mindset of being as flexible as possible on the sales side. What type of companies would benefit the most from a CLM approach?

Any company that offers customized or configurable products or services will need to ensure that what is engineered is aligned with what is sold and serviced. Our customers typically have relatively high complexity with hundreds to thousands of configuration parameters.

CLM is not just for automotive companies that have high volume and high complexity. Many of our customers are in industrial components and machinery, offering complex systems and services. A couple of examples:

Philips Healthcare sells advanced scanners to hospitals and uses CLM to ensure that what is sold is aligned with what can be offered. They also would like to move to sell scanners as a service where the hospital may pay per MR scan.

Thyssenkrupp Elevators sell elevators that are highly customizable based on the needs and environment. The engineering rules start in the CAD environment. They are combined with commercial rules to provide guidance to the customer about valid options.

CLM and Digital Transformation

For me, CLM is an excellent example of what modern, digital enterprises need to do: having product data available along the whole lifecycle to make real-time decisions. CLM is a connecting layer that allows companies to break the silos between marketing, sales, engineering, and operations. C-level executives get excited by that idea, and I can see the business value.

Now, what would you recommend realizing this idea?

  • The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities.

This requires that an organization establishes a common feature “language” (sometimes this is called a master feature dictionary) that is shared across the different functions.

As the feature codes are essential in the communication between the functions, the creation and updating of the feature language must be carefully managed by putting people and processes in place to manage them.

  • The next step is typically to make information about valid configurations available in a central place, sometimes referred to as the single source of truth for configuration.

We offer services to expose this information and integrate it into existing enterprise systems such as PLM, ERP and CRM/CPQ.  The configuration models may still be maintained in legacy systems. Still, they are imported and brought together in the CLM system.

Once consuming systems all share a single configuration engine, the organization may move on to improve on the rule authoring and replace the existing legacy rule authoring applications found in PLM and ERP systems with more modern applications such as Configit Ace.
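A minimal sketch of what such a shared feature “language” could look like in practice: a master feature dictionary that all functions reference, plus a single validation entry point that consuming systems (CRM/CPQ, PLM, ERP) could call. The names, codes, and structure are my own illustration, not Configit Ace or any other product.

```python
from typing import List, Set

# Hypothetical master feature dictionary: one agreed code per feature,
# shared by engineering, sales, manufacturing, and service.
FEATURE_DICTIONARY = {
    "ENG-EL": "Electric engine",
    "ENG-PT": "Petrol engine",
    "GBX-AT": "Automatic gearbox",
    "GBX-MT": "Manual gearbox",
}

# Hypothetical rules expressed against feature codes, not part numbers.
EXCLUSIONS = {frozenset({"ENG-EL", "GBX-MT"})}   # electric engine excludes manual gearbox

def validate_configuration(selected_codes: Set[str]) -> List[str]:
    """Single entry point any consuming system (CRM, PLM, ERP) could call."""
    issues = []
    unknown = selected_codes - FEATURE_DICTIONARY.keys()
    if unknown:
        issues.append(f"Unknown feature codes: {sorted(unknown)}")
    for exclusion in EXCLUSIONS:
        if exclusion <= selected_codes:
            issues.append(f"Invalid combination: {sorted(exclusion)}")
    return issues

print(validate_configuration({"ENG-EL", "GBX-MT"}))   # reports the invalid combination
print(validate_configuration({"ENG-PT", "GBX-AT"}))   # [] -> valid
```

The design point is that every function speaks in feature codes from the same dictionary and asks the same engine for validity, instead of maintaining its own copy of the rules.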

Customer Example: Connecting Sales, R&D and ERP

As can be seen from above, these steps all go across the functional silos. Thus, it is essential that the CLM journey has top-level management support, typically from the CIO.

COVID-19?

Related to COVID-19, I believe companies realized that they had to reconsider their supply chains to limit dependencies on critical suppliers. Is this an area where Configit would contribute, too?

The digital transformation that many manufacturing companies have worked on for years clearly has been accelerated by the COVID-19 situation, and indeed they might now start to encode information about the critical suppliers in the rules.

We saw this happening in 2011 with the tsunami in Japan, when suddenly suppliers could not provide certain parts anymore. The organization then has to adapt the rules quickly so that the options requiring those parts are no longer available to order.

Therefore, the CLM vision also includes suppliers as configuration knowledge has to be shared across organizations to ensure that what is ordered also can be delivered.

Learning more?

It is clear that CLM is a complementary layer to standard PLM-infrastructures and complementary to CRM and ERP.  A great example of what is possible in a modern, digital enterprise. Where can readers find more information?

Configit offers several resources on Configuration Lifecycle Management on our website, including our blog,  webinars and YouTube videos, e.g., Tech Chat on Manufacturing and Configuration Lifecycle Management (CLM)

Besides these continuously growing resources, there is the whitepaper “Accelerating Digital Transformation in Manufacturing with Configuration Lifecycle Management (CLM)”, available here among other whitepapers.

What I have learned

  • Configuration Lifecycle Management is relevant for companies that want to streamline their business functions, i.e., sales, engineering, manufacturing, and service. CLM will reduce the number of iterations in the process, reduce costly fixes when trying to align with customer demands, and ultimately create more service offerings by knowing customers’ existing configurations.
  • The technology to implement CLM is there. Configit has shown in various industries that it is possible. It is an example of adding value on top of a digital information infrastructure (CRM, PLM, and ERP).
  • The challenge will be in aligning the different functions to agree on one standard configuration authority. Therefore, responsibility should lie at the top level of an organization, likely with the modern CIO or CDO.
  • I was glad to learn that Henrik stated:

    “The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities”.

    A topic I will discuss soon when talking about Product & Portfolio Management with PLM.

Conclusion

It was a pleasure to work with Configit, in particular Henrik Hulgaard, and to learn more about Configuration Lifecycle Management, or whatever you may name it. More importantly, I hope you find this post insightful for understanding if and where it applies to your business.

Always feel free to ask more questions related to the complementary value of PLM and Product Configuration Management (CLM).

Last week I shared my plans for 2021 related to my blog, virtualdutchman.com. Those of you who follow my blog might have noticed my posts are never short as I try to discuss or explain a topic from various aspects. This sometimes requires additional research from my side. The findings will provide benefits for all of us. We keep on learning.

At the end of the post, I asked you to participate in a survey to provide feedback on the proposed topics. So far, only one percent of my readers have responded to this short survey. The last time I shared a short survey in 2018, the response was much more significant.

Perhaps you are tired of the many surveys; perhaps you did not make it to the end. Please make an effort this time. Here is, one more time, the survey.

The results so far

To understand the topics below, please make sure you have read the previous blog post to understand each paragraph’s context.

PLM understanding

For PLM-related topics that I proposed, Product Configuration Management, Supplier Collaboration Management, and  Digital Twin Management got the most traction. I started preparing for them, combined with a few new suggested topics that I will further explore. You can click on the images below to read the details.

PLM Deep dive

From the suggested topics for a PLM deep-dive, it is interesting to see most respondents want to learn more about Product Portfolio Management and Systems Engineering within PLM. Traditional topics like Enterprise/Engineering Change Management, BOM Management, or PLM implementation methodologies have been considered less relevant.

The PLM Doctor is in

Several questions were coming in for the “PLM Doctor,” and I started planning the first episodes. The formula: A single question and an answer through a video recording – max. 2 – 3 minutes. Suitable for fast consumers of information.

PLM and Sustainability

Here we can see the majority is observing what is happening. Only a few people reported an interest in sustainability and, probably not unrelated, they work for a company that takes sustainability seriously.

PLM and digitization

When discussing PLM’s digitization, I believe one of the fundamental changes that we need to implement (and learn to master) is a more Model-Based approach for each phase of the product life cycle. Also, most respondents have a notion of what model-based means and want to apply these practices to engineering and manufacturing.

 

Your feedback

I think you all have heard this statement before about Lies and Statistics. Especially with social media, there are billions of people digging for statistics to support their theories. Don’t worry about my situation; I would like to make my statement based on some larger numbers, so please take the survey here if you haven’t done so.

 

Conclusion

I am curious about your detailed inputs, and the next blog post will be the first of the 2021 series.

It is 2021, and after two weeks of time-out and reflection, it is time to look forward. Many people have said that 2020 was a “lost year,” and they are looking forward to a fresh restart, back to the new normal. For me, 2020 was the opposite of a lost year. It was a year where I had to change my ways of working. Communication has changed, digitization has progressed, and new trends have become apparent.

If you are interested in some of the details, watch the conversation I had with Rob Ferrone from QuickRelease, just before Christmas: Two Santas looking back to 2020.

It was an experiment with video, and you can see there is a lot to learn for me. I agree with Ilan Madjar’s comment that it is hard to watch two people talking for 20 minutes. I prefer written text that I can read at my own pace, short videos (max 5 min), or long podcasts that I can listen to, when cycling or walking around.

So let me share with you some of the plans I have for 2021, and I am eager to learn from you where we can align.

PLM understanding

I plan a series of blog posts where I want to share PLM-related topics that are not necessarily directly implemented in a PLM-system or considered in PLM-implementations as they require inputs from multiple sources.  Topics in this context are: Configuration Management, Product Configuration Management, Product Information Management, Supplier Collaboration Management, Digital Twin Management, and probably more.

For these posts, I will discuss the topic with a subject matter expert, potentially a vendor or a consultant in that specific domain, and discuss the complementary role to traditional PLM. Besides a blog post, a topic might also be explained in more depth in a podcast.

The PLM Doctor is in

Most of you might have seen Lucy from the Charlie Brown cartoon as the doctor giving advice for 5¢. As an experiment, I want to set up a similar approach, however, for free.

These are my conditions:

  • Only one question at a time.
  • The question and answer will be published in a 2- 3 minute video.
  • The question is about solving a pain.

If you have such a question related to PLM, please contact me through a personal message on LinkedIn, and I will follow up.

PLM and Sustainability

A year ago, Rich McFall and I started the PLM Green Global Alliance. Our purpose is to bring together people who want to learn and share PLM-related practices, solutions, and ideas contributing to a greener and more sustainable planet.

We do not want to compete or overlap with more significant global or local organizations, like the Ellen MacArthur Foundation or the European Green Deal.

We want to bring people together to dive into the niche of PLM and its related practices. We announced the group on LinkedIn; however, to ensure a persistent reference point for all information and interactions, we have launched the website plmgreenaliance.com.

Here I will moderate and focus on PLM and Sustainability topics. I am looking forward to interacting with many of you.

PLM and digitization

For the last two years, I have been speaking and writing about the gap between current PLM-practices, based on shareable documents and files and the potential future based on shareable data, the Model-Based Enterprise.

Last year, I wrote a series of posts giving insights into how we reached the current PLM-practices, sometimes discovering inconsistencies and issues due to old habits or technology changes. I grouped these posts on a single blog page with the title: Learning from the past.

This year, I will create a collection of posts focusing on the transition towards a Model-Based Enterprise. The summary page will probably be called Working towards the future; it is currently in private mode.

Your feedback

I am always curious about your feedback – to understand in which kind of environment your PLM activities take place. Which topics are unclear? What am I missing in my experience?

Therefore, I created a small anonymous survey for those who want to be interacting with me. On purpose, the link is at the bottom of the post, so when you answer the survey, you get my double appreciation, first for reaching the end of this post and second for answering the survey.

Take the survey here.

Conclusion

Most of us will have a challenging year ahead of us. Sharing and discussing challenges and experiences will help us all to be better in what we are doing. I look forward to our 2021 journey.

For those living in the Northern Hemisphere: This week, we had the shortest day, or if you like the dark, the longest night. This period has always been a moment of reflection. What have we done this year?

Rob Ferrone (Quick Release), the Santa on the left (the leftist), and Jos Voskuil (TacIT), the Santa on the right (the rightist), share in a dialogue their highlights from 2020

Wishing you all a great moment of reflection and a smooth path into a Corona-proof future.

It will be different; let’s make it better.

 

I am still digesting all the content of the latest PLM Roadmap / PDT Fall 2020 conference and the new reality that starts to appear due to COVID-19. There is one common theme:

The importance of a resilient and digital supply chain.

Most PLM implementations focus on aligning disciplines internally; the supply chain’s involvement has always been the next step. Perhaps now it is time to make it the first step? Let’s analyze.

No Time to Market improvement due to disconnected supply chains?

During the virtual fireside chat at the PLM Roadmap/PDT conference, Marc Halpern shared a small bonus. You can read the full story here – the quote:

Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy of product data and product development. They did not see so much a reduced time to market or reduced product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor the number of changes.

Of course, he spoke about fast-moving industries where the interaction was done in a disconnected manner. Gartner believes that the cloud would, for sure, start creating these benefits of a reduced time to market and cost of change when the supply chain is connected.

Therefore I want to point again to an old McKinsey article named The case for Digital Reinvention, published in February 2017. Here the authors looked at the various areas of investment in digital technologies and their ROI.  See the image on the left for the areas investigated and the percentage of companies that invested in these areas at that time.

In the article, you will see the ROI analysis for these areas. For example, the marketing and distribution investments did not necessarily have a positive ROI when disconnected from other improvement areas. Digital supply chains were mentioned as the area with potentially the highest ROI. However, another important message in the article for all these areas is: you need to have a complete digitization strategy. This is a point I fail to see in many companies. Often one area gets all the attention; however, as it remains disconnected from the rest, the real efficiencies are not there. The McKinsey article ends with the conclusion that the digital winners at that time were the ones with bold strategies:

“we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more and more broadly and boldly than other companies do.”

The “connected” supply chain

Image: A&D Action Group – Global Collaboration

Of course, the traditional industries that invented PLM have invested in a kind of connected supply chain. However, is it really a connected supply chain? Aerospace and Defense companies had their supplier portals.

A supplier had to download their information or upload their designs combined with additional metadata.

These portals were completely bespoke and required “backbreaking” manual work on both sides to create, deliver, and validate the required exchange packages. The OEMs were driving the exchange process. By this custom approach, they more or less made it difficult for suppliers to have their own PLM-environment. The downside of this approach was that the supplier had separate environments for each OEM.

In 2006, I worked with SmarTeam on the concept of the “Supply Chain Express,” an offering that allowed a supplier to have their own environment, using SmarTeam as a PDM/PLM-system and the Supply Chain Express package to create intelligent import and export packages. The content was all based on files and configurable metadata, depending on the OEM-supplier relation.

Some other PLM-vendors or implementers have built similar exchange solutions to connect the world of the OEM and the supplier.

The main characteristic was that it was file-based with custom metadata, often in an XML-format or otherwise using Excel as the metadata carrier.

In my terminology of Coordinated – Connected, this would be Coordinated and “old school.”

 

The “better connected” supply chain

As I mentioned in my previous post about the PLM Roadmap/PDT Fall conference,  Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As part of the activities, they classified the collaboration between the OEM and the supplier in 3 levels, as you can see from the image:

This post mainly focuses on the L1 collaboration as this is probably the most used scenario.

In the Aerospace and Automotive industries, the data exchange between OEMs and suppliers has improved in two ways by using Technical Data Packages where the content is supported by Model-Based Definition.

The first advantage of Model-Based Definition is mainly a consistent information package in which the model is leading. The manufacturing views are explicitly defined on the 3D Model. Therefore, there is a reduced chance of a mismatch between the “drawings” and the 3D Model.

Model-Based Definition still does not solve working with the latest (approved) version of the information. This remains a “human-based” process in this case, and Katheryn Bell confirmed this was the biggest problem to solve.

The second advantage of using one of the interoperability standards for Model-Based Definition is the decoupling of application-specific data between the OEM side and the supplier side.

A significant advantage of Model-Based Definition is that there are a few interoperability standards, i.e., ISO 10303 (STEP), ISO 14306 (JT), and ISO 32000/14739 (PRC for 3D PDF). In the end, the ideal would be that these standards merge into one standard, completely vendor-independent and with a clearly defined scope and purpose.

The benefit of these standards is also they increase the longevity of product data as the information is stored in an application-independent format. As long as the standard does not change (fast), storing data even internally in these neutral formats can save upgrade or maintenance costs.

However, I think you all know the joke below.

 

The connected supply chain

The ultimate goal in the long term will be the connected supply chain. Information shared between an OEM and a supplier does not require human-based interfaces to ensure everyone works with the correct data.

The easiest way, and this is what some of the larger OEMs have done, is to consider suppliers as part of your PLM-infrastructure and give them access to all relevant data in the context of the system, the product, or the part they are responsible for. For the OEM, the challenge will be to connect suppliers – to motivate and train them to work in this environment.

For the supplier, the challenge is their IP-management. If they work 100 percent in the OEM-environment, everything is exposed. If they want to work in their own environment, there is probably double work and a disconnect.

Of course, everything depends on the complexity of your interaction with the supplier.

With its Fusion Cloud Product Lifecycle Management (PLM), Oracle was one of the first to shift the attention to the connected supply chain.

If you search for PLM on the Oracle website, you will find it under Fusion Supply Chain and Manufacturing. It is a logical step, as traditional ERP-vendors have never provided a full, rich portfolio for product design. CAD-integrations do not get a focus, and the future path to Model-Based approaches (MBSE/MBD/MBE) is not visible at all.

This is almost similar to what the Siemens-SAP alliance is showing. SAP more or less confirms that for more advanced PLM-scenarios you should not rely on SAP PLM but on Siemens’ offering.

For less complex but fast-moving products, for example in the apparel industry, you see that the promise of connecting all suppliers in one environment lies in time to market and traceability. This industry does not suffer from products with a long lifecycle with upgrades and services.

So far, the best collaboration platform in the cloud I have seen is ShareAspace from Eurostep. Its foundation, based on the PLCS standard, allows an OEM and a supplier to connect through their “shared space” – you can look at their supply chain offering here.

Slide: PDT Europe 2016 RENAULT PLM Challenges

In the various PDT-conferences, we have seen how even two OEMs could work in a joint environment (Renault-Nissan-Daimler) or how BAE Systems used the ShareAspace environment to collaborate and consolidate all the data coming from the various system suppliers into one standards-based environment.

In 2021, I plan to write a series of blog posts related to possible add-on services for PLM. Supplier collaboration platforms, Configuration Management, End-to-end configurators, Product Information Management, are some of the themes I am currently exploring.

Conclusion

COVID-19 has illustrated the volatility of supply chains. Changing suppliers and working with suppliers in the traditional ways still hinder reducing time to market. However, the promise of a truly connected supply chain is enormous. As Boeing demonstrated in my previous post and explained in this post, standards are needed to become future-proof.

Will 2021 have more focus on the connected supply chain?

 

Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having digested now most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.

This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the Multi-view BOM, Supply Chain Collaboration, and MBSE Data Interoperability.

These sessions were nicely wrapped with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management and Jeff Plant (Boeing) sharing their Model-Based Engineering strategy.

I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Is there not an easier way to achieve the same results?

Nicely formulated by Ilan Madjar as a comment to my first post:

Ilan makes a good point about simplifying the ideas to the masses to make it work. The majority of companies probably do not have the bandwidth to invest and understand the future benefits of a digital thread or digital twins.

This does not mean that these topics should not be studied. If your business is in a small, simple eco-system and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.

However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.

In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.

This process of standardization is crucial if you want to have a sustainable, connected enterprise. In the end, the push from these companies will lead to standards, allowing the smaller companies to adhere to or connect with them.

The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!

Global Collaboration – Defining a baseline for data exchange processes and standards

Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.

Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will have a misunderstanding in communication.

This happens even more in smaller businesses that sometimes just pick up (buzz) terms without a full understanding.

Several years ago, I talked with a PLM-implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared that in reality they were attempting configuration management. The confusion was massive. A standard, common terminology is crucial in our domain, even if it seems academic.

The group has been analyzing interoperability standards and standards for long-term archival and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to collaborative business relationship management systems.

In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.

Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.

The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.

You can find the proceedings from the Global Collaboration working group here.

 

Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?

I loved how Alberto Ferrari (Raytheon) described the value of a model-based digital thread, positioning it in a targeted enterprise.

Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.

Alberto’s presentation was a kind of mind map from how I imagine the future, and it is a pity if you have not had the chance to see his session.

Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implement digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.

Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.

More A&D Action Groups

There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.

Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.

As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving.  I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?

More information about their progress can be found here.

Mark Williams (Boeing) reported the first findings of the A&D Model-Based Systems Engineering action group related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution. An interesting topic to follow, as the promise of MBSE is connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.

For those interested in their progress, more information can be found here

Model-Based Engineering @ Boeing

In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.

In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can be manufacturing, service, or technology partnerships. Therefore, as you can imagine, and as discussed by others during the conference, streamlined collaboration and traceability are crucial.

The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.

Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.

I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. A standard certainly to follow as it brings standardization on top of existing standards.

As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.

Jeff presented the slide below, showing their standardization internally around federated platforms.

This slide resembles a lot the future platform vision I have been sharing since 2017 when discussing PLM’s future at PLM conferences, when explaining the differences between Coordinated and Connected – see also my presentation here on Slideshare.

You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.

In reality, the flow of information through feedback loops will be there too.

The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics and vendors to invest and come to industry standards for Model-Based System Engineering practices.  The time is now to act on this domain.

It reminded me again of Marc Halpern’s message mentioned in my previous post (part 1) that we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.

Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.

Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.

The team is confident that the business case for such an investment is firm and stable; however, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.

The virtual fireside chat

The conference ended with a virtual fireside chat, from which I picked up an interesting point that Marc Halpern brought in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in the accuracy of product data and in product development. They did not see much of a reduction in time to market or costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor did the number of changes.

Marc believes that this topic will really show benefits in the future with the cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2% of the companies interviewed were investing in a digital supply chain. At the same time, the expected benefits in this area would have the most significant ROI.

The good news, there is consistency, and we know where to focus for early results.

Conclusion

It was a great conference as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneaky preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members in the CIMdata A&D action groups as they provide the groundwork for all of us – sooner or later.

Last week I was happy to attend the PLM Roadmap / PDT Fall 2020 conference as usual organized by CIMdata and Eurostep. I wrote about the recent PI DX conference, which touched a lot on the surface of PLM and Digital Transformation. This conference is really a conference for those who want to understand the building blocks needed for current and future PLM.

This conference usually has approximately 150 users on-site; this time, there were over 250 connected users for three (half) days, many of us following every session. As an active participant in the physical events, it was a little disappointing not to be in the same place as the other participants this time. The informal network meetings at this conference have always been special, thanks to a relatively small but stable group of experts. Due to the slightly reduced schedule, there was less attention this time for some of the typical PDT-topics, most of the time coming from Sweden and related to sustainability.

The conference’s theme, Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality, might sound like a marketing statement. However, the content presented was much more detailed than just marketing info. The fact that you watched the presentations on your screen made it an intense conference with many valuable details.

Have a look at the agenda, and I will walk you through some of the highlights for me. As there was so much content to discuss, I will share this time part 1. Next week, in part 2, you will see the coherence of all the presentations.

As if there was a Coherent Thread.

Digital Twin, It Requires a Digital Thread

The keynote from Peter Bilello, President & CEO of CIMdata, with the title Digital Twin, It Requires a Digital Thread, was immediately an illustration of discussing reality. When I stated at the Digital Twin conference in the Netherlands that “Digital Twins do not run on Documents”, it had the same meaning as when Peter stated, “A Digital Twin without a Digital Thread is an orphan”.

Digital Thread

And Peter’s statement, “All companies do PLM, most of the time however disconnected”, is another way to stimulate companies working in a connected manner.

As usual, Peter’s session was a good overview of the various aspects related to the Digital Thread and Digital Twin.

Digital Twin

The concept of a virtual twin is not new. The focus is now, as mentioned before, more on the term “Connected”. Peter provided the CIMdata definitions for Digital Thread and Digital Twin. Click on the images to the left to read the full definitions.

Peter’s overview also referred to the Boeing Diamond, illustrating the mapping of the physical and the virtual world, connected through a Digital Thread, and the various Digital Twins that can exist. The Boeing Diamond was one of the favorites during the conference.

When you look at Peter’s conclusions, there is an alignment with what I wrote in the post: A Digital Twin for Everyone and the fact that we need to strive for a connected enterprise. Only then we can benefit from a Digital Twin concept.

 

The Multi-view BOM Solution Evaluation
– Process, Results, and Industry Impacts

The reports coming from the various A&D PLM action groups are always engaging sessions to watch. Here, nine companies, even competitors, discuss and explore PLM themes between themselves supported by CIMdata.

These companies were the first to implement PLM; it is interesting to watch how they move forward like supertankers. They cannot jump every year to a new fashionable hype. Their PLM-infrastructure needs to be consistent and future-proof due to the longevity of their data and the high standards for regulatory compliance and safety.

However, these companies are also pioneers for the future. They have been practicing Model-Based approaches for over ten years already and are still learning. In next week’s post, you will read later that these frontrunners are pushing for standards to make a Model-Based future affordable and achievable.

In that context, the action group Multi-View BOM shared their evaluation results for a study related to the multi-view BOM. A year ago, I wrote about this topic when Fred Feru from Airbus presented the intermediate results at the CIMdata Roadmap/PDT 2019 conference.

Dan Ganser (Gulfstream) and Javier Reines (Airbus) presented the findings. The conclusion was that the four vendors evaluated, i.e., Aras, Dassault Systèmes, PTC, and Siemens, all passed the essential requirements and use cases. You can find the report and the findings here: Multi-view Bill of Materials

One interesting remark.

When the use cases were evaluated, the vendors could score on a level from 0 to 5, see picture. Interesting to see that apparently, it was possible to exceed the requirement, something that seems like a contradiction.

In particular, in this industry, where formal requirements management is a must – either you meet a requirement or not.

Dan Ganser explained that the current use cases were defined based on the minimum expectations; therefore, there was the option to exceed the requirement. I would still be curious to see what it means to exceed the requirement. Is it usability, time, or something innovative we might have missed?

 

5G for Digital Twins & Shadows

I learned a lot from the presentation from Niels Koenig, working at the Fraunhofer Institute for Production Technology. Niels explained how important 5G is for realizing the Industry 4.0 targets. At the 5G Industry Campus, several projects are running to test and demonstrate the value of 5G in relation to manufacturing.

If you want to get an impression of the 5G Industry Campus, click on the YouTube movie.

One of the examples Niels discussed was closed-loop manufacturing. Thanks to the extremely low latency (< 1 ms), a connected NC machine can send real-time measurements to be compared with the expected values. For example, in the case of resonance, the cutting might not be smooth. Thanks to the closed loop, the operator will be able to intervene or adjust the operation. See the image below.
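A simplified sketch of the closed-loop idea in Python: compare streamed measurements against expected values and flag deviations (for example, vibration that hints at resonance) quickly enough for the operator or controller to adjust the operation. The values and threshold are invented for illustration; this is my own rendering of the principle, not Fraunhofer’s actual implementation.

```python
from typing import Iterable

EXPECTED_VIBRATION = 0.8     # hypothetical nominal vibration level
TOLERANCE = 0.2              # allowed deviation before we intervene

def monitor(measurements: Iterable[float]):
    """Closed loop: each incoming measurement is checked against the expected value.
    With sub-millisecond network latency (the 5G promise), the check can happen
    while the cut is still in progress."""
    for step, value in enumerate(measurements):
        deviation = abs(value - EXPECTED_VIBRATION)
        if deviation > TOLERANCE:
            # In a real setup this would trigger the machine controller or alert the operator.
            yield step, f"deviation {deviation:.2f} exceeds tolerance - adjust feed rate"

# Example stream: vibration rises when resonance occurs around step 3.
stream = [0.78, 0.82, 0.85, 1.15, 1.20, 0.81]
for step, alert in monitor(stream):
    print(f"step {step}: {alert}")
```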

Digital Thread: Be Careful What you Wish For, It Just Might Come True

I was looking forward to Marc Halpern‘s presentation. Marc often brings a less technical viewpoint but a more business-related viewpoint to the discussion. Over the past ten years, there have been many disruptive events, most recently the COVID-pandemic.

Companies are asking themselves how they can remain resilient. Marc shared some of his thoughts on how Digital Twins and Digital Threads can support resilience.

In that context, Gartner saw a trend that their customers are now eagerly looking for solutions related to Digital Twin, Digital Thread, and Model-Based approaches, combined with the aim to move to the cloud. Related to Digital Thread and Digital Twin, most of Gartner’s clients are looking for traceability and transparency along the product lifecycle. Most Digital Twin initiatives focus on a twin of operational assets, particularly inside the manufacturing facility, nicely linking to Niels Koenig’s session related to 5G.

Marc stated that there seems to be a consensus that a Digital Thread is compelling enough for manufacturers to invest in; in the end, they will have to. However, there are also significant risks involved. Marc illustrated the two extremes; in reality, companies will end up somewhere in the middle, as illustrated later by Jeff Plant from Boeing. The image on the left is a sneak preview for next week.

When discussing the Digital Thread, Marc again referred to it more as a Digital Net, a kind of connected infrastructure for different threads, each based on its own area of interest.

Here I show a slide from Marc’s presentation at the PDT conference in 2018. It is more of an artist’s impression of the same concept that was discussed again during this conference: the Boeing Diamond.

Related to the risk of implementing a Digital Thread and Digital Twin, Marc showed another artistic interpretation: the two extreme potential end states of Digital Thread investment. Marc shared the critical risks for both options.

For the Vendor Black Hole, his main points were that if you choose a combined solution, you face diminished negotiating power, higher implementation costs, and the risk that innovative ideas might not be implemented because they are not so relevant for the vendor. They have the power!

As examples of combined solutions, Marc mentioned the recently announced SAP-Siemens partnership, the Rockwell Automation-PTC partnership, the Schneider Electric-Aveva partnership, and the ABB-Dassault Systèmes partnership.

Once you are in the black hole, you cannot escape. Therefore, Marc recommended making sure you do not depend on just a few vendors for your Digital Twin infrastructure.

The picture on the left illustrates the critical risks of the Enterprise Architecture “Mess”. It is a topic that I have been following for a long time. Suppose you have a collection of services related to the product lifecycle, like Workflow-services, 3D Modeling-services, BOM-services, and Manufacturing-services.

Together they could provide a PLM-infrastructure.

The idea behind this is that, thanks to openness and connectivity, every company can build its own unique enterprise architecture. There is no discussion about standard best practices; you build your company’s own best practices (for the future, or only for the current state?).

It is mainly promoted as a kind of bottom-up PLM. If you are missing capabilities, just build them yourself, using REST services, APIs, and Low-Code platforms. It seems attractive for smaller enterprises; however, most of the time, only for a short time. I fully concur with the risks Marc identified here, as the sketch below tries to illustrate.
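To show what such a bottom-up, service-based PLM infrastructure looks like in practice, here is a minimal sketch in Python that glues two hypothetical REST services together into one combined product view. All endpoints, service names, and payload shapes are assumptions for illustration only; the point is that the mapping between the services lives in this glue code, which is exactly where the risks Marc identified (maintenance, governance, hidden costs) accumulate.

```python
# A minimal sketch of the "bottom-up PLM" idea: an overarching script that
# stitches together independent services (BOM, workflow) through their REST
# APIs. All endpoints and payloads are hypothetical, purely to illustrate
# the integration pattern and why it needs governance.

import requests  # assumed available; any HTTP client would do

BOM_SERVICE = "https://bom.example.com/api/v1"
WORKFLOW_SERVICE = "https://workflow.example.com/api/v1"


def released_bom(product_id: str) -> list[dict]:
    """Fetch the released BOM lines for a product from the BOM service."""
    resp = requests.get(
        f"{BOM_SERVICE}/products/{product_id}/bom",
        params={"baseline": "released"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["lines"]


def pending_changes(product_id: str) -> list[dict]:
    """Fetch open change requests for the product from the workflow service."""
    resp = requests.get(
        f"{WORKFLOW_SERVICE}/changes",
        params={"product": product_id, "state": "open"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["changes"]


def combined_view(product_id: str) -> dict:
    """Combine both sources into one product view.

    Each service owns its own data model, so this glue code is where the
    mapping (and the hidden maintenance cost) lives.
    """
    return {
        "product": product_id,
        "released": released_bom(product_id),
        "planned_changes": pending_changes(product_id),
    }
```

Every additional service multiplies the number of these mappings, which is one reason the approach tends to remain attractive only for a short time.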

As I have often illustrated in presentations related to a digital future, you will need a mix of both. Based on your point of focus, you could imagine five major platforms being connected to cover all aspects of a business. Depending on your company’s business model and products, one of them might be the dominant one. With my PLM focus, this would be the Product Innovation Platform, where the business is created.

Marc ended with five priorities to enable long-term Digital Thread success.

  • First of all, set the ground rules for data governance. A topic often mentioned, but is your company actively engaging in it already?
  • Next, learn from Model-Based Systems Engineering as a foundation for a Model-Based Enterprise. A topic often discussed during the previous CIMdata Roadmap / PDT conference.
  • The change from storing and hiding information in silos towards an infrastructure and mindset of searching and accessing data, in particular access to the Bill of Materials.

The last point led to two more priorities.

  • The need for an open architecture and standards. We would learn more on this topic on day 3 of the conference.
  • Make sure your digital transformation sticks within the organization by investing in and executing organizational change management.

Conclusion

The words “Digital Thread” and “Digital Twin” are mentioned 18 times in this post and during the conference even more. However, at this conference, they were not hollow marketing terms. They are part of a dictionary for the future, as we will see in next week’s post when discussing some of the remaining presentations.

Closing this time with a point we all agreed upon: “A Digital Twin without a Digital Thread is an orphan”. Next week more!

About a year ago, we started the PLM Global Green Alliance, further abbreviated as the PGGA. Rich McFall, the main driver behind the PGGA, started the website The PLM Green Alliance to have a persistent place to share information.

Also, we launched the PLM Global Green Alliance LinkedIn group to share our intentions and create a community of people who would like to share knowledge through information or discussion.

Our mission statement is:

The mission of the new PLM Green Alliance is to create global connection, communication, and community between professionals who use, develop, market, or support Product Lifecycle Management (PLM) related technologies and software solutions that have value in addressing the causes and consequences of climate change due to human-generated greenhouse gas emissions. We are motivated by the technological challenge to help create a more sustainable and green future for our economies, industries, communities, and all life forms on our planet that depend on healthy ecosystems.

My motivation

My personal motivation to support and join the PGGA was driven by the wish to combine my PLM world with my interest in creating a more sustainable society for everyone around the world. It is a challenging combination. For example, PLM was born in the Aerospace and Defense industries, probably not the most sustainable industries.

Having worked with some companies in the Apparel and Retail industry, I have seen that these industries care more about their carbon footprint. Perhaps because they are “volume industries” closely connected to their consumers, these industries actively build practices to reduce their carbon footprint and their impact on society. The sense or nonsense of recycling is one such topic to discuss and analyze.

At that time, I got inspired by a session during the PLM Roadmap / PDT 2019 conference.

Graham Aid‘s session from the Ragn-Sells group was a call to action. Sustainability and a wealthy economy go together; however, we have to change our habits and thinking patterns. You can read my review of this session in this blog post: The weekend after PLM Roadmap / PDT 2019 – Day 1

Many readers of this post have probably never heard of the Ragn-Sells group or followed up on the call to action. I have the same challenge: it is hard to look beyond day-to-day business (the old ways of working) and give priority to exploring and learning more about applying sustainability in my PLM practices.

And then came COVID-19.

I think most of you have seen the image on the left, which started as a joke. However, looking back, we all have seen that COVID-19 has led to a tremendous push for using digital technologies to modernize existing businesses.

Personally, I was used to traveling to a customer every 2 – 3 weeks; now I have left my home office only twice for business. Meanwhile, I invested in better communication equipment and a place to work. And hey, it remains possible to work and communicate with people.

However, onboarding new people and getting to know them takes more social interaction than a camera can bring.

In the PGGA LinkedIn community, we had people joining from all over the world. With some of the active members, we started to organize video meetings to discuss their expectations of and interest in this group.

We learned several things from these calls.

First of all, finding a single timeslot in which everyone worldwide can participate is a challenge. A late Friday afternoon is almost midnight in Asia and morning in the US. And whether Friday is the best day, we do not know yet.

Secondly, we realized that posts published in our LinkedIn group did not appear in everyone’s LinkedIn feed due to LinkedIn’s algorithms. For professionals, LinkedIn is becoming less and less attractive, as the algorithms seem to prefer frequency/spam over content.

For that reason, we are probably moving to the PLM Green Alliance website and will combine this environment with a space for discussion outside the LinkedIn scope. More to come on the PGGA website.

Finally, we will organize video discussion sessions and ask the participants to prepare themselves for the discussion. Any member of the PGGA can bring in discussion topics.

It might be a topic you want to clarify or better understand.

What’s next

For December 4th, we have planned a discussion meeting related to the Exponential Roadmap 2019 report, in which 36 solutions to halve carbon emissions by 2030 are discussed. In our video discussion, we want to focus on the chapter Digital Industries.

We believe that this topic comes closest to our PLM domain and hope that participants will share their thinking and potential activities within their companies.

You can download the Exponential Roadmap here or by clicking on the image. You will find more details about the PLM Global Green Alliance in the LinkedIn group. If you want to participate, let us know.

The PGGA website will be the place where more and more information is collected per theme, to help you understand what is happening worldwide, and the place where you can contribute by letting us know what is happening on your side.

Conclusion

The PLM Global Green Alliance has now existed for a year and has 192 members. With approximately five percent of the members being active, we have the motivation to grow our efforts and value. We learned from COVID-19 that there is a need to become proactive, as the costs of prevention are always lower than the costs of (trying to) fix things afterward.

And each of us has the challenge to behave a little differently than before.

Will you be one of them?
