You are currently browsing the tag archive for the ‘Data centric’ tag.

With great pleasure, I am writing this post, part of a tradition that started for me in 2014: posts starting with “The weekend after …” describing what happened during a PDT conference. Later, the event merged with the CIMdata PLM Road Map conference, becoming THE PLM event for discussions beyond marketing.

For many of us, this was the first in-person conference since COVID-19 hit in 2020 – a 3D (in-person) conference instead of a 2D (digital) one. With approximately 160 participants, the conference showed that we wanted to meet and network in person, and the enthusiasm and interaction were great.

The conference’s theme, Digital Transformation and PLM – a call for PLM Professionals to redefine and re-position the benefits and value of PLM, was quite open.

There are many areas where digitization affects the way to implement a modern PLM Strategy.

Now some of my highlights from day one. I had to filter to stay around a maximum of 1500 words. All the other sessions, including the sponsor vignettes, were also informative and increased the value of this conference.


Digital Skills Transformation -Often Forgotten Critical Element of Digital Transformation

Day 1 started traditionally with the keynote from Peter Bilello, CIMdata’s president and CEO. In recent conferences, Peter has focused on explaining CIMdata’s critical dozen (image below). If you are unfamiliar with them, there is a webinar on November 10 where you can learn more.

All twelve are equally important; it is not a sequence of priorities. This time Peter spent more time on Organisational Change Management (OCM), number 12 of the critical dozen – or, as stated, the Digital Transformation’s Achilles heel. Although we always mention that people are important, in our implementation projects they often seem to be the topic that gets the least focus.

We all agree on the statement: People, Process, Tools & Data. Often the reality is that we start with the tools, try to build the processes and push the people into these processes. Is it a coincidence that even CIMdata puts Digital Skills Transformation at number 12? An unconscious bias?

This time, the people’s focus got full attention. Peter explained the need for a digital skills transformation framework to educate, guide and support people during a transformation. The concluding slide below says it all.


Transformation Journey and PLM & PDM Modernization to the Digital Future

The second keynote of the day was from Josef Schiöler, Head of Core Platform Area PLM/PDM from the Volvo Group. Josef and his team have a huge challenge as they are working on a foundation for the future of the Volvo Group.

The challenge is that it has to provide the foundation for new business processes and for the various group members, as the image below shows:


As Josef said, it is really the heart of the heart, crucial for the future. Peter Bilello referred to this project as open-heart surgery while the patient is still active, as the current business must go on too.

The picture below gives an impression of the size of the operation.

And like any big transformation project, the Volvo Group also has many questions to explore, as there is no existing blueprint to use.

To give you an impression:

  • How to manage complex documentation with existing and new technology and solution co-existing?
    (My take: the hybrid approach)
  • How to realize benefits and user adoption with user experience principles in mind?
    (My take: Understand the difference between a system of engagement and a system of record)
  • How to avoid seeing modernization as a pure IT initiative and ensure that end-user value creation is visible while still keeping a focus on finalizing the technology transformation?
    (My take: think hybrid and focus first on the new systems of engagement that can grow)
  • How to efficiently partner with software vendors to ensure vendor solutions fit well in the overall PLM/PDM enterprise landscape without heavy customization?
    (My take: push for standards and collaboration with other similar companies – they can influence a vendor)

Note: My takes are just a starting point of the conversation. There is a discussion in the PLM domain, which I described in my blog post: A new PLM paradigm.

 

The day before the conference, we had a ½ day workshop initiated by SAAB and Eurostep where we discussed the various angles of the so-called Federated PLM.

I will return to that topic soon after some consolidation with the key members of that workshop.


Steering future Engineering Processes with System Lifecycle Management

Patrick Schäfer‘s presentation was different from what the title suggests. Patrick is the IT Architect for Engineering IT at ThyssenKrupp Presta AG. The company provides steering systems for the automotive industry, which is transforming from mechanical systems towards autonomous driving, e-mobility, car-to-car connectivity, and stricter safety and environmental requirements.

The steering system is becoming a system that depends on both hardware and software. And as they are current users of Agile PLM, the old Eigner PLM software, you can feel Martin Eigner’s spirit in the project.

I briefly discussed Martin’s latest book on System Lifecycle Management in my blog post, The road to model-based and connected PLM (part 5).

Martin has always been fighting for a new term for modern PLM, and you can see how conservative we are – sometimes for good reasons.

Still, ThyssenKrupp Presta has the vision to implement a new environment to support systems instead of hardware products. In addition, they had to work fast to upgrade their current, almost obsolete PLM environment to a new, supported environment.

The wise path they chose was first focusing on a traditional upgrade, meaning making sure their PLM legacy data became part of a modern (Teamcenter) PLM backbone. Meanwhile, they started exploring the connection between requirements management for products and software, as shown below.

From my perspective, I would characterize this implementation as the coordinated approach creating a future option for the connected approach when the organization and future processes are more mature and known.

A good example of a pragmatic approach.


Digital Transformation in the Domain of Products and Plants at Siemens Energy

Per Soderberg, Head of Digital PLM at Siemens Energy, talked about their digital transformation project that started 6 – 7 years ago. Knowing the world of gas- and steam turbines, it is a domain where a lot of design and manufacturing information is managed in drawings.

The ultimate vision from Siemens Energy is to create an Industrial Metaverse for its solutions as the benefits are significant.

Is this target too ambitious, like GE’s 2014 Industrial Transformation with Predix? Time will tell. I am sure you will soon hear more from Siemens Energy; therefore, I will keep it short. An interesting and ambitious program to follow.


Accelerating Digitalization at Stora Enso

Stora Enso is a Finnish company, a leading global provider of renewable solutions in packaging, biomaterials, wooden construction and paper. Their director of Innovation Services, Kaisa Suutari, shared Stora Enso’s digital transformation program, which started six years ago with a 10 million/year budget (some people started dreaming too). Great to have a budget, but then where to start?

In a very systematic manner, using an ideas funnel and always starting from the business need, they spend the budget along two paths, shown in the image below.

The interesting approach was in the upper path, which Kaisa focused on. Instead of starting with an analysis of how the problem could be addressed, they start by doing, then analyze the outcome and improve.

I am a great fan of this approach, as it will significantly reduce the time to maturity. After all, how much time is often wasted in conducting the perfect analysis?

Their Digi Fund process is a fast process to quickly go from idea to concept, to POC and to pilot, the left side of the funnel. After a successful pilot, an implementation process starts small and scales up.

There were so many positive takeaways from this session. Start with an MVP (Minimum Viable Product) to create value from the start. Next, celebrate failure when it happens, as this is the moment you learn. Finally, continue to create measurable value driven by people – the picture below says it all.

It was the second time I was impressed by Stora Enso’s innovative approach. During PI PLMx 2020 London, Samuli Savo, Chief Digital Officer at Stora Enso, gave us insights into their innovation process. At that time, the focus was a little bit more on open innovation with startups. See my post: The weekend after PI PLMx London 2020. An interesting approach for other businesses to make their digital transformation business-driven and fun for the people.


 A day-one summary

There was Kyle Hall, who talked about MoSSEC and the importance of this standard in a connected enterprise. MoSSEC (Modelling and Simulation information in a collaborative Systems Engineering Context) is the published ISO standard (ISO 10303-243) for improving the decision-making process for complex products. Standards are a regular topic for this conference, more about MoSSEC here.

There was Robert Rencher, Sr. Systems Engineer and Associate Technical Fellow at Boeing, talking about the progress that the A&D Action Group is making related to the Digital Thread and Digital Twins – sometimes raising more questions than answers as they try to make sense of the marketing definitions and what they mean for their businesses. You can find their latest report here.

There was Samrat Chatterjee, Business Process Manager PLM at the ABB Process Automation division. Their businesses are already quite data-driven; however, by embedding PLM into the organization’s fabric, they aim to improve effectiveness, manage a broad portfolio, and be more modular and efficient.

The day was closed with a CEO Spotlight, moderated by Peter Bilello. This time the CEOs were not coming from the big PLM vendors but from complementary companies with their unique value in the PLM domain. Henrik Reif Andersen, co-founder of Configit; Dr. Mattias Johansson, CEO of Eurostep; Helena Gutierrez, co-founder of Share PLM; Javier Garcia, CEO of The Reuse Company; and Karl Wachtel, CEO of XPLM, discussed their various perspectives on the PLM domain.

 

Conclusion

Already so much to say; sorry, I have reached the 1500-word target – you should have been there. Combined with the networking dinner after day one, it was a great start to the conference. Curious about day 2? Stay tuned, and your curiosity will be rewarded.

 

Thanks to Ewa Hutmacher, Sumanth Madala and Ashish Kulkarni, who shared their pictures of the event on LinkedIn. Clicking on their names will lead you to the relevant posts.

 

The summer holidays are over, and with the PLM Global Green Alliance, we are glad to continue with our series: PLM and Sustainability, where we interview PLM-related software vendors, talking about their sustainability mission and offering.

We talked with SAP, Autodesk, and Dassault Systèmes. This week we spoke with Sustaira, and soon we will talk with Aras.  Sustaira, an independent Siemens partner, is the provider of a sustainability platform based on Mendix.

SUSTAIRA

The interview with Vincent de la Mar, founder and CEO of Sustaira, was quite different from the previous interviews. In the earlier interviews, we talked with people driving sustainability in their company and software portfolio. Now with Sustaira, we were talking with a relatively new company with a single focus on sustainability.

Sustaira provides an open platform targeting purely sustainability by offering relevant apps and infrastructure based on Mendix.

Listen to the interview and discover the differences and the potential for you.

Slides shown during the interview and additional company information: Sustaira Overview 2022.

What we have learned

Using the proven technology of the Mendix platform allows you to build a data-driven platform focused on sustainability for your company.

As I wrote in my post: PLM and Sustainability, there is the need to be data-driven and connected with federated data sources for accurate data.

This is a technology challenge. Sustaira, as a young company, has taken up this challenge and provides various apps related to sustainability topics on its platform. Still, they remain adaptable to your organization.

Secondly, I like the concept that although Mendix is part of the Siemens portfolio, you do not need to have Siemens PLM installed. The openness of the Sustaira platform allows you to implement it in your organization independent of your PLM infrastructure.

The final observation – the rule of people, process, and technology – is still valid. To implement Sustaira in an efficient and valuable manner, you need to be clear in your objectives and sustainability targets within the organization. And these targets should be more detailed than the corporate statement in the annual report.

 

Want to learn more?

To learn more about Sustaira and the wide variety of offerings, you can explore any of these helpful links:

 

Conclusion

It was interesting to learn about Sustaira and how they started with a proven technology platform (Mendix) to build their sustainability platform. Being sustainable involves using trusted data and calculations to understand the environmental impact at every lifecycle stage.

Again we can state that the technology is there. Now it is up to companies to act and connect the relevant data sources to underpin and improve their sustainability efforts.

 

As human beings, we believe in the truth. We claim the truth. During my holiday in Greece, the question was, did the Greek Prime Minister tell the truth about the internal spy scandal?

In general, we can say, politicians never speak the real truth, and some countries are trying to make sure there is only one single source of truth – their truth. The concept of a Single Source Of Truth (SSOT) is difficult to maintain in politics.

On social media, such as Twitter and Facebook, people claim their own truth. Unfortunately, without any scientific background, people believe they know better than professionals, cherry-picking messages and statistics or even claiming non-existing facts.

This is nicely described by the Dunning-Kruger effect. Unfortunately, this trend will not disappear.

If you want to learn more about the impact of social media, read this long article from The Atlantic:  Why the Past 10 Years of American Life Have Been Uniquely Stupid. Although the article is about the US, the content is valid for all countries where social media are still allowed.

The PLM and CM domain is the only place where people still rely on the truth defined by professionals. Manufacturing companies depend on reliable information to design, validate, manufacture and support their products. Compliance and safe products require an accurate and stable product definition based on approved information. Therefore, the concept of SSOT is crucial along the product lifecycle.

The importance may vary depending on the product type. The difference in complexity between an airplane and a plastic toy, for example. It is all about the risk and impact of a failure caused by the product.

During my holiday, the SSOT discussion was sparked on LinkedIn by Adam Keating, and the article starts with:

The “Single Source of Truth (SSOT)” wasn’t built for you. It was built for software vendors to get rich. Not a single company in the world has a proper SSOT.

A bit provocative, as there is nothing wrong with software vendors being profitable. Profitability guarantees the long-term support of the software solution. Remember the PLM consolidation around 2006, when SmarTeam and MatrixOne (Dassault Systèmes), and Agile and Eigner & Partner (Oracle) were acquired, disappeared or switched to maintenance mode.

Therefore it makes sense to have a profitable business model or perhaps a real open source business model.

Still, the rest of the discussion was interesting, particularly in the LinkedIn comments. Adam mentioned the Authoritative Source of Truth (ASOT) as the new future. And although this concept becomes more and more visible in the PLM domain, I believe we need both. So, let’s have a look at these concepts.

 

Truth 1.0 – SSOT

Historically, manufacturing companies stored the truth in documents, first paper-based, later in electronic file formats and databases.

The truth consists of drawings, part lists, specifications, and other types of information.

Moreover, the information is labeled with revisions and versions for identification.

By keeping track of the related information through documents or part lists with significant numbers, a person in the company could find the correct corresponding information at any stage of the lifecycle.

Later, by storing all the information in a central (PLM) system, the impression might be created that this system is the Single Source Of Truth – the system Adam Keating agitated against in his LinkedIn post.

For many companies, though, the ERP system has been the SSOT (and still is), with all relevant engineering information copied into the ERP system as attached files. Documents are the authoritative, legal pieces of information that a company shares with suppliers, authorities, or customers. They can reside in PLM but also in ERP. Therefore, you need an infrastructure to manage the “truth.”

Note: The Truth 1.0 story is very much a hardware story.

Even for hardware, ensuring a consistent single version of the truth for each product remains difficult. In theory, the design specifications should match the manufacturing definition. The reality, however, shows that often this is not the case. Issues discovered during the manufacturing process are fixed in the plant – redlining the drawing – and are not always processed back by engineering.

As a result, Engineering and Manufacturing might have a different version of what they consider the truth.

The challenge for a service engineer in the field is often to discover the real truth. So the “truth” might not always be in the expected place – no guaranteed Single Source Of Truth.

Configuration Management is a discipline connected to PLM to ensure that the truth is managed so that as-specified, as-manufactured, and as-delivered information has been labeled and documented unambiguously. In other words, you could say Configuration Management(CM) is aiming for the Single Source Of Truth for a product.

If you want to read more about the relation between PLM and CM  – read this post: PLM and Configuration Management (CM), where I speak with Martijn Dullaart about the association between PLM and CM.

Martijn has his blog mdux.net and is the Lead Architect for Enterprise Configuration Management at our Dutch pride ASML. Martijn is also Chairperson I4.0 Committee IPX Congress.

Summarizing: The Single Source Of Truth 1.0 concept is document-based and should rely on CM practices, which require skilled people and the right methodology. In addition, some industries require Truth 1.0.

Others take the risk of working without solid CM practices, and although the PLM system might create the impression of an SSOT, this will not be the case, even for hardware only.

 Truth 2.0 – ASOT

Products have become more complex, mainly due to the combination of electronics and software. Their different lifecycles and the speed of change are hard to maintain using the traditional PLM approach of SSOT.

It will be impossible to maintain an SSOT, particularly if it is based on documents.

As CM is the discipline to ensure data consistency, it is important to look into the future of CM. At the end of last year, I discussed this topic with three CM thought leaders: Martijn Dullaart, Maxime Gravel and Lisa Fenwick shared with me what they believe will change. Read and listen here: The future of Configuration Management.


From the discussion, it became clear that managing all the details is impossible; still, you need an overarching baseline to identify the severity and impact of a change along the product lifecycle.

New methodologies can be developed for this, as reliable data can be used in algorithms to analyze a change impact. This brings us to the digital thread. According to the CIMdata definition used in the A&D digital twin phase 2 position paper:

The digital thread provides the ability for a business to have an Authoritative Source of Truth(ASOT), which is information available and connected in a core set of the enterprise systems across the lifecycle and supplier networks

The definition implies that, in the end, a decision is made on data from the most reliable, connected source. There might be different data in other locations. However, this information is less reliable. Updating or fixing this information does not make sense as the effort and cost of fixing will be too expensive and give no benefit.

Obviously, we need reliable data to implement the various types of digital twins.

As I am intrigued by the power of the brain – its strengths and weaknesses – the concept of ASOT can also be found in our brains. Daniel Kahneman’s book Thinking, Fast and Slow talks about the two systems/modes our brain uses. The fast one (System 1 – low energy usage) could be the imaginary SSOT, whereas the slow one (System 2 – high energy required) is the ASOT. The brain needs both, and I believe the same holds in our PLM domain.

A new PLM Paradigm

In this context, there is a vivid discussion about the System of Record and Systems of Engagement. I wrote about it in June (post: A new PLM paradigm); other authors name it differently, but all express a similar concept. Have a look at these recent articles and statements from:

  • Authentise – The challenge of cross-discipline collaboration …
  • Beyond PLM – When is the right time to change your PLM system (+ discussion)
  • CoLab – The Single Source Of Truth wasn’t built for you …
  • Fraunhofer Institute – Killing the PLM Monolith – the Emergence of cloud-native System Lifecycle Management (SysLM)
  • SAAB Group – Don’t mix the tenses. Managing the Present and the Future in an MBSE context
  • Yousef Hooshmand – From a Monolithic PLM Landscape to a Federated Domain and Data Mesh

If you want to learn more about these concepts and discuss them with some of the experts in this domain, come to the upcoming PLM Road Map & PDT Europe conference on 18-19 October in Gothenburg, Sweden. Have a look at the final agenda here.

Register before September 12 to benefit from a 15 % Early Bird discount, which you can spend on the dinner after day 1. I look forward to discussing the SSOT/ASOT topics there.


Conclusion

The Single Source Of Truth (SSOT) and the Authoritative Source of Truth (ASOT) are terms that illustrate the traditional PLM paradigm is changing thanks to digitization and connected stakeholders. The change is in the air. Now, the experience has to come. So be part of the change and discuss with us.

 

As promised, I will be enjoying my holidays in the upcoming month; still, there are a few points I want to share with you.

Not a real blog post, more an agenda and a set of questions for potential follow-up.

Here are five topics for the upcoming months, potentially also relevant and interesting for you. Have a look.

 

Peer Check

This week, the discussion I had with Adam Keating, CoLab’s CEO and founder, was published on their podcast channel, Peer Check. I mentioned their podcast in my last blog post as I was slowly discovering its content. I was impressed by the first episodes and listened to all of them last week.

Digesting the content of these episodes, I have the impression that we are following Adam’s or CoLab’s lifecycle: from understanding the market, the people and the industry towards the real collaboration topics, like MBD, their product offering and ultimately the connection with PLM. I am curious about what is next.

For me, discovering their podcast and being able to participate was an exciting learning moment. I am still waiting for the readers of this blog to mention their favorite podcasts.

Let us know in the comments.

PLM Global Green Alliance

With the PLM Global Green Alliance (PGGA), we plan to have monthly ZOOM discussions with our LinkedIn members, moderated by one of the PGGA core team members.

The idea of these sessions is that we pick a topic, the moderator sets the scene and then it is up to the members to discuss.

Participants can ask questions and bring in their points. In our understanding, many companies believe they have to do something about sustainability beyond writing it in their mission, but where and how to start?

So the PGGA discussion will be a place to get inspired and act.

Potential topics for the discussion are: Which technologies must I master to become more sustainable? How can I motivate my company to become truly sustainable? What is a lifecycle assessment (LCA), and how do I introduce it in my company? What is the circular economy, and what is needed to become more circular in the context of PLM?

If you like one of the topics, let us know in the comments or add your favorite discussion topic. More on the agenda in early September.

 

PGGA meets ….

In this series with PLM vendors and solution providers, we try to understand their sustainability drivers, their solutions, their roadmap and their perception of what is happening in the field. So far, SAP, Autodesk and Dassault Systèmes have contributed to these series. After the summer, we continue with two interviews:

Early in September, the PGGA will discuss sustainability with Sustaira. Sustaira is a Siemens partner, and they offer an all-in-one Sustainability platform, domain-specific Sustainability app templates, and custom Sustainability web and mobile initiatives. Expect the interview to be published early in September.

In the last week of September, the PGGA will meet with Aras in our series related to sustainability. Aras is one of the main PLM providers, and we will discuss sustainability even more with them, as you can read further on in this agenda. Expect the interview to be released by the end of September.

No actions here for you, just stay tuned in September with the PGGA.

 

CIMdata PLM Roadmap and PDT

On 18 and 19 October, the CIMdata PLM Road Map and PDT 2022 Conference is scheduled as an in-person event in Gothenburg.

The agenda is almost secured and can be found here.

It will be a conference with guidance from CIMdata and Eurostep, complemented by major Aerospace, Defense and Automotive companies sharing their experiences on the path towards a model-based and digital enterprise.

So no marketing but real content; however, there will also be forward-looking presentations related to new PLM paradigms and the relation to data and sustainability.

So if you are curious, come to this conference, as you will be triply rewarded: by the content, the keynotes and the discussions with your peers.

Register before September 12 to benefit from a 15 % Early Bird discount, which you can spend on the dinner after day 1. The conference dinner has always been a good moment for networking and discussion.

 

A Sustainable Future – Seize Opportunities When Someone Else Sees Costs

Last part of this agenda.

On October 25th, I will participate as a PGGA member in a webinar with Aras, discussing sustainability in more depth than in the standard PGGA interview mentioned earlier.

Here I will be joined by Patrick Willemsen from Aras. Patrick is the technical director of the Aras EMEA community, and together we will explore how companies aiming to deliver profitable products and solutions can also contribute to a more sustainable future for our planet.

Feel free to subscribe to this free webinar and discuss your thoughts with us in the Q&A session – here is the registration link.

 

Conclusion

No conclusion this time – all thinking is in progress and I hope to see your feedback or contribution to one of these events in person or through social media.

In recent weeks, I have had several discussions related to sustainability. What can companies do to become sustainable and prove it? Unfortunately, there is so much greenwashing at this moment.

Look at this post: 10 Companies and Corporations Called Out For Greenwashing.

Therefore I thought about which practical steps a company should take to prepare for a sustainable future, as the change will not happen overnight. It reminds me of the path towards a digital, model-based enterprise (my other passion). In my post Why Model-Based definition is important for all, I mentioned that MBD (Model-Based Definition) could be considered the first stepping-stone toward a Model-Based enterprise.

The analogy for Material Compliance came after an Aras seminar I watched a month ago. The webinar How PLM Paves the Way for Sustainability with  Insensia (an Aras implementer) demonstrates how material compliance is the first step toward sustainable product development.

Let’s understand why

The first steps

Companies that currently deliver solutions mostly only focus on economic gains. The projects or products they sell need to be profitable and competitive, which makes sense if you want a future.

And this would not have changed if the impact on our climate had not become apparent.

First, CFCs and hazardous materials led to new regulations. Next, global agreements to fight climate change – the Paris Agreement and more to come – have led and will lead to regulations that change how products are developed. All companies will have to change their product development and delivery models when this becomes a global mandate.

The required change is likely to happen. In Europe, the Green Deal is making steady progress. However, what will happen in the US remains a mystery, as even their Supreme Court has become a political entity working against sustainability (money first).

Still, compliance with regulations will be required if a company wants to operate in a global market.

What is Material Compliance?

In 2002, the European Union published a directive to restrict hazardous substances in materials. The directive, known as RoHS (Restriction of Hazardous Substances), was mainly related to electronic components. In the first directive, six hazardous materials were restricted.

The most infamous are Cadmium (Cd), Lead (Pb), and Mercury (Hg). Since 2006, all products on the EU market must pass RoHS compliance, and in 2011 RoHS compliance was connected to the CE marking of products sold in the European market.

In 2015, four additional chemical substances were added, mostly used for softening PVC but also affecting the immune system. Meanwhile, other countries have introduced similar RoHS regulations; therefore, we can see it as a global restriction. Read more here: The RoHS guide.

Consumers buying RoHS-compliant products can now be assured that none of the threshold values of these substances is exceeded in the product. The challenge for the manufacturer is to go through each of the components of the MBOM to understand if it contains one of the ten restricted substances and, if yes, in which quantity.

Therefore, they need to get that information – a RoHS declaration – from each relevant supplier.
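To make this check tangible, here is a minimal sketch in Python of validating one supplier declaration against the RoHS thresholds (0.1 % by weight per homogeneous material for most substances, 0.01 % for cadmium). The declaration values are invented for illustration; real declarations arrive in supplier-specific or standardized exchange formats.

# Minimal sketch: check one supplier declaration against RoHS limits.
# The declared values are illustrative; real declarations come from the
# supplier and apply per homogeneous material.
ROHS_LIMITS = {        # limits in weight-%
    "Cd": 0.01,        # Cadmium has the strictest limit
    "Pb": 0.1, "Hg": 0.1, "Cr6+": 0.1,
    "PBB": 0.1, "PBDE": 0.1,
    "DEHP": 0.1, "BBP": 0.1, "DBP": 0.1, "DIBP": 0.1,
}

def check_declaration(declared):
    """Return the substances that exceed their RoHS threshold."""
    return [s for s, pct in declared.items() if pct > ROHS_LIMITS.get(s, 0.1)]

# Hypothetical supplier declaration for one purchased part (weight-%).
declaration = {"Pb": 0.02, "Cd": 0.015, "Hg": 0.0}
violations = check_declaration(declaration)
print("RoHS compliant" if not violations else f"Exceeds limits: {violations}")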

Besides RoHS, additional regulations protect the environment and the consumer. For example, REACH (Registration, Evaluation, Authorization and Restriction of Chemicals) compliance deals with the regulations created to improve the environment and protect human health. In addition, REACH addresses the risks associated with chemicals and promotes alternative methods for the hazard assessment of substances.

The compliance process in four steps

Material compliance is most of all the job of engineers. Therefore around 2005, some of my customers started to add RoHS support to their PLM environment.

 

Step 1

The image below shows the simple implementation – the PDF form from the supplier was linked to the (M)BOM part.

An employee had to manually add the substances into a table and ensure the threshold values were not reached. But, of course, there was already a selection of preferred manufacturer parts during the engineering phase. Therefore RoHS compliance was almost guaranteed when releasing the EBOM.

But this process could be done more cleverly.

 

Step 2

So the next step was that manufacturers started to extend their PLM data model with additional attributes for RoHS compliance. Again, this could be done cleverly or extremely generically, adding the attributes to all parts.

So now, when receiving the material declaration, a person just has to add the substance values to the part attributes. Then, through either standard functionality or customization, a compliance report could be generated for the (M)BOM. So this already saves some work.
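As a rough sketch of what this step-2 automation could look like, assuming the substance values are stored as part attributes (the part numbers, attribute names and BOM structure below are hypothetical, not a specific vendor's data model):

# Sketch of a step-2 style compliance report: substance data stored as
# part attributes, rolled up per (M)BOM. All identifiers are invented.
ROHS_LIMITS = {"Cd": 0.01, "Pb": 0.1, "Hg": 0.1, "Cr6+": 0.1}  # weight-%

mbom = {"ASSY-100": ["PRT-001", "PRT-002", "PRT-003"]}

part_attributes = {                 # declared substances per part (weight-%)
    "PRT-001": {"Pb": 0.05},
    "PRT-002": {"Cd": 0.02},        # exceeds the 0.01 % cadmium limit
    "PRT-003": {},                  # nothing restricted declared
}

def compliance_report(assembly):
    """Per part, list the substances exceeding their RoHS limit."""
    report = {}
    for part in mbom[assembly]:
        violations = [s for s, pct in part_attributes.get(part, {}).items()
                      if pct > ROHS_LIMITS.get(s, 0.1)]
        if violations:
            report[part] = violations
    return report

print(compliance_report("ASSY-100"))   # -> {'PRT-002': ['Cd']}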

 

Step 3

The next step was to provide direct access to these attributes to the supplier and push the supplier to do the work.

Now the overhead for the manufacturer is reduced again, as the supplier does the job for their customer.

 

Step 4

In step 4, we see a real connected environment, where information is stored only once, referenced by manufacturers, and kept actual by the part suppliers.

Who will host the RoHS database? From some of my customer projects, I recall IHS as a data provider – it seems they are still in this business when you look at their website HERE.

 

Where is your company at this moment?

Having seen the four stepping-stones leading towards efficient RoHS compliance, you see the challenge of moving from a document-driven approach to a data-driven approach.

Now let’s look into the future. Concepts like Life Cycle Assessment (LCA) or a Digital Product Passport (DPP) will require a fully connected approach.

Where is your company at this moment – have you reached RoHS compliance step 3 or 4? It is a first step in learning to work in a connected and data-driven manner.

 

Life Cycle Assessment – the ultimate target

A lifecycle assessment, or lifecycle analysis (both abbreviated as LCA), is a methodology to assess the environmental impact of a product (or solution) through its whole lifecycle: from materials sourcing through manufacturing, transportation, usage and service to decommissioning. And by assessing, we mean doing so in a clear, verifiable, and shareable manner, not just guessing.
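Conceptually, an LCA rolls up impact indicators over these lifecycle stages. The minimal sketch below sums a single indicator (kg CO2-equivalent); all numbers are invented for illustration, and a real assessment follows standardized methods (e.g. ISO 14040/14044) using verified datasets.

# Minimal sketch of an LCA roll-up: one impact indicator (kg CO2e) summed
# over the lifecycle stages. All values are invented for illustration.
product_co2e = {
    "materials sourcing": 12.4,
    "manufacturing": 8.1,
    "transportation": 2.3,
    "usage": 40.0,            # often dominant for powered products
    "service": 1.2,
    "decommissioning": 0.8,
}

total = sum(product_co2e.values())
print(f"Cradle-to-grave footprint: {total:.1f} kg CO2e")
for stage, value in sorted(product_co2e.items(), key=lambda kv: kv[1], reverse=True):
    print(f"  {stage:<20} {value:5.1f} kg CO2e ({value / total:.1%})")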

Traditional engineering education does not teach these skills, although LCA is not new, as this 10-year-old YouTube video from Autodesk illustrates:

What is new is the global understanding that we are reaching the limits of what our planet can endure; we must act now. Upcoming international regulations will enforce life cycle analysis reporting for manufacturers and service providers. This will happen gradually.

Meanwhile, we should all work on a circular economy, the major framework for a sustainable planet – click on the image on the left.

In my post, I wrote about these combined topics: SYSTEMS THINKING – a must-have skill in the 21st century.

 

Life Cycle Analysis – Digital Twin – Digitization

The big elephant in the room is that introducing LCA in your company has a lot to do with the digitization of your company. Assessment data kept in documents requires too much human effort to maintain at the right quality. The costs are not affordable if your competitor is more efficient.

When it comes to the analysis part, a model-based, data-driven infrastructure is the most efficient way to run virtual analyses, using digital twin concepts at each stage of the product lifecycle.

Virtual models for design, manufacturing and operations allow your company to make low-cost trade-off studies before committing to the physical world. Around 80 % of the environmental impact of a product is determined by decisions made in the virtual world, i.e., during design.

Once you have your digital twins for each phase of the product lifecycle, you can benchmark your models with data reported from the physical world. All these interactions can be found in the beautiful Boeing diamond below, which I discussed before – Read A digital twin for everybody.

 

Conclusion

Efficient and sustainable life cycle assessment and analysis will come from connected information sources. The old document-driven paradigm is too costly and too slow to maintain, in particular when the scope is not just a subset of your product but, as with LCA, your full product and its full lifecycle. Another stepping stone towards the near future. Where are you?

 

Stepping-stone 1: From Model-Based Definition to an efficient model-based, data-driven enterprise.

Stepping-stone 2: From RoHS compliance to an efficient and sustainable model-based, data-driven enterprise.

A month ago, I wrote: It is time for BLM – PLM is not dead, which created an anticipated discussion. It is practically impossible to change a framed acronym. Like CRM and ERP, the term PLM is there to stay.

However, it was also interesting to see that people acknowledge that PLM should have a business scope and deserves a place at the board level.

The importance of PLM at business level is well illustrated by the discussion related to this LinkedIn post from Matthias Ahrens referring to the CIMdata roadmap conference CEO discussion.

My favorite quote:

“Now it’s ‘lifecycle management,’ not just EDM or PDM or whatever they call it. Lifecycle management is no longer just about coming up with new stuff. We’re seeing more excitement and passion in our customers, and I think this is why.”

But it is not that simple

This is a perfect message for PLM vendors to justify their broad portfolio. However, as they do not focus so much on new methodologies and organizational change, their messages remain at the marketing level.

In the field, there is more and more awareness that PLM has a dual role. Just when I planned to write a post on this topic, Adam Keating, CEO and founder of CoLab, wrote the post System of Record meet System of Engagement.

Read the post and the comments on LinkedIn. Adam points to PLM as a System of Engagement, meaning an environment where the actual work is done all the time. The challenge I see for CoLab, like other modern platforms, e.g., OpenBOM, is how it can become an established solution within an organization. Their challenge is they are positioned in the engineering scope.

I believe for these solutions to become established in a broader customer base, we must realize that there is a need for a System of Record AND System(s) of Engagement.

In my discussions related to digital transformation in the PLM domain, I addressed them as separate, incompatible environments.

See the image below:

Now let’s have a closer look at both of them

What is a System of Record?

For me, PLM has always been the System of Record for product information. In the coordinated manner of working, engineers were working in their own systems. At a certain moment in the process, they needed to publish shareable information, a document (e.g., PDF) or a BOM table (e.g., Excel). The PLM system would support New Product Introduction processes and Release and Change processes, and it would be the single point of reference for product data.

The reason I use the bin image is that companies, most of the time, do not have an advanced information-sharing policy. If the information is in the bin, the experts will find it. Others might recreate the same information elsewhere due to a lack of awareness.

Most of the time, engineers did not like PLM systems because of the integrations with their tools. Suddenly, they were losing a lot of freedom due to check-in/check-out, naming conventions, attributes and more. Current PLM systems are good for a relatively stable product, but what happens when the product has a lot of parallel iterations (hardware & software, for example)? How do you deal with Work In Progress?

Last week I visited the startup company PAL-V in the context of the Dutch PDM Platform. As you can see from the image, PAL-V is working on the world’s first Flying Car Production Model. Their challenge is to be certified for flying (here, the focus is on the design) and to be certified for driving (here, the focus is on manufacturing reliability/quality).

During the PDM platform session, they showed their current Windchill implementation, which focused on managing and providing evidence for certification. For this type of company, the System of Record is crucial.

Their (mainly) SolidWorks users are trained to work in a controlled environment. The Aerospace and Automotive industries have started this way, which we can see reflected in current PLM systems.

Image: Aras impression of the digital thread

And to finish with a PLM buzzword: modern systems of record provide a digital thread.

 

What is a System of Engagement?

The characteristic of a system of engagement is that it supports the user in real-time. This could be an environment for work in progress. Still, more importantly, all future concepts from MBSE, Industry 4.0 and Digital Twins rely on connected and real-time data.

As I previously mentioned, Digital Twins do not run on documents; they run on reliable data.

A system of engagement is an environment where different disciplines work together, using models and datasets. I described such an environment in my series The road to model-based and connected PLM. The System of Engagement environment must be user-friendly enough for these experts to work.

Due to the different targets of a system of engagement, I believe we have to talk about Systems of Engagement, as there will be several engagement models on top of a connected (federated) set of data.

Yousef Hooshmand shared the Daimler paper: “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” in that context. Highly recommended to read if you are interested in a potential PLM future infrastructure.

Let’s look at two typical Systems of Engagement without going into depth.

The MBSE System of Engagement

In this environment, systems engineering is performed in a connected manner, building connected artifacts that should be available in real-time, allowing engineers to perform analysis and simulations to construct the optimal virtual solution before committing to physical solutions.

It is an iterative environment. Click on the image for an impression.

The MBSE space will also be the place where sustainability needs to start. Environmental impact, the planet as a stakeholder,  should be added to the engineering process. Life Cycle Assessment (LCA) defining the process and material choices will be fed by external data sources, for example, managed by ecoinvent, Higg and others to come. It is a new emergent market.

The Digital Twin

In any phase of the product lifecycle, we can consider a digital twin, a virtual data-driven environment to analyze, define and optimize a product or a process. For example, we can have a digital twin for manufacturing, fulfilling the Industry 4.0 dreams.

We can have a digital twin for operations, analyzing, monitoring and optimizing a physical product in the field. These digital twins will only work if they use connected and federated data from multiple sources. Otherwise, the operating costs for such a digital twin will be too high (due to the inefficiency of obtaining accurate data).

In the end, you would like to have these digital twins running in a connected manner. To visualize the high-level concept, I like Boeing’s diamond presented by Don Farr at the PDT conference in 2018 – Image below:

Combined with the Daimler paper “From a Monolithic PLM Landscape to a Federated Domain and Data Mesh” or the latest post from Oleg Shilovitsky, How PLM Can Build Ontologies?, we can start to imagine a Systems of Engagement infrastructure.

 

You need both

And now the unwanted message for companies – you need both: a system of record and potentially one or more systems of engagement. A System of Record will remain as long as we are not all connected in a blockchain manner. So we will keep producing reports, certificates and baselines to share information with others.

It looks like the Gartner bimodal approach.

An example: If you manage your product requirements in your PLM system as connected objects to your product portfolio, you will and still can generate a product specification document to share with a supplier, a development partner or a certification company.
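As a small sketch of this example, assuming requirements are managed as connected data objects (the structure and fields below are hypothetical, not a particular PLM vendor's data model), the System of Record can still render a shareable specification document from them on demand:

# Sketch: requirements live as connected objects; a shareable specification
# document is generated from them on demand. All fields are hypothetical.
requirements = [
    {"id": "REQ-001", "title": "Operating temperature",
     "text": "The unit shall operate between -20 C and +60 C.",
     "status": "released", "linked_to": "PRODUCT-A"},
    {"id": "REQ-002", "title": "Ingress protection",
     "text": "The enclosure shall achieve IP67.",
     "status": "released", "linked_to": "PRODUCT-A"},
    {"id": "REQ-003", "title": "Draft requirement",
     "text": "To be refined.", "status": "in work", "linked_to": "PRODUCT-A"},
]

def specification_document(product):
    """Render the released requirements for a product as a plain-text spec."""
    lines = [f"Product specification - {product}", "=" * 40]
    for req in requirements:
        if req["linked_to"] == product and req["status"] == "released":
            lines.append(f"{req['id']}  {req['title']}: {req['text']}")
    return "\n".join(lines)

print(specification_document("PRODUCT-A"))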

So do not throw away your current System of Record. Instead, imagine which types of Systems of Engagement your company needs. Most Systems of Engagement might look like a siloed solution; however, remember they are designed for the real-time collaboration of a certain community – designers, engineers, operators, etc.

The real challenge will be connecting them efficiently with your System of Record backbone, preferably using standard interface protocols and standards.

 

The Hybrid Approach

For those of you following my digital transformation story related to PLM, this is the point where the McKinsey report from 2017 becomes relevant again.

 

Conclusion

The concepts are evolving and maturing for a digital enterprise using a System of Record and one or more Systems of Engagement. Early adopters are now needed to demonstrate these concepts to agree on standards and solution-specific needs. It is time to experiment (fast). Where are you in this process of learning?

 


While preparing my presentation for the Dutch Model-Based Definition solutions event, I had some reflections and experiences discussing Model-Based Definition, particularly in traditional industries. In the Aerospace & Defense and Automotive industries, Model-Based Definition has become the standard. However, other industries have big challenges in adopting this approach. In this post, I want to share my observations and bring some clarification about its importance.

 

What is a Model-Based Definition?

The Wiki-definition for Model-Based Definition is not bad:

Model-based definition (MBD), sometimes called digital product definition (DPD), is the practice of using 3D models (such as solid models, 3D PMI and associated metadata) within 3D CAD software to define (provide specifications for) individual components and product assemblies. The types of information included are geometric dimensioning and tolerancing (GD&T), component level materials, assembly level bills of materials, engineering configurations, design intent, etc.

By contrast, other methodologies have historically required the accompanying use of 2D engineering drawings to provide such details.

When I started to write about Model-Based Definition in 2016, the concept of a connected enterprise was not yet discussed. At that time, MBD mainly enhanced data sharing between engineering, manufacturing, and suppliers. The 3D model with PMI forms a data package for information exchange between these stakeholders.

The main difference is that the 3D Model is the main information carrier, connected to 2D manufacturing views and other relevant data, all connected in this package.

 

MBD – the benefits

There is no need to write a blog post related to the benefits of MBD. With some research, you find enough reasons. The most important benefits of MBD are:

  • the information is both human-readable and machine-readable, allowing the implementation of Smart Manufacturing / Industry 4.0 concepts (see the small sketch after this list).
  • the information relies on processes and data and is no longer dependent on human interpretation. This leads to better quality and less error-fixing late in the process.
  • MBD information is a building block for the digital enterprise. If you cannot master this concept, forget the benefits of MBSE and Virtual Twins. These concepts don’t run on documents.
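To illustrate the machine-readable point from the list above: PMI in an MBD package is structured data that software can consume directly, for example to check a measurement against a tolerance. The structure below is invented and heavily simplified; real exchanges rely on standards such as STEP AP242 or QIF.

# Illustration of the machine-readable aspect of MBD: the same tolerance a
# human reads on an annotated 3D model can be consumed by software.
# The structure is invented and simplified, not an exchange standard.
hole_feature = {
    "feature_id": "HOLE-17",
    "nominal_diameter_mm": 6.0,
    "tolerance_mm": {"lower": -0.00, "upper": +0.05},
    "material": "AlMg3",
}

def diameter_in_spec(measured_mm, feature):
    """Machine-check a measurement (e.g. from a CMM) against the PMI tolerance."""
    nominal = feature["nominal_diameter_mm"]
    tol = feature["tolerance_mm"]
    return nominal + tol["lower"] <= measured_mm <= nominal + tol["upper"]

print(diameter_in_spec(6.03, hole_feature))   # True  - within tolerance
print(diameter_in_spec(6.08, hole_feature))   # False - out of tolerance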

To help you discover the benefits of MBD described by others – have a look here:

 

MBD as a stepping stone to the future

When you are able to implement model-based definition practices in your organization and connect with your ecosystem, you are learning what it means to work in a connected manner. Even where the scope is limited, you already discover that working in a connected manner is not the same as mandating everyone to work with the same systems or tools. Instead, it is about new ways of working (skills & people), combined with exchange standards (which ones to follow).

Where MBD is part of the bigger model-based enterprise, the same principles apply for connecting upstream information (Model-Based Systems Engineering) and downstream information(IoT-based operation and service models).

Oleg Shilovitsky addresses the same need from a data point of view in his recent blog: PLM Strategy For Post COVID Time. He makes an important point about the Digital Thread:

Digital Thread is one of my favorite topics because it is leading directly to the topic of connected data and services in global manufacturing networks.

I agree with that statement, as the digital thread is, like MBD, another stepping stone to organize information in a connected manner, even beyond the scope of the engineering-manufacturing interaction. However, the Digital Thread is an intermediate step toward a fully data-driven and model-based enterprise.

To master all these new ways of working, it is crucial for the management of manufacturing companies, both OEMs and their suppliers, to initiate learning programs. Not as a Proof of Concept, but as a real-life, growing activity.

Why is MBD not yet a common practice?

If you look at the success of MBD in Aerospace & Defense and Automotive, one of the main reasons was the push from the OEMs to align their suppliers. They even dictated CAD systems and versions to enable smooth and efficient collaboration.

In other industries, there were not so many giant OEMs that could dictate their supply chain. Often, the OEM itself was not even ready for MBD. Therefore, the excuse was often: we cannot push our suppliers to work differently, so let's keep working as best as possible (the old way with some automation).

Besides the technical changes, MBD also had a business impact. Where the traditional 2D-Drawing was the contractual and leading information carrier, now the annotated 3D Model has to become the contractual agreement. This is much more complex than browsing through (paper) documents; now, you need an application to open up the content and select the right view(s) or datasets.

In the interaction between engineering and manufacturing, you could hear statements like:

you can use the 3D Model for your NC programming, but be aware the 2D drawing is leading. We cannot guarantee consistency between them.

In particular, this is a business change affecting the relationship between an OEM and its suppliers. And we know business changes do not happen overnight.

Smaller suppliers might even refuse to work on a Model-Based definition, as it is considered an extra overhead they do not benefit from.

This is particularly true when working with various OEMs that might each have their own preferred MBD package content based on their preferred usage. There are standards; however, OEMs often push for their preferred proprietary format.

It is about an orchestrated change.

Implementing MBD in your company, like PLM, is challenging because people need to be aligned and trained on new ways of working. In particular, this creates resistance at the end-user level.

Similar to the introduction of mainstream CAD (AutoCAD in the eighties) and mainstream 3D CAD (Solidworks in the late nineties), it requires new processes, trained people, and matching tools.

This is not always on the agenda of C-level people, who tend to avoid technical details (because they don’t understand them – read this great article: Technical Leadership: A Chronic Weakness in Engineering Enterprises).

I am aware of learning materials coming from the US, not so much about European or Asian thought leaders. Feel free to add other relevant resources for the readers in this post’s comments. Have a look and talk with:

Action Engineering with their OSCAR initiative: Bringing MBD Within Reach. I spoke with Jennifer Herron, founder of Action Engineering, a year ago about MBD and OSCAR in my blog post: PLM and Model-Based Definition.

Another interesting company to follow is Capvidia. Read their blog post to start with is MBD model-based definition in the 21st century.

The future

What you will discover from these two companies is that they focus on the connected flow of information between companies while anticipating that each stakeholder might have their preferred (traditional) PLM environment. It is about data federation.

The future of a connected enterprise is even more complex. So I was excited to see and download Yousef Hooshmand’s paper:  ”From a Monolithic PLM Landscape to a Federated Domain and Data Mesh”.

Yousef and some of his colleagues report about their PLM modernization project @Mercedes-Benz AG, aiming at transforming a monolithic PLM landscape into a federated Domain and Data Mesh.

This paper provides a lot of structured thinking related to the concepts I try to explain to my audience in everyday language. See my The road to model-based and connected PLM thoughts.

This paper has much more depth and is a must-read and must-discuss writing for those interested – perhaps an opportunity for new startups and a threat to traditional PLM vendors.

Conclusion

Vellum drawings are almost gone now – we have electronic 2D Drawings. The model-based definition has confirmed the benefits of improving the interaction between engineering, manufacturing & suppliers. Still, many industries are struggling with this approach due to process & people changes needed. If you are not able or willing to implement a model-based definition approach, be worried about the future. The eco-systems will only run efficiently (and survive) when their information exchange is based on data and models. Start learning now.

p.s. just out of curiosity:
If you are a model-based advocate, support this post with a

 

Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed to the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this had to do with the early introduction of PLM, which was a little more advanced than PDM.

The focus and capabilities were meant to make engineering teams happy by giving them more access to their data. Unfortunately, that did not work out, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is somewhat of a slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the different PLM vendor images, the Aras example above, I consider the digital thread a fancy name for traceability.

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.
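To illustrate what the quality of a relation could mean: a traceability link that only states "A is related to B" carries far less meaning than a typed, versioned link. The sketch below is purely illustrative; the link types and artifact names are my own assumptions, not a standard.

# Illustration of traceability link quality: an untyped link only says two
# artifacts are related; a typed link states how, and against which revision.
# Artifact names and link types are invented for illustration.
untyped_link = ("REQ-042", "TEST-007")   # related, but how? which revision?

typed_link = {
    "source": {"id": "REQ-042", "revision": "B"},
    "target": {"id": "TEST-007", "revision": "3"},
    "type": "is verified by",            # semantic meaning of the relation
    "created_by": "j.doe",
    "rationale": "Covers the temperature range requirement.",
}

# With typed links, an impact analysis can reason about a change:
if typed_link["type"] == "is verified by":
    print(f"Changing {typed_link['source']['id']} rev {typed_link['source']['revision']} "
          f"invalidates verification {typed_link['target']['id']}")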

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxime Gravel about the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. And as a result of the data-driven approach, a digital PLM environment could be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM)

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management to service and, if relevant, decommissioning. The bigger challenge is connecting even ecosystems to the same infrastructure, in particular suppliers & partners, but also customers. This is the new platform paradigm.

Some years ago, people stated that IoT was the new PLM (IoT is the new PLM – PTC 2017), or that MBSE was the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don't believe Digital Transformation points to a domain, more to an ongoing process that most companies have to go through. And because it is so commonly used, it becomes too vague for the specifics of our domain. I liked Monica Schnitger's LinkedIn post: Digital Transformation? Let's talk. There is enough to talk about; we have to learn and be more specific.

 

What is the difference?

The challenge is that we need more in-depth thinking about what a "digitally transformed" company would look like. What would be the impact on its business, its IT infrastructure, and its organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

I just finished recording a podcast with Nina Dar while writing this post. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a future data-driven environment behaves like a System of Engagement. In this environment, you can simulate assumptions and verify and qualify trade-offs in real time. And not only product behavior: you can also simulate and analyze behavior along the whole lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are connected to the business in real time. Still, PLM will have its system of record needs; however, the real value will come from real-time collaboration.

The traditional PLM consultant should transform into a business consultant who understands technology. Historically it was the other way around, which created friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single-digit, whereas new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post, the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

An interesting read is this article from Jan Bosch that I came across this morning: Resistance to Change. Read the article, as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in Systems Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think: should we leave the comfortable term PLM behind us for a new frame?

Yes, it is not a typo. Clayton Christensen's famous book, published in 1997, discussed the Innovator's Dilemma: when new technologies cause great firms to fail. This was the challenge two decades ago. Existing, prominent companies could become obsolete quickly as they were bypassed by new technologies.

The examples are well known. To mention a few: DEC (Digital Equipment Corporation), Kodak, and Nokia.

Why the innovation dilemma?

This decade, the challenge has become different. All companies are forced to become more sustainable within the next ten years, either pushed by global regulations or by their customers' demands. Besides the priority of reducing greenhouse gas emissions, there is also the need to transform our society from a linear, continuous-growth economy into a circular, doughnut economy.

The circular economy makes the creation, usage and reuse of our products more complex, as the challenge is to reduce the need for raw materials and to avoid landfill.

The circular economy concept – the regular product lifecycle in the middle

The doughnut economy makes the values of an economy more complex: it is not only about money and growth; human and environmental factors must also be considered.

Doughnut Economics: Trying to stay within the green boundaries

To manage this complexity, I wrote SYSTEMS THINKING – a must-have skill in the 21st century, focusing on the logical part of the brain. In my follow-up post, Systems Thinking: a second thought, I looked at the human challenge. Our brain is not rational and wants to think fast to solve direct threats. Therefore, we have to overcome our old brains to make progress.

An interesting and thought-provoking perspective was shared by Nina Dar in this discussion, together with the video below. The 17 Sustainable Development Goals (SDGs) describe what needs to be done. However, we also need the Inner Development Goals (IDGs) and the human side to connect. Watch the movie:

Our society needs to change and innovate; however, we seem unable to do so. That is the Innovation Dilemma.

The future is data-driven and digital.

What is clear to me is that companies developing products and services have only one way to move forward: becoming data-driven and digital.

Why data-driven and digital?

Let's look at something companies might already practice: REACH (Registration, Evaluation, Authorization and Restriction of Chemicals). This European regulation, which entered into force in 2007, aims to protect human health and the environment by communicating information on chemicals up and down the supply chain. This ensures that manufacturers, importers, and their customers are aware of information relating to the health and safety of the products supplied.

The regulation still suffers in execution, as most of the reporting and evaluation of chemicals is done manually. Suppliers report their chemicals in documents, and companies report the total of chemicals in their summary reports. Then, finally, authorities have to go through these reports.

Although the scale of REACH is limited, the manual effort for end-to-end reporting is relatively high. In addition, skilled workers are needed to do the job because reporting is done in a document-based manner.
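As a thought experiment only, and with invented substances, parts and thresholds, the sketch below shows how a REACH-style check could become a simple query once supplier declarations are available as data instead of documents.

```python
# Sketch only: a data-driven REACH-style check, once supplier substance
# declarations exist as data instead of documents. Substances, thresholds
# and part numbers are invented for illustration.

RESTRICTED = {"lead": 0.1, "cadmium": 0.01}   # max weight-% per substance (hypothetical)

declarations = [                              # received from suppliers as data
    {"part": "P-1001", "substance": "lead",    "weight_pct": 0.04},
    {"part": "P-1002", "substance": "cadmium", "weight_pct": 0.02},
    {"part": "P-1003", "substance": "zinc",    "weight_pct": 1.50},
]

def check_restricted(decls, restricted):
    """Return every declaration that exceeds a restricted-substance threshold."""
    return [d for d in decls
            if d["substance"] in restricted
            and d["weight_pct"] > restricted[d["substance"]]]

for violation in check_restricted(declarations, RESTRICTED):
    print(f"Part {violation['part']}: {violation['substance']} "
          f"{violation['weight_pct']}% exceeds {RESTRICTED[violation['substance']]}%")
```

The query itself is trivial; the work is in getting the declarations into the infrastructure as connected data in the first place.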

Life Cycle Assessments (LCA)

While you might think REACH is relatively simple, the real new challenge for companies is the need to perform Life Cycle Assessments (LCA) for their products. The Wikipedia definition of LCA says:

Life cycle assessment or LCA (also known as life cycle analysis) is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product’s manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave)

This will be a shift in the way companies need to define products. Much more thinking and analysis are required in the early design phases. Before committing to a physical solution, engineers and manufacturing engineers need to simulate and calculate the impact of their design decisions in the virtual world.

This is where the digital twin of the design and the digital twin of the manufacturing process becomes relevant. And remember: Digital Twins do not run on documents – you need connected data and various types of models to calculate and estimate the environmental impact.

LCA done in a document-based manner will make your company too slow and expensive.
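To illustrate why connected data matters here, the sketch below shows a deliberately oversimplified cradle-to-gate CO2e roll-up over a small BOM. The masses and emission factors are invented; a real LCA relies on curated datasets and far more lifecycle stages.

```python
# Sketch: a simplified cradle-to-gate CO2e roll-up over a BOM.
# Emission factors and quantities are invented; a real assessment uses
# curated material/process databases and covers the full lifecycle.

EMISSION_FACTORS = {"steel": 1.9, "aluminium": 8.2, "abs_plastic": 3.1}  # kg CO2e per kg (illustrative)

bom = [
    {"part": "housing", "material": "aluminium",   "mass_kg": 1.20},
    {"part": "bracket", "material": "steel",       "mass_kg": 0.40},
    {"part": "cover",   "material": "abs_plastic", "mass_kg": 0.15},
]

def cradle_to_gate_co2e(bom_lines, factors):
    """Sum material-related CO2e over all BOM lines."""
    return sum(line["mass_kg"] * factors[line["material"]] for line in bom_lines)

print(f"Estimated cradle-to-gate footprint: "
      f"{cradle_to_gate_co2e(bom, EMISSION_FACTORS):.2f} kg CO2e")
```

The arithmetic is trivial; the real effort is getting trustworthy mass, material and process data connected to every BOM line, which is precisely what a document-based approach cannot deliver at scale.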

I described this needed transformation in my series from last year: The road to model-based and connected PLM – nine posts exploring the technology and concept of a model-based, data-driven PLM infrastructure.

Digital Product Passport (DPP)

The European Commission has published an action plan for the circular economy, one of the most important building blocks of the European Green Deal. One of the defined measures is the gradual introduction of a Digital Product Passport (DPP). As the quality of an LCA depends on reliable and trustworthy information about products and materials, the DPP aims to ensure that circular economy metrics become reliable.

This will be a long journey. If you want to catch a glimpse of the complexity, read this Medium article: The digital product passport and its technical implementation related to the DPP for batteries.
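Without claiming anything about the final specification, you can think of a DPP as a structured, machine-readable record that travels with the product. The sketch below uses invented fields purely to illustrate the idea.

```python
# Sketch of the kind of data a Digital Product Passport could carry.
# The real DPP data model is still being defined at EU level; every field
# below is an assumption for illustration, not the actual specification.

from dataclasses import dataclass, field, asdict
from typing import List, Optional
import json

@dataclass
class MaterialShare:
    material: str
    share_pct: float        # share of total product mass
    recycled_pct: float     # recycled content within that material

@dataclass
class DigitalProductPassport:
    product_id: str
    manufacturer: str
    materials: List[MaterialShare] = field(default_factory=list)
    carbon_footprint_kg_co2e: Optional[float] = None
    repair_instructions_url: Optional[str] = None
    end_of_life_instructions: Optional[str] = None

dpp = DigitalProductPassport(
    product_id="BAT-2024-000123",
    manufacturer="ExampleCell GmbH",
    materials=[MaterialShare("lithium", 4.5, 0.0),
               MaterialShare("aluminium", 22.0, 35.0)],
    carbon_footprint_kg_co2e=61.3,
    repair_instructions_url="https://example.com/bat-2024/repair",
)

print(json.dumps(asdict(dpp), indent=2))  # machine-readable, shareable record
```

Whatever the final format becomes, such a record only stays trustworthy if it is generated from connected product data, not assembled manually from documents.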

The innovation dilemma

Suppose you agree with my conclusion that companies need to change their current product or service development to a data-driven and model-based approach. In that case, the question will come up: where to start?

Becoming data-driven and model-based is, of course, not the business driver. However, this change is needed to be able to perform Life Cycle Assessments and comply with current and future regulations while remaining competitive.

A document-driven approach is a dead-end.

Now let’s look at the real dilemmas by comparing a startup (clean sheet / no legacy) and an existing enterprise (experience with the past/legacy). Is there a winning approach?

The Startup

Having lived in Israel – the nation where almost everyone seems to be in a startup – and having worked with startups during the past 10 years, I am always inspired by the energy of the people in these companies. Most of the time, they have a unique value proposition, and they want to be visible in the market as soon as possible.

This approach is the opposite of systems thinking. It is often a very linear process to deliver this value proposition without exploring the side effects of such an approach.

For example, take the new "green" transportation hype. Many cities have now been flooded with "green" scooters and electric bikes to promote transportation as a service. The idea behind this concept is that citizens no longer need to own polluting motorbikes or cars, and means of transportation will be shared. Therefore, the city will be cleaner and greener.

However, these "green" vehicles are often designed in the traditional linear way. Is there a repair plan or a plan to recycle the batteries? Is there a plan to reuse the materials? Most of the time, not. Please, if you have examples contradicting my observations, let me know. I would like to hear good news.

When startup companies start to scale, they need experts to help them grow the company. Often these experts are seasoned people, perhaps close to retirement. They will share their experience and what they know best from the past:  traditional linear thinking.

As a result, even though startup companies can start with a clean sheet, their focus on delivering the product or service blocks further thinking. Instead, the seasoned experts will drive the company towards ways of working they know from the past.

Out of curiosity: Do you know or work in a startup that has started with a data-driven and model-based vision from scratch?  Please add the name of this company in the comments, and let’s learn how they did it.

The Existing company

Working in an established company is like being on board a big tanker. Changing its direction takes a clear eye on the target and the navigation skills to get there. Unfortunately, most of the time, these changes take years, as it is impossible to switch the PLM infrastructure and the people's skills within a short time.

From the bimodal approach in 2015 to the hybrid approach for companies, inspired by the 2017 McKinsey article Toward an integrated technology operating model, I have come to see this as probably the best approach to ensure a change will happen. In this approach – see the image – the organization keeps running on its document-driven PLM infrastructure. This infrastructure becomes the system of record, nothing different from what PLM currently is in most companies.

In parallel, you have to start with small groups of people who focus on a new product or a new service. Using a model-based, data-driven approach, they work completely independently from the big enterprise. Their environment can be considered the future system of engagement.

The data-driven approach allows all disciplines to work in a connected, real-time manner. Mastering the new ways of working is usually the task of younger employees who are digital natives. These teams can be complemented by experienced workers who act as coaches. However, the coaches will not work in the new environment; they bring business knowledge to the team.

People cannot work in two modes, but organizations can. As you can see from the McKinsey chart, the digital teams will get bigger and more important for the core business over time. In parallel, as their data usage grows, more and more data integration will occur between the two operating modes. Therefore, the old PLM infrastructure can remain the System of Record and serve as a supporting backbone for the new systems of engagement.
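A minimal sketch of that growing integration, with function and field names that are mine and not any vendor's API: released results from the data-driven system of engagement are periodically frozen into the document-driven system of record as baselines.

```python
# Sketch: one-way hand-over from a data-driven "system of engagement"
# to the legacy "system of record". All names and structures are
# hypothetical; real integrations go through the vendors' APIs.

from datetime import date

engagement_items = [
    {"id": "ITM-001", "state": "released", "payload": {"mass_kg": 1.2}},
    {"id": "ITM-002", "state": "in_work",  "payload": {"mass_kg": 0.7}},
    {"id": "ITM-003", "state": "released", "payload": {"mass_kg": 3.4}},
]

system_of_record = []   # stands in for the legacy PLM backbone

def baseline_released(items, record_store):
    """Freeze only released items into the system of record as dated baselines."""
    for item in items:
        if item["state"] == "released":
            record_store.append({
                "id": item["id"],
                "baseline_date": date.today().isoformat(),
                "snapshot": dict(item["payload"]),   # frozen copy, no live link
            })

baseline_released(engagement_items, system_of_record)
print(f"{len(system_of_record)} items baselined into the system of record")
```

The design choice matters: the system of record only receives frozen snapshots, so the legacy backbone is never forced to follow the real-time pace of the engagement environment.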

The Innovation Dilemma conclusion

The upcoming ten years will push organizations to innovate their ways of working to become sustainable and competitive. As discussed before, they must learn to work in a data-driven, connected manner. Both startups and existing enterprises have challenges – they need to overcome the “thinking fast and acting slow” mindset. Do you see the change in your company?

 

Note: Before publishing this post, I read this interesting and complementary post from Jan Bosch: Boost your digitalization: instrumentation.

It is in the air – grab it.

 

In the past four weeks, I have been writing about the various aspects related to PLM Education. First, starting from my bookshelf, zooming in on the strategic angle with CIMdata (Part 1).

Next, I was looking at the educational angle and motivational angle with Share PLM (Part 2).

And the last time,  I explored with John Stark the more academic view of PLM education. How do you – students and others – learn and explore the full context of PLM (Part 3)?

Now I am talking with Dave Slawson from Quick Release_ , exploring their onboarding and educational program as a consultancy firm.

How do they ensure their consultants bring added value to PLM-related activities, and can we learn something from that for our own practices?

Quick Release

Dave, can you tell us something more about Quick Release, further abbreviated to QR, and your role in the organization?

Quick Release is a specialist PDM and PLM consultancy working primarily in the automotive sector in Europe, North America, and Australia. In this sector, robust data management and clear reporting of complex subjects are essential.

Our sole focus is connecting the data silos within our clients' organizations, reducing program or build delays through effective change management.

Quick Release promise – PDT 2019

I am QR’s head of Learning and Development, and I’ve been with the company since late 2014.

I’ve always had a passion for developing people and giving them a platform to push themselves to realize their potential. QR wants to build talent from within instead of just hiring experienced people.

However, with our rapid growth, it became necessary to have dedicated full-time resources to speed up the onboarding and upskilling of our employees, combined with an ongoing development strategy and its execution.

QR's Learning & Development approach

Let’s focus on Learning & Development internally at QR first. What type of effort and time does it take to onboard a new employee, and what is their learning program?

We have a six-month onboarding program for new employees. Most starters join one of our "boot camps", a three-week intensive program in which a cohort of 6 to 14 new starters receives classroom-style sessions led by our subject matter experts.

During this, new starters learn about technical PDM and PLM as well as high-performance business skills that will help them deliver excellence for our clients and feel confident in their work.

Quick Release BoB track process – click to enlarge

While the teams spend a lot of time with the program coordinator, we also bring in our various Subject Matter Experts (SMEs) to ensure the highest quality and variety in these sessions. Some of these sessions are delivered by our founders and directors.

As a business, we believe in investing senior leadership time to ensure quality training and give our team members access to the highest levels of the company.

Since the Covid-19 pandemic started, we moved our training program to primarily distance learning. However, some sessions are in person, with new starters attending workshops in our regional offices. Our sessions focus on engagement and "doing" instead of just watching a presentation. New starters have given feedback that the sessions are still just as enjoyable via distance learning.

Following boot camp, team members will start work on their client projects, supported by a Project Manager and a mentor. During this period, their mentor will help them use the on-the-job experience to build up their technical knowledge on top of their boot camp learning. The mentor is also there to help them cope with what we know is a steep learning curve. Towards the end of the six-month program, each new starter carries out a self-evaluation designed to help them recognize their achievements to date and identify areas of focus for ongoing personal development.

We gather feedback from the trainers and trainees throughout the onboarding programs. The trainers' feedback is shared with the trainees' mentors to help with coaching, while the trainees' feedback is used to help us continuously improve our offering. Our trainers are subject matter experts, but we encourage them to evolve their content and approach based on feedback.

 

The learning journey

Some might say you only learn on the job – how do you relate to this statement? Where does QR education take place? Can you make a statement on ROI for Learning & Development?

It is important to always be curious about your work. We encourage our team members to challenge themselves to learn new things and dig deeper. Indeed, constant curiosity is one of our core values. We encourage people to challenge the status quo, challenge themselves, and adopt a growth mindset through all development and feedback cycles.

The learning curve in PDM and PLM can be steep; therefore, we must give people the tools and feedback that they can use to grow. At QR, this starts with our onboarding program and flows into an employee’s full career with us. In addition, at the end of every quarter, team members receive performance feedback from their managers, which feeds into their development target setting.

We have a wealth of internal resources to support development, from structured training materials to our internally compiled PDM Wiki and our suite of development “playbooks” (curated learning journeys catering to a range of learning styles).

On-the-job learning is critically important. So after the boot camp, we put our team members straight into projects to make sure they apply and build on their baseline knowledge through real-world experience. Still, they are supported with formal training and ongoing access to development resources.

Regarding Return on Investment, while it is impossible to give a specific number, we would say that quality training is invaluable to our clients and to us. In seven years, the company has grown from 60 to 300 employees. In addition, it now operates on three other continents, illustrating that our clients trust the quality of how we train our consultants!

We also carried out internal studies on the long-term retention of team members relative to onboarding quality. These studies show that team members who experience a more controlled and structured onboarding program are generally more successful in their roles.

Investing in education?

I understood that some of your customers also want to understand PLM processes better and ask for education from your side. Would the investment in education be similar? Would they be able to afford such an effort?

Making a long-term and tangible impact for our clients is the core foundation of what QR are trying to achieve. We do not want to come in to resolve a problem, only for it to resurface once we’ve left. Nor do we want to do work that our clients could easily hire someone to do themselves.

Therefore, the idea of delivering a version of our training and onboarding program to clients is very attractive to us. We offer clients a shortened version of our boot camp, focused on technical PDM, PLM and complexity management, without the consultancy skills.

This is combined with an ongoing support program that transitions the responsibilities within the client team away from our consultants towards the client’s own staff.

We would look to run that program over approximately six months so that the client can be confident that their staff has reached the required level of technical expertise. There would be an upfront cost to the client to manage this.

However, the program is designed to support quality skills development within their organization.

 

PLM and Digital Transformation?

Education and digital transformation is a question I always ask. Although QR is already established in the digital era, your customers are not. What specific parts of digital transformation are you teaching your employees and customers?

The most inefficient thing we see in the PDM space is the reliance on offline, “analog” data and the inability to establish one source of truth across a complex organization. To support business efficiency through digital transformation, we promote a few simple core tenets in everything we do:

  • Establish a data owner who not only holds the single reference point but also is responsible for its quality
  • Right-view reporting – clearly communicate exactly what people need to know, recognizing that different stakeholders need to know different things and that no one has time to waste (see the small sketch after this list)
  • Clear communications – using the right channels of communication to get the job done faster (including more informal channels such as instant messaging or collaborative online working documents)
  • Smart, data-led decision making – reviewing processes using accurate data that is analyzed thoroughly, and justifying recommendations based on a range of evidence
  • Getting your hands dirty! – Digital Transformation is not just a “systems” subject but relies on people and human interaction. So we encourage all of our consultants to actually understand how teams work. Not be afraid to roll up their sleeves and get stuck in instead of just analyzing from the outside!
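To visualize the first two tenets, data ownership and right-view reporting, here is a small sketch (roles, fields and values are invented): one owned source of truth, with each stakeholder view derived from it rather than maintained separately.

```python
# Sketch of "right-view reporting": one owned source of truth, different
# stakeholder views derived from it. Roles, fields and data are invented.

parts = [  # single reference point, with an explicit data owner per record
    {"id": "P-1", "owner": "j.smith", "cost": 12.5, "weight_kg": 0.4, "status": "released"},
    {"id": "P-2", "owner": "j.smith", "cost": 48.0, "weight_kg": 2.1, "status": "in_work"},
]

VIEWS = {                      # each stakeholder sees only what they need
    "finance":     ["id", "cost", "status"],
    "engineering": ["id", "weight_kg", "status"],
}

def right_view(data, role):
    """Project the single source of truth onto the fields a role needs."""
    fields = VIEWS[role]
    return [{key: row[key] for key in fields} for row in data]

print(right_view(parts, "finance"))
print(right_view(parts, "engineering"))
```

Because every view is derived from the same owned data, there is no second copy to drift out of sync; that is the whole point of the single reference point.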

Want to learn more?

Dave, Could you point us to relevant Learning & Development programs and resources that are valuable for the readers of this blog?

If you are interested in learning within the PDM and PLM space, follow Quick Release on LinkedIn as we publish thought leadership articles designed to support industry development.

For those interested in Learning & Development strategy, there is lots of UK and Ireland guidance available from the Chartered Institute of Personnel and Development (CIPD). Similar organizations exist in other countries, such as the Society for Human Resource Management (SHRM) in the USA, which are great resources for building Learning & Development-specific skills.

In my research, I often find really thought-provoking articles published by Forbes and Harvard Business Review that shape my approach and thinking regarding Learning & Development, HR and business.

 

What I learned

When I first discovered Quick Release as a company during one of the PLM Roadmap & PDT conferences (see "The weekend after PLM Roadmap & PDT 2019"), I was impressed by their young and energetic approach, combined with being pragmatic and focused on making the data "flow". Their customers were often traditional automotive companies with the challenge of breaking down the silos. You could say QR was working on what I would call the "connected" enterprise.

PLM consultancy must change

Besides their pragmatic approach, I discovered through interactions with QR that they are the kind of management consultancy firm you would expect in the future. As everything is going to move faster, hands-on experience counts. Instead of remaining conceptual and strategic, they are not afraid to get their feet in the mud.

This requires a new type of consultant and training, as employees need to be able to connect with the specialists at their customers and also communicate with management. These people are hard to find, as this is the ideal profile of a future employee.

The broad profile

What I learned from Dave is that QR invests seriously in meaningful education and coaching programs for their employees – to give them a purpose and an environment where they feel valued. I imagine this actually applies to every company of the future; therefore, I am curious whether you could share your experiences from the field, either through the comments to this post or by contacting me personally.

Conclusion

We have now seen four dimensions of PLM education, and I hope they gave you insights into what is possible. For each of the companies I interviewed, there might be others with similar skills. What is important is to realize that the domain of PLM needs these four dimensions. In my next (short) post, I will provide a summary of what I learned and what I believe is the PLM education of the future. Stay connected!

And a bonus you might have seen before – the digital plumber:
