
After the first episode of “The PLM Doctor is IN“, this time a question from Helena Gutierrez. Helena is one of the founders of SharePLM, a young and dynamic company focusing on providing education services based on your company’s needs, instead of leaving it to function-feature training.

I might come back to this topic later this year in the context of PLM and complementary domains/services.

Now sit back and enjoy.

Note: Due to a technical mistake, Helena’s facial expressions might give you a “CNN-like” impression, as the recording of her doctor visit was too short to cover the full response.

PLM and Startups – is this a good match?

 

Relevant links discussed in this video

Marc Halpern (Gartner): The PLM maturity table

VirtualDutchman: Digital PLM requires a Model-Based Enterprise

 

Conclusion

I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.
The main rule: a single open question related to PLM that is puzzling you.

Last week I shared my plans for 2021 related to my blog, virtualdutchman.com. Those of you who follow my blog might have noticed my posts are never short as I try to discuss or explain a topic from various aspects. This sometimes requires additional research from my side. The findings will provide benefits for all of us. We keep on learning.

At the end of the post, I asked you to participate in a survey to provide feedback on the proposed topics. So far, only one percent of my readers have responded to this short survey. The last time I shared a short survey in 2018, the response was much more significant.

Perhaps you are tired of the many surveys; perhaps you did not make it to the end. Please make an effort this time. Here is, one more time, the survey.

The results so far

To understand the topics below, please make sure you have read the previous blog post, which provides the context for each paragraph.

PLM understanding

Of the PLM-related topics I proposed, Product Configuration Management, Supplier Collaboration Management, and Digital Twin Management got the most traction. I have started preparing for them, combined with a few newly suggested topics that I will further explore. You can click on the images below to read the details.

PLM Deep dive

From the suggested topics for a PLM deep-dive, it is interesting to see most respondents want to learn more about Product Portfolio Management and Systems Engineering within PLM. Traditional topics like Enterprise/Engineering Change Management, BOM Management, or PLM implementation methodologies have been considered less relevant.

The PLM Doctor is in

Several questions came in for the “PLM Doctor,” and I started planning the first episodes. The formula: a single question and an answer through a video recording – max. 2 – 3 minutes. Suitable for fast consumers of information.

PLM and Sustainability

Here we can see that the majority is observing what is happening. Only a few people reported an interest in sustainability, and, probably not disconnected from that, they work for a company that takes sustainability seriously.

 

 

PLM and digitization

When discussing PLM’s digitization, I believe one of the fundamental changes that we need to implement (and learn to master) is a more Model-Based approach for each phase of the product life cycle. Also, most respondents have a notion of what model-based means and want to apply these practices to engineering and manufacturing.

 

Your feedback

I think you have all heard the statement about Lies and Statistics before. Especially with social media, there are billions of people digging for statistics to support their theories. Don’t worry about my situation; I would like to base my statements on somewhat larger numbers, so please take the survey here if you haven’t done so.

 

Conclusion

I am curious about your detailed inputs, and the next blog post will be the first of the 2021 series.


I am still digesting all the content of the latest PLM Roadmap / PDT Fall 2020 conference and the new reality that is starting to appear due to COVID-19. There is one common theme:

The importance of a resilient and digital supply chain.

Most PLM implementations focus on aligning disciplines internally; the supply chain’s involvement has always been the next step. Perhaps now it is time to make it the first step? Let’s analyze.

No Time to Market improvement due to disconnected supply chains?

During the virtual fireside chat at the PLM Roadmap/PDT conference, Marc Halpern shared an interesting observation, almost as a small bonus. You can read the full story here – the quote:

Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy of product data and product development. They did not see so much a reduced time to market or reduced product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor the number of changes.

Of course, he spoke about fast-moving industries where the interaction was done in a disconnected manner. Gartner believes that the cloud would, for sure, start creating these benefits of a reduced time to market and cost of change when the supply chain is connected.

Therefore I want to point again to an old McKinsey article named The case for Digital Reinvention, published in February 2017. Here the authors looked at the various areas of investment in digital technologies and their ROI.  See the image on the left for the areas investigated and the percentage of companies that invested in these areas at that time.

In the article, you will see the ROI analysis for these areas. For example, the marketing and distribution investments did not necessarily have a positive ROI when disconnected from other improvement areas. Digital supply chains were mentioned as the area with potentially the highest ROI. However, another important message in the article for all these areas is: you need a complete digitization strategy. This is a point I fail to see in many companies. Often one area gets all the attention; however, as it remains disconnected from the rest, the real efficiencies are not there. The McKinsey article ends with the conclusion that the digital winners at that time were the ones with bold strategies:

we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more and more broadly and boldly than other companies do

The “connected” supply chain

Image: A&D Action Group – Global Collaboration

Of course, the traditional industries that invented PLM have invested in a kind of connected supply chain. However, is it really a connected supply chain? Aerospace and Defense companies had their supplier portals.

A supplier had to download their information or upload their designs combined with additional metadata.

These portals were completely bespoke and required “backbreaking” manual work on both sides to create, deliver, and validate the required exchange packages. The OEMs were driving the exchange process. By this custom approach, they more or less made it difficult for suppliers to have their own PLM environment. The downside of this approach was that the supplier had a separate environment for each OEM.

In 2006 I worked with SmarTeam on the concept of the “Supply Chain Express,” an offering that allowed a supplier to have their own environment, using SmarTeam as a PDM/PLM system and the Supply Chain Express package to create intelligent import and export packages. The content was all based on files and configurable metadata, depending on the OEM-supplier relation.

Some other PLM-vendors or implementers have built similar exchange solutions to connect the world of the OEM and the supplier.

The main characteristic was that it was file-based with custom metadata, often in an XML format or otherwise using Excel as the metadata carrier.
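
To make this pattern tangible, below is a minimal, purely illustrative sketch of such a file-based exchange package: a few deliverable files described by an XML manifest, parsed with Python’s standard library. The element and attribute names are invented for illustration; in practice, every OEM-supplier relation defined its own metadata schema.

```python
# Illustrative only: a tiny exchange-package manifest in the file-plus-metadata
# style described above. Element and attribute names are hypothetical.
import xml.etree.ElementTree as ET

manifest_xml = """
<ExchangePackage oem="OEM-A" supplier="Supplier-B" issued="2006-06-01">
  <Part number="PN-123456" revision="C">
    <File name="PN-123456_C.stp" type="3D-Model"/>
    <File name="PN-123456_C.pdf" type="Drawing"/>
    <Attribute name="Material" value="Al 7075-T6"/>
  </Part>
</ExchangePackage>
"""

root = ET.fromstring(manifest_xml)
for part in root.findall("Part"):
    print("Part", part.get("number"), "rev", part.get("revision"))
    for deliverable in part.findall("File"):
        print("  file:", deliverable.get("name"), "-", deliverable.get("type"))
```

The point is not the XML itself, but that both sides had to agree on this structure bilaterally – coordinated, not connected.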

In my terminology of Coordinated – Connected, this would be Coordinated and “old school.”

 

The “better connected” supply chain

As I mentioned in my previous post about the PLM Roadmap/PDT Fall conference,  Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As part of the activities, they classified the collaboration between the OEM and the supplier in 3 levels, as you can see from the image:

This post mainly focuses on the L1 collaboration as this is probably the most used scenario.

In the Aerospace and Automotive industries, the data exchange between OEM and suppliers has improved in two ways by using Technical Data Packages whose content is supported by Model-Based Definition.

The first advantage of Model-Based Definition is mainly a consistent information package where the model is leading. The manufacturing views are explicitly defined on the 3D Model. Therefore, there is a reduced chance of a mismatch between the “drawings” and the 3D Model.

Model-Based Definition still does not solve working with the latest (approved) version of the information. This remains a “human-based” process in this case, and Katheryn Bell confirmed this was the biggest problem to solve.

The second advantage of using one of the interoperability standards for Model-Based Definition is that it decouples application-specific data on the OEM side from the supplier side.

A significant advantage of Model-Based Definition is that there are a few interoperability standards, e.g., ISO 10303 (STEP), ISO 14306 (JT), and ISO 32000/ISO 14739 (PRC for 3D PDF). In the end, the ideal would be that these standards merge into one standard, completely vendor-independent and with a clearly defined scope and purpose.

The benefit of these standards is also that they increase the longevity of product data, as the information is stored in an application-independent format. As long as the standard does not change (fast), storing data even internally in these neutral formats can save upgrade or maintenance costs.
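
As a small, hedged illustration of why these neutral formats help longevity: a STEP file (ISO 10303-21) declares in its header which application protocol it follows, so even decades later any tool, or a few lines of script, can tell what it is dealing with. The sketch below only peeks at that header; the schema string shown in the comment is one commonly seen for AP242, and exact names vary per protocol edition.

```python
# Minimal sketch: read the FILE_SCHEMA entry from a STEP (ISO 10303-21) header.
# This is not a STEP parser - just an illustration of the self-describing,
# application-independent nature of the format.
import re

def step_schema(path):
    """Return the schema name declared in a STEP Part 21 header, if present."""
    with open(path, "r", errors="ignore") as fh:
        header = fh.read(4096)  # the HEADER section sits at the top of the file
    match = re.search(r"FILE_SCHEMA\s*\(\s*\(\s*'([^']+)'", header)
    return match.group(1) if match else None

# step_schema("PN-123456_C.stp") might return something like
# "AP242_MANAGED_MODEL_BASED_3D_ENGINEERING_MIM_LF" for an AP242 file.
```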

However, I think you all know the joke below.

 

The connected supply chain

The ultimate goal in the long term will be the connected supply chain, where information shared between an OEM and a supplier does not require human-based interfaces to ensure everyone works with the correct data.

The easiest way, and this is what some of the larger OEMs have done, is to consider suppliers as part of your PLM-infrastructure and give them access to all relevant data in the context of the system, the product, or the part they are responsible for. For the OEM, the challenge will be to connect suppliers – to motivate and train them to work in this environment.

For the supplier, the challenge is their IP management. If they work 100 percent in the OEM environment, everything is exposed. If they want to work in their own environment, there is probably double work and a disconnect.

Of course, everything depends on the complexity of your interaction with the supplier.

With its Fusion Cloud Product Lifecycle Management (PLM), Oracle was one of the first to shift the attention to the connected supply chain.

If you search for PLM on the Oracle website, you will find it under Fusion Supply Chain and Manufacturing. It is a logical step, as traditional ERP vendors have never provided a full, rich portfolio for product design. CAD integrations do not get a focus, and the future path to Model-Based approaches (MBSE / MBD / MBE) is not visible at all.

This is almost similar to what the Siemens-SAP alliance is showing. SAP more or less confirms that you should not rely on SAP PLM for more advanced PLM scenarios but on Siemens’s offering.

For less complex but fast-moving products, for example in the apparel industry, the promise of connecting all suppliers in one environment is time to market and traceability. This industry does not suffer from products with a long lifecycle with upgrades and services.

So far, the best collaboration platform in the cloud I have seen is ShareAspace from Eurostep. Its foundation, based on the PLCS standard, allows an OEM and supplier to connect through their “shared space” – you can look at their supply chain offering here.

Slide: PDT Europe 2016 RENAULT PLM Challenges

In the various PDT conferences, we have seen how even two OEMs could work in a joint environment (Renault-Nissan-Daimler), or how BAE Systems used the ShareAspace environment to collaborate and consolidate all the data coming from the various system suppliers into one standards-based environment.

In 2021, I plan to write a series of blog posts related to possible add-on services for PLM. Supplier collaboration platforms, Configuration Management, End-to-end configurators, Product Information Management, are some of the themes I am currently exploring.

Conclusion

COVID-19 has illustrated the volatility of supply chains. Changing suppliers and working with suppliers in the traditional ways still hinder reducing time to market. However, the promise of a real connected supply chain is enormous. As Boeing demonstrated in my previous post and as explained in this post, standards are needed to become future-proof.

Will 2021 have more focus on the connected supply chain?

 

Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having now digested most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.

This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the Multi-view BOM, Supply Chain Collaboration, and MBSE Data Interoperability.

These sessions were nicely wrapped up with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management, and Jeff Plant (Boeing), sharing Boeing’s Model-Based Engineering strategy.

I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Is there not an easier way to achieve the same results?

Nicely formulated by Ilan Madjar as a comment to my first post:

Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.

This does not mean that these topics should not be studied. If your business is in a small, simple eco-system and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.

However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.

In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.

This process of standardization is crucial if you want to have a sustainable, connected enterprise. In the end, the push from these companies will lead to standards that allow the smaller companies to adhere or connect to.

The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!

Global Collaboration – Defining a baseline for data exchange processes and standards

Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.

Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will misunderstand each other in communication.

This happens even more in smaller businesses that sometimes just pick up (buzz) terms without a full understanding.

Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared that in reality they were attempting configuration management. Here the confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.

The group has been analyzing interoperability standards and standards for long-term archival and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to collaborative business relationship management systems.

In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.

Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.

The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.

You can find the proceedings from the Global Collaboration working group here.

 

Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?

I loved Alberto Ferrari‘s (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.

Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.

Alberto’s presentation was a kind of mind map of how I imagine the future, and it is a pity if you have not had the chance to see his session.

Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implement digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.

Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.

More A&D Action Groups

There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.

Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.

As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving.  I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?

More information about their progress can be found here.

Mark Williams (Boeing) reported from the A&D Model-Based Systems Engineering action group their first findings related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution.  A topic interesting to follow as the promise of MBSE is that it is about connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.

For those interested in their progress, more information can be found here.

Model-Based Engineering @ Boeing

In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.

In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can be manufacturing, service, or technology partners. Therefore, you can imagine, and as discussed by others during the conference, streamlined collaboration and traceability are crucial.

The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.

Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.

I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. A standard certainly to follow as it brings standardization on top of existing standards.

As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.

Jeff presented the slide below, showing their standardization internally around federated platforms.

This slide strongly resembles the future platform vision I have been sharing since 2017 when discussing PLM’s future at PLM conferences and explaining the differences between Coordinated and Connected – see also my presentation here on SlideShare.

You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.

In reality, the flow of information through feedback loops will be there too.

The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics and vendors to invest in and come to industry standards for Model-Based Systems Engineering practices. The time to act in this domain is now.

It reminded me again of Marc Halpern’s message mentioned in my previous post (part 1) that we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.

Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.

Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.

The team is confident that the business case for such an investment is firm and stable; however, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.

The virtual fireside chat

The conference ended with a virtual fireside chat from which I picked up an interesting point that Marc Halpern was bringing in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy and product development. They did not see so much a reduced time to market or cost reduction. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here lead times did not change, nor the number of changes.

Marc believes that this topic will really show benefits in the future with cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2 % of the companies interviewed were investing in a digital supply chain. At the same time, the expected benefits in this area would have the most significant ROI.

The good news: there is consistency, and we know where to focus for early results.

Conclusion

It was a great conference, as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneak preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members of the CIMdata A&D action groups, as they provide the groundwork for all of us – sooner or later.

Last week I was happy to attend the PLM Roadmap / PDT Fall 2020 conference, as usual organized by CIMdata and Eurostep. I wrote before about the recent PI DX conference, which touched mostly on the surface of PLM and Digital Transformation. This conference is really a conference for those who want to understand the building blocks needed for current and future PLM.

This conference usually has approximately 150 attendees on-site; this time, there were over 250 connected users for three (half) days, many of us following every session. As an active participant in the physical events, it was a little disappointing not to be in the same place with the other participants this time. The informal network meetings at this conference have always been special thanks to a relatively small but stable group of experts. Due to the slightly reduced schedule, there was less attention this time for some of the typical PDT topics, most of the time coming from Sweden and related to sustainability.

The conference’s theme, Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality, might sound like a marketing statement. However, the content presented was much more detailed than just marketing info. The fact that you watched the presentations on your screen made it an intense conference with many valuable details.

Have a look at the agenda, and I will walk you through some of the highlights for me. As there was so much content to discuss, I will share this time part 1. Next week, in part 2, you will see the coherence of all the presentations.

As if there was a Coherent Thread.

Digital Twin, It Requires a Digital Thread

The keynote from Peter Bilello, President & CEO of CIMdata, with the title Digital Twin, It Requires a Digital Thread, was immediately an illustration of discussing reality. When I stated at the Digital Twin conference in the Netherlands that “Digital Twins do not run on Documents“, it had the same meaning as when Peter stated, “A Digital Twin without a Digital Thread is an orphan”.

Digital Thread

And Peter’s statement, “All companies do PLM, most of the time however disconnected”, is another way to stimulate companies working in a connected manner.

As usual, Peter’s session was a good overview of the various aspects related to the Digital Thread and Digital Twin.

Digital Twin

The concept of a virtual twin is not new. The focus, as mentioned before, is now more on the term “Connected”. Peter provided the CIMdata definitions for Digital Thread and Digital Twin. Click on the images to the left to read the full definitions.

Peter’s overview also referred to the Boeing Diamond, illustrating the mapping of the physical and virtual world, connected through a Digital Thread, and the various Digital Twins that can exist. The Boeing Diamond was one of the favorites during the conference.

When you look at Peter’s conclusions, there is an alignment with what I wrote in the post A Digital Twin for Everyone and the fact that we need to strive for a connected enterprise. Only then can we benefit from a Digital Twin concept.

 

The Multi-view BOM Solution Evaluation
– Process, Results, and Industry Impacts

The reports coming from the various A&D PLM action groups are always engaging sessions to watch. Here, nine companies, even competitors, discuss and explore PLM themes between themselves supported by CIMdata.

These companies were the first to implement PLM; it is interesting to watch how they move forward like supertankers. They cannot jump from one fashionable hype to another from one year to the next. Their PLM infrastructure needs to be consistent and future-proof due to the longevity of their data and the high standards for regulatory compliance and safety.

However, these companies are also pioneers for the future. They have been practicing Model-Based approaches for over ten years already and are still learning. In next week’s post, you will read that these frontrunners are pushing for standards to make a Model-Based future affordable and achievable.

In that context, the action group Multi-View BOM shared their evaluation results for a study related to the multi-view BOM. A year ago, I wrote about this topic when Fred Feru from Airbus presented the intermediate results at the CIMdata Roadmap/PDT 2019 conference.

Dan Ganser (Gulfstream) and Javier Reines (Airbus) presented the findings. The conclusion was that the four vendors evaluated, i.e., Aras, Dassault Systèmes, PTC and Siemens, all passed the essential requirements and use cases. You can find the report and the findings here: Multi-view Bill of Materials.

One interesting remark.

When the use cases were evaluated, the vendors could score on a level from 0 to 5, see picture. Interesting to see that apparently, it was possible to exceed the requirement, something that seems like a contradiction.

In particular, in this industry, where formal requirements management is a must – either you meet a requirement or not.

Dan Ganser explained that the current use cases were defined based on the minimum expectations; therefore, there was the option to exceed the requirement. I would still be curious to see what it means to exceed the requirement. Is it usability, time, or something innovative we might have missed?

 

5G for Digital Twins & Shadows

I learned a lot from the presentation from Niels Koenig, working at the Fraunhofer Institute for Production Technology. Niels explained how important 5G is for realizing the Industry 4.0 targets. At the 5G Industry Campus, several projects are running to test and demonstrate the value of 5G in relation to manufacturing.

If you want to get an impression of the 5G Industry Campus – click on the YouTube movie.

One of the examples Niels discussed was closed-loop manufacturing. Thanks to the extremely low latency (< 1 ms), a connected NC machine can send real-time measurements to be compared with the expected values. For example, in the case of resonance, the cutting might not be smooth. Thanks to the closed loop, the operator will be able to intervene or adjust the operation. See the image below.
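
To illustrate the principle only (this is a simplified sketch, not the Fraunhofer setup), the closed loop boils down to comparing each streamed measurement with its expected value fast enough to react while the operation is still running; the figures below are invented.

```python
# Simplified sketch of a closed-loop check: flag samples that deviate too much
# from the expected value so the operation can be adjusted in time.
# All numbers are illustrative, not real machine data.
EXPECTED_LOAD = 1.20   # expected spindle load for this cutting operation
TOLERANCE = 0.15       # allowed deviation before intervention

def check_sample(actual_load):
    deviation = abs(actual_load - EXPECTED_LOAD)
    return "ADJUST" if deviation > TOLERANCE else "OK"

# With sub-millisecond latency, this decision can be made per sample while the
# cut is in progress, instead of after the part has already been machined.
for sample in (1.18, 1.22, 1.45, 1.19):
    print(sample, check_sample(sample))
```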

Digital Thread: Be Careful What you Wish For, It Just Might Come True

I was looking forward to Marc Halpern‘s presentation. Marc often brings a less technical, more business-related viewpoint to the discussion. Over the past ten years, there have been many disruptive events, most recently the COVID pandemic.

Companies are asking themselves how they can remain resilient. Marc shared some of his thoughts on how Digital Twins and Digital Threads can support resilience.

In that context, Gartner saw a trend that their customers are now eagerly looking for solutions related to Digital Twin, Digital Thread, and Model-Based Approaches, combined with the aim to move to the cloud. Related to Digital Thread and Digital Twin, most of Gartner’s clients are looking for traceability and transparency along the product lifecycle. Most Digital Twin initiatives focus on a twin of operational assets, particularly inside the manufacturing facility. This nicely links to Niels Koenig’s session related to 5G.

Marc stated that there seems to be a consensus that a Digital Thread is compelling enough for manufacturers to invest in. In the end, they will have to. However, there are also significant risks involved. Marc illustrated the two extremes; in reality, companies will end up somewhere in the middle, as illustrated later by Jeff Plant from Boeing. The image on the left is a sneak preview for next week.

When discussing the Digital Thread, Marc again referred to it more as a Digital Net, a kind of connected infrastructure for various threads based on the various areas of interest.

I show here a slide from Marc’s presentation at the PDT conference in 2018. It is more an artist’s impression of the same concept discussed during this conference again, the Boeing Diamond.

Related to the risk of implementing a Digital Thread and Digital Twin, Marc showed another artistic interpretation: the two extremes of potential end states of Digital Thread investment. Marc shared the critical risks for both options.

For the Vendor Black Hole, his main points were that if you choose a combined solution, you face diminished negotiating power, higher implementation costs, and the risk that potentially innovative ideas might not be implemented as they are not so relevant for the vendor. They have the power!

As examples of combined solutions, Marc mentioned the recently announced SAP-Siemens partnership, the Rockwell Automation-PTC partnership, the Schneider Electric-Aveva partnership, and the ABB-Dassault Systèmes partnership.

Once you are in the black hole, you cannot escape. Therefore, Marc recommended making sure you do not depend on a few vendors for your Digital Twin infrastructure.

The picture on the left illustrates the critical risks of the Enterprise Architecture “Mess”. It is a topic that I have been following for a long time. Suppose you have a collection of services related to the product lifecycle, like workflow services, 3D modeling services, BOM services, and manufacturing services.

Together they could provide a PLM-infrastructure.

The idea behind this is that, thanks to openness and connectivity, every company can build its own unique enterprise architecture. There is no discussion about standard best practices; you build your company’s own best practices (for the future, or the current situation?).

It is mainly promoted as a kind of bottom-up PLM. If you are missing capabilities, just build them yourselves, using REST services, APIs, and low-code platforms. It seems attractive for smaller enterprises; however, most of the time, only for a short time. I fully concur with Marc’s identified risks here.
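
A hedged sketch of what this bottom-up pattern looks like in practice: one stand-alone service, one easy call. The endpoint and fields are hypothetical, not any vendor’s actual API – which is exactly the point, since keeping all these home-grown calls consistent becomes your own architectural burden.

```python
# Hypothetical example of composing PLM capabilities from individual services.
# The service URL and JSON fields are invented for illustration.
import json
import urllib.request

BOM_SERVICE = "https://bom-service.example.com/api/v1"   # hypothetical endpoint

def get_single_level_bom(part_number):
    """Fetch a single-level BOM from a stand-alone BOM service."""
    url = f"{BOM_SERVICE}/bom/{part_number}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# Each individual call is easy to build; keeping the workflow, CAD, BOM and
# manufacturing services consistent with each other is the hard, hidden part.
```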

As I often illustrated in presentations related to a digital future, you will need a mix of both. Based on your point of focus, you could imagine five major platforms being connected together to cover all aspects of a business. Depending on your company’s business model and products, one of them might be the dominant one. With my PLM-focus, this would be the Product Innovation Platform, where the business is created.

Marc ended with five priorities to enable a long-term Digital Thread success.

  • First of all – set the ground rules for data governance. A topic often mentioned but is your company actively engaging on that already?
  • Next, learn from Model-Based Systems Engineering as a foundation for a Model-Based Enterprise.  A topic often discussed during the previous CIMdata Roadmap / PDT-conference.
  • The change from storing and hiding information in silos towards an infrastructure and mindset of searching for and accessing data, in particular access to the Bill of Materials.

The last point led to two more priorities.

  • The need for an open architecture and standards. We would learn more on this topic on day 3 of the conference.
  • Make sure your digital transformation sticks within the organization by investing and executing on organizational change management.

Conclusion

The words “Digital Thread” and “Digital Twin” are mentioned 18 times in this post and during the conference even more. However, at this conference, they were not hollow marketing terms. They are part of a dictionary for the future, as we will see in next week’s post when discussing some of the remaining presentations.

Closing this time with a point we all agreed upon: “A Digital Twin without a Digital Thread is an orphan”. Next week more!

After the series about “Learning from the past,” it is time to start looking toward the future. I learned from several discussions that I probably work most of the time with advanced companies. I believe this should motivate companies that lag behind to look into the future even more.

If you look into the future for your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin. A company might want to reduce its time to market and improve collaboration between all stakeholders. These objectives can be realized by different ways of working and an IT infrastructure to allow these processes to become digital and connected.

That is the “game”. Coming back to the future of PLM. We do not need a discussion about definitions; I leave this to the academics and vendors. We will see the same applies to the concept of a Digital Twin.

My statement: The digital twin is not new. Everybody can have their own digital twin as long as you interpret the definition differently. Does this sound like the PLM definition?

The definition

I like to follow the Gartner definition:

A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.

As you see, not a narrow definition. Now we will look at the different types of interpretations.

Single-purpose siloed Digital Twins

  1. Simple – data only

One of the most straightforward applications of a digital twin is, for example, my Garmin Connect environment. My device registers performance parameters (speed, cadence, power, heart rate, location) when cycling. Then, after every trip, I can analyze my performance. I can see changes in my overall performance and compare my performance with others in my category (weight, age, sex).

Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.
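
To show how simple this first category really is, here is a small sketch of such a data-only twin: nothing more than measured samples mirrored into software and analyzed afterwards. The numbers and field names are illustrative.

```python
# A data-only "twin": measured samples from the physical ride, analyzed later.
# Values are illustrative.
from dataclasses import dataclass
from statistics import mean

@dataclass
class RideSample:
    speed_kmh: float
    cadence_rpm: float
    power_w: float
    heart_rate_bpm: float

ride = [
    RideSample(31.5, 88, 215, 142),
    RideSample(29.8, 85, 198, 145),
    RideSample(33.1, 90, 230, 150),
]

print("average power:", round(mean(s.power_w for s in ride), 1), "W")
print("average heart rate:", round(mean(s.heart_rate_bpm for s in ride), 1), "bpm")
# Comparing these averages trip after trip is the whole business case here:
# no 3D model, no simulation - just data mirrored from the physical world.
```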

On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel’s presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at the Technical University in Eindhoven, the “smart city” in the Netherlands.

  2. Medium – data and operating models

Many connected devices in the world use the same principle. An airplane engine, an industrial robot, a wind turbine, a medical device, a train carriage – all track their performance based on this connection between physical and virtual, based on some sort of digital connectivity.

The business case here is also monitoring performance, predicting maintenance, and upgrading the product when needed.

This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintaining and overhauling has to be found. Repairs are disruptive and can be extremely costly. A manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
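
A minimal sketch of this “data plus operating model” idea, with invented figures: measured operating hours and load feed a crude wear model, and the predicted remaining life drives the maintain-or-overhaul decision.

```python
# Illustrative only: a crude wear model fed with measured data drives the
# maintenance decision. Real asset models are far more sophisticated.
RATED_LIFE_HOURS = 20_000   # design life under nominal load

def remaining_life(hours_run, avg_load_factor):
    """Running above nominal load (factor > 1.0) consumes life faster."""
    return max(RATED_LIFE_HOURS - hours_run * avg_load_factor, 0.0)

def advice(hours_run, avg_load_factor, hours_to_next_window):
    if remaining_life(hours_run, avg_load_factor) < hours_to_next_window:
        return "plan overhaul at the next maintenance window"
    return "keep monitoring"

# An asset that has run 15,000 hours at 115 % of nominal load, with the next
# planned maintenance window 3,000 operating hours away:
print(advice(15_000, 1.15, 3_000))
```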

  3. Advanced – data and connected 3D model

The digital twin we see the most in marketing videos is a virtual twin, using a 3D representation for understanding and navigation. The 3D representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation might take place.

Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.

The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying a physical airplane – the behavior model behind the simulator should match the real behavior as well as possible. However, as it is a model, it will never be 100 % reality and requires updates when new findings or product changes appear.

A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs). In the physical world, there is no place or time to conduct such training. Here the complexity might be lower. There is a 3D Model; however, serious updates can only be expected after a major maintenance or overhaul activity.

These practices are not new either and are used in places where physical training cannot be done.

More challenging is the Augmented Reality (AR) use case. Here the virtual model, most of the time, a lightweight 3D Model, connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR environment might project actual data from the machine, indicate service points and service procedures.

The positive side of the business case is clear for such an opportunity, ensuring service engineers always work with the right information in a real-time context. The main obstacle to implementing AR, in reality, is the access to data, the presentation of the data and keeping the data in the AR environment matching the reality.

And although there are 3D Models in use, they are, to my knowledge, always created in siloes, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.

Several of the cases mentioned above will be discussed here. The conference’s target is to share real cases concluded by Q & A sessions, crucial for a virtual event.

Connected Virtual Twins along the product lifecycle

So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle, the manufacturing twin, and the development twin. This image from Siemens illustrates the concept:

On slides, they imagine a completely integrated framework, which is the future vision. Let us first zoom in on the individual connected twins.

The digital production twin

This is the area of virtual manufacturing and creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic; DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.

At that time, the business case was based on the fact that defining a manufacturing plant and its processes virtually allows you to optimize the plant before investing in physical assets.

Saving money as there is no costly prototype phase to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and, for many companies, still the current situation).

With the need to be more flexible in manufacturing to address individual customer orders without increasing the overhead of delivering these customer-specific solutions, there is a need for a configurable plant that can produce these individual products (batch size 1).

This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.

This is partly what Industry 4.0 is about. Using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory to a dynamic “smart” factory.

In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner and therefore requires manual rework. This is the area of Model-Based Definition, and I have been writing about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing.

The business case for this type of digital twin, of course, is being able to deliver customer-specific products with extremely competitive speed and reduced cost compared to standard products. It could be your company’s survival strategy. Even though it is hard to predict the future, as we see from COVID-19, it is still crucial to anticipate the future instead of waiting.

The digital development twin

Before a product gets manufactured, there is a product development process. In the past, this was purely mechanical with some electronic components. Nowadays, many companies are actually manufacturing systems, as the software controlling the product plays a significant role. In this context, the model-based systems engineering approach is the upcoming approach to defining and testing a system virtually before committing to the physical world.

Model-Based Systems Engineering can define a single complex product and perform all kinds of analyses on the system even before there is a physical system in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that having a model-based system engineering environment combined with modularity (do not confuse it with model-based) is a solid foundation for dealing with unique custom products. Solutions can be configured and validated against their requirements already during the engineering phase.

The business case for the digital development twin is easy to make. Shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results,  for sure, you need to change your ways of working and the tools you are using. So it won’t be that easy!

For those interested in Industry 4.0 and the Model-Based System Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, a lot of attention to the Digital Twin and Model-Based approaches.

Three digital half-days with hopefully a lot to learn and stay with our feet on the ground. In particular, I am looking forward to Marc Halpern’s keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True

Conclusion

It has been very noisy on the internet related to product features and technologies, probably due to COVID-19 and the resulting disrupted interactions between all of us – vendors, implementers and companies trying to adjust their future. The Digital Twin concept is an excellent framing for a concept that everyone can relate to. Choose your business case and then look for the best matching twin.

During my holiday I have read some interesting books. Some for the beauty of imagination and some to enrich my understanding of the human brain.

Why the human brain? It is the foundation and motto of my company: The Know-How to Know Now.
In 2012 I wrote a post, Our brain blocks PLM acceptance, followed by a post in 2014, PLM is doomed, unless ..., both based on observations and inspired by the following books (a must-read if you are interested in more than just PLM practices and technology):

In 2014, Digital Transformation was not so clear. We talked about disruptors, but disruption happened outside our PLM comfort zone.

Now, six years later, disruption or significant change in the way we develop and deliver solutions to the market has become visible in the majority of companies. To stay competitive or meaningful in a global market with changing customer demands, old ways of working no longer bring in enough revenue to sustain the business. The impact of software as part of the solution has significantly changed the complexity and lifecycle(s) of solutions on the market.

Most of my earlier posts in the past two years are related to these challenges.

What is blocking Model-Based Definition?

This week I had a meeting in the Netherlands with three Dutch peers all interested and involved in Model-Based Definition – either from the coaching point of view or the “victim” point of view.  We compared MBD-challenges with Joe Brouwer’s AID (Associated Information Documents) approach and found a lot of commonalities.

No matter which method you use, it is about specifying unambiguously how a product should be manufactured – this is a skill and craftsmanship, not a technology. We agreed that a model-based approach, where information (PMI) is stored as intelligent data elements in a Technical Data Package (TDP), will be crucial for multidisciplinary usage of a 3D Model and its associated information.

If we were to store the information again as dumb text in a view, it would need human rework, leading to potentially parallel information out of sync and therefore creating communication and quality issues. Unfortunately it was a short meeting; the intention is to follow up on this discussion in the Netherlands with a broader audience. I believe this is what everyone interested in learning and understanding the needs and benefits of a (unavoidable) model-based approach should do: get connected around the table and share/discuss.
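
To make the difference concrete, here is a small sketch (not any particular standard’s schema) of the same requirement once as “dumb” text and once as a structured PMI element that downstream software could consume without a human re-reading a view.

```python
# Illustrative contrast between presentation text and a structured PMI element.
# The data structure is invented, not a specific standard's schema.
from dataclasses import dataclass

note_as_text = "Ø10.00 +0.05/-0.00, applies to hole pattern A"   # needs a human reader

@dataclass
class DiameterTolerance:
    feature_id: str      # reference to the geometry in the 3D model
    nominal_mm: float
    upper_mm: float
    lower_mm: float

pmi = DiameterTolerance(feature_id="HOLE_PATTERN_A",
                        nominal_mm=10.00, upper_mm=0.05, lower_mm=0.00)

# CAM or CMM planning software can generate an inspection step directly from
# `pmi`, whereas `note_as_text` would have to be re-typed and can drift out of sync.
```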

We realized that human beings indeed are often the blocking reason why new ways of working cannot be introduced. Twenty-five years ago we had the discussion about moving from 2D to 3D for design. Now, due to the maturity of the solutions and the education of new engineers, this is no longer an issue. Now we are in the next wave, using the 3D Model as the base for manufacturing definition, and again a new mindset is needed.

There are a few challenges here:

  • MBD is still in progress – standards like AP242 still need enhancements
  • There is a lack of visibility on real reference stories to motivate others.
    (Vendor-driven stories often are too good to be true or too narrow in scope)
  • There is no education for (modern) business processes related to product development and manufacturing. Engineers with new skills are dropped in organizations with traditional processes and silo thinking.

Educate, or our brain will block the future!

The above points need to be addressed, and here the human brain comes into the picture again. Our unconscious, reptile brain is continuously busy spending the least amount of energy possible, as described in Thinking, Fast and Slow. Currently, I am reading The Idiot Brain: What Your Head Is Really Up To by Dean Burnett, another book confirming that our brain is not a logical engine making wise decisions.

And then there is the Dunning-Kruger effect, explaining that the people with the lowest skills often have the most outspoken opinions and are not even aware of this flaw. We see this phenomenon now in particular on social media, where people push their opinions as if they were facts.

So how can we learn new model-based approaches – and here I mean all the model-based aspects I have discussed recently, i.e., Model-Based Systems Engineering, Model-Based Definition / Model-Based Enterprise, and the Digital Twin? We cannot learn it from a book, as we are entering a new era.

First, you might want to understand there is a need for new ways of working related to complex products. If you have time, listen to Xin Guo Zhang’s opening keynote with the title Co-Evolution of Complex Aeronautical Systems & Complex SE. It takes 30 minutes, so force yourself to think slowly and comprehend the message related to the needed paradigm shift for systems engineering towards model-based systems engineering.

Also, we have to believe that model-based is the future. If not, we will find for every issue on our path a reason not to work toward the ultimate goal.

You can see this in the comments of my earlier post on LinkedIn, where Sami Grönstrand writes:

I warmly welcome the initiative to “clean up” these concepts (It is time to clean up our model-based problem) and, above all, await to see live examples of transformations — even partial — coupled with reasonable business value identification.

There are two kinds of amazing places: those you have first to see before you can believe they exist.
And then those kinds that you have to believe in first before you can see them…

And here I think we need to simplify and enhance the Model-Based myth, as according to Yuval Harari in his book Sapiens, the power of the human race came from creating myths to align people, making long-term, forward-looking changes acceptable to our reptile brain. We are designed to believe in myths. Therefore, the need for a Model-Based myth. In my post PLM as a myth? from 2017, I discussed this topic in more detail.

Conclusion

There are so many proof points that our human brain is not as reliable as we think it is. Not knowing about these effects makes it even harder to make progress towards a digital future. This post with all its embedded links can keep your brain active for a few hours. Try it, avoid thinking fast, and avoid assuming you know it all. Your thoughts?

 

Learning & Discussing more?
Still time to register for CIMdata PLM Roadmap and PDT Europe


At this moment we are in the middle of the year. Usually for me a quiet time and a good time to reflect on what has happened so far and to look forward.

Three themes triggered me to write this half-year:

  • The changing roles of (PLM) consultancy
  • The disruptive effect of digital transformation on legacy PLM
  • The Model-driven approaches

A short summary per theme here with links to the original posts for those who haven’t followed the sequence.

The changing roles of (PLM) consultancy

Triggered by Oleg Shilovitsky’s post Why traditional PLM ranking is dead. PLM ranking 2.0, a discussion started related to the changing role of PLM selection and the role of a consultant. Oleg and I agreed that using the word dead in a post title is a way to catch extra attention. And as many people do not read more than the introduction, this is a way to frame ideas (not invented by us; look at your newspaper and social media posts). Please take your time and read this post till the end.

Oleg and I concluded that the traditional PLM status reports provided by consultancy firms are no longer relevant. They focus on the big vendors in a status quo, and most of them are 80 % the same in their core PLM capabilities. The challenge comes in how to select a PLM approach for your company.

Here Oleg and I differ in opinion. I am more looking at PLM from a business transformation point of view, how to improve your business with new ways of working. The role of a consultant is crucial here as the consultant can help to formalize the company’s vision and areas to focus on for PLM. The value of the PLM consultant is to bring experience from other companies instead of inventing new strategies per company. And yes, a consultant should get paid for this added value.

Oleg believes more in the bottom-up approach where new technology will enable users to work differently and empower themselves to improve their business (without calling it PLM). More or less concluding there is no need for a PLM consultant as the users will decide themselves about the value of the selected technology. In the context of Oleg’s position as CEO/Co-founder of OpenBOM, it is a logical statement, fighting for the same budget.

The discussion ended during the PLMx conference in Hamburg, where Oleg and I met with an audience, recorded by MarketKey. You can find the recording Panel Discussion: Digital Transformation and the Future of PLM Consulting here.
Unfortunately, like many discussions, there was no conclusion. My conclusion remains the same – companies need PLM coaching!

The posts related to this topic are:

 

The disruptive effect of digital transformation on legacy PLM

A topic that I have discussed over the past two years is that current PLM is not compatible with a modern, data-driven PLM. Note: data-driven PLM is still “under development”. Where in most companies the definition of the products is stored in documents/files, I believe that in order to manage the complexity of products, hardware and software in the future, there is a need to organize data related to models, not files. See also: From Item-centric to model-centric?

For a company it is extremely difficult to have two approaches in parallel as the first reaction is: “let’s convert the old data to the new environment”.

This approach has proven impossible in most of the engagements I am involved in, and here I introduced the bimodal approach as a way to keep the legacy going (mode 1) and scale up the new environment (mode 2).

A bimodal approach is sometimes acceptable when the PLM software comes from two different vendors. Sometimes this is also called the overlay approach – the old system remains in place and a new overlay is created to connect the legacy PLM system and potentially other systems, like ALM or MBSE environments. For example, some of the success stories of Aras complementing Siemens PLM.
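
To make the overlay idea a bit more concrete, here is a minimal sketch in Python. All class and field names are hypothetical and purely illustrative, not a specific vendor's API: the overlay only stores the relationship between a legacy, document-centric record (mode 1) and a model-based element in the new environment (mode 2), without migrating or duplicating the data.

```python
from dataclasses import dataclass, field

@dataclass
class LegacyItem:                       # mode 1: item with released documents/files
    item_number: str
    revision: str
    documents: list[str]                # e.g. drawings and specs stored as files

@dataclass
class ModelElement:                     # mode 2: data-driven, model-based element
    element_id: str
    model_type: str                     # e.g. "3D", "simulation", "software"
    attributes: dict = field(default_factory=dict)

@dataclass
class OverlayLink:                      # the overlay stores only the relationship;
    legacy: LegacyItem                  # it does not convert or duplicate the data
    model: ModelElement
    link_reason: str

# Example: connect a legacy pump assembly to its behavioral model
pump_item = LegacyItem("P-1001", "C", ["P-1001-C.pdf", "P-1001-spec.xlsx"])
pump_model = ModelElement("SYS-PUMP-01", "simulation", {"max_flow_l_min": 250})
link = OverlayLink(pump_item, pump_model, "behavior model for the digital twin")
print(f"{link.legacy.item_number} rev {link.legacy.revision} -> {link.model.element_id}")
```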

Like the bimodal approach, the overlay approach creates the illusion that in the near future the old legacy PLM will disappear. I partly share that illusion, if you consider the near future a period of 5 to 10+ years, depending on the company’s active products. Faster is not realistic.

And related to bimodal, I now prefer to use the terminology used by McKinsey: our insights/toward an integrated technology operating model in the context of PLM.

The challenge is that PLM vendors are reluctant to support a bimodal approach for their own legacy PLM, as then suddenly this vendor becomes responsible for all connectivity between mode 1 and mode 2 data – every vendor wants to sell only the latest.

I will elaborate on this topic during the PDT Europe conference in Stuttgart on Oct 25th. No posts on this topic this year (yet), as I am discussing, learning and collecting examples from the field. What kept me relatively busy was the next topic:

The Model-driven approaches

I spent most of my blogging time explaining the meaning behind a modern model-driven approach and its three main aspects: Model-Based Systems Engineering, Model-Based Definition and Digital Twins. As some of these aspects are still in the hype phase, it was interesting to see two different opinions popping up. On one side, people claiming the world is still flat (2D), considering model-based approaches just another hype caused by the vendors; apparently there is no need for digital continuity. If you look into the reactions from certain people, you might conclude it is impossible to have a dialogue: throwing opinions is not a discussion.

One of the reasons might be that people reacting strongly have never experienced model-based efforts in their life and just chime in, or they might have a business reason not to agree with model-based approaches, as these do not align with their business. It is like people with a vested interest in the climate change debate – will they vote against that interest once the facts are known? Just my thoughts.

There is also another group, to which I am connected, that is quite active in learning and formalizing model-based approaches, in order to move forward towards a digital enterprise where information is connected and flowing between various models (behavior models, simulation models, software models, 3D models, operational models, etc.). This group of people discusses standards and how to use and enhance them. They discuss and analyze with arguments and share lessons learned. One of the best upcoming events in that context is the combined CIMdata PLM Road Map EMEA and PDT Europe 2018 – look at the agenda following the image link and get involved too, if you really care.

 

And if you are looking in your agenda for a wider, less geeky type of conference, consider the PI PLMx CHICAGO 2018 conference on Nov 5 and 6. The agenda provides a wider range of sessions; however, I am sure you can find people interested in discussing model-based learnings there too, in particular in this context in Stream 2: Supporting the Digital Value Chain.

My related posts to model-based this year were:

Conclusion

I spent a lot of time demystifying some PLM-related themes. The challenge remains, as in the non-PLM world, that it is hard to get educated by blog posts, as you might get over-informed by (vendor-related) posts all surfing somewhere on the hype curve. Do not look at the catchy title – investigate and take the time to understand HOW things will work for you or your company. There are enough people explaining WHAT they do, but HOW it fits in your current organization needs to be solved first. Hence the above three themes.

This is my concluding post related to the various aspects of the model-driven enterprise. We went through Model-Based Systems Engineering (MBSE), where the focus was on using models (functional / logical / physical / simulations) to define complex products (systems). Next we discussed Model-Based Definition / Model-Based Enterprise (MBD/MBE), where the focus was on data continuity between engineering and manufacturing by using the 3D model as the master for design, manufacturing and eventually service information.

And last time we looked at the Digital Twin from its operational side, where the Digital Twin was applied for collecting and tuning physical assets in operation, which in my opinion is not a typical PLM domain.

Now we will focus on two areas where the Digital Twin touches aspects of PLM – in my opinion the most challenging and the most over-hyped areas. These two areas are:

  • The Digital Twin used to virtually define and optimize a new product/system or even a system of systems. For example, defining a new production line.
  • The Digital Twin used as the virtual replica of an asset in operation. For example, a turbine or an engine.

Digital Twin to define a new Product/System

There might be some conceptual overlap if you compare the MBSE approach with the Digital Twin concept used to define a new product or system to deliver. For me, the differentiation is that MBSE is used to master and define a complex system from the R&D point of view: unknown solution concepts (use hardware or software?), unknown constraints to be refined and optimized in an iterative manner.

In the Digital Twin concept, it is more about defining a system that should work in the field: how to combine various systems into a working solution, where each of the systems already has a pre-defined set of behavioral/operational parameters, which could be 3D-related but also performance-related.

You would define and analyze the new solution virtually to discover the ideal solution for performance, costs, feasibility and maintenance. Working in the context of a virtual model might take more time than traditional ways of working; however, once the models are in place, analyzing the solution and optimizing it takes hours instead of weeks, assuming the virtual model is based on a digital thread, not a sequential process of creating and passing documents/files. Virtual solutions allow a company to optimize the solution upfront instead of fixing it expensively during delivery, commissioning and maintenance.

Why aren’t we doing this already? It takes more skilled engineers upfront instead of cheaper fixers downstream. The fact that we are used to fixing things later is also an inhibitor for change. Management needs to trust and understand the economic value, instead of trying to reduce the number of engineers because they are expensive and hard to plan.

In the construction industry, companies are discovering the power of BIM (Building Information Model), introduced to enhance the efficiency and productivity of all stakeholders involved. Massive benefits can be achieved if the construction of the building and its future behavior and maintenance are optimized virtually, instead of being fixed expensively in reality when issues pop up.

The same concept applies to process plants or manufacturing plants, where you could virtually run the (manufacturing) process. If the design is done with all the behavior defined (hardware-in-the-loop and software-in-the-loop simulation), the solution has been virtually tested and can be delivered rapidly, with no late discoveries and costly fixes.
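
As a toy illustration of why "hours instead of weeks" is realistic once the models exist: with a behavioral model of the line you can sweep the design space virtually instead of fixing the line during commissioning. The throughput and cost model below is invented for illustration only.

```python
# Toy sketch: "virtually running" a manufacturing line before building it.
# The throughput and cost models and all numbers are invented for illustration.

def line_throughput(buffer_size: int, robots: int) -> float:
    """Very simplified behavioral model of a production line (parts/hour)."""
    base = 60 * robots                        # each robot nominally adds 60 parts/hour
    blocking_loss = 30 / (1 + buffer_size)    # small buffers cause blocking losses
    return max(base - blocking_loss, 0.0)

def line_cost(buffer_size: int, robots: int) -> float:
    """Simplified investment cost in kEUR."""
    return robots * 120 + buffer_size * 5

# Sweep the design space virtually instead of discovering issues on the shop floor
best = None
for robots in range(1, 5):
    for buffer_size in range(0, 11):
        throughput = line_throughput(buffer_size, robots)
        cost = line_cost(buffer_size, robots)
        score = throughput / cost             # crude "value for money" metric
        if best is None or score > best[0]:
            best = (score, robots, buffer_size, throughput, cost)

_, robots, buffer_size, throughput, cost = best
print(f"Best virtual design: {robots} robots, buffer {buffer_size} "
      f"-> {throughput:.0f} parts/h at {cost} kEUR")
```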

Of course, it requires new ways of working. Working with digitally connected models is not what engineers learn during their education – we have just started this journey. Therefore, organizations should explore on a smaller scale how to create a full Digital Twin based on connected data – this is the ultimate basis for the next purpose.

Digital Twin to match a product/system in the field

When you follow the topic of the Digital Twin through the materials provided by the various software vendors, you see all kinds of previews of what is possible: Augmented Reality, Virtual Reality and more. All these presentations show that by clicking somewhere in a 3D model space, relevant information pops up. Where does this relevant information come from?

Most of the time the information is re-entered in a new environment, sometimes derived from CAD, but all the metadata comes from people collecting and validating data. This is not the type of work we promote for a modern digital enterprise. These inefficiencies are good for learning and demos, but in the end a company cannot afford silos where data is collected and entered again, disconnected from the source.

The main problem: legacy PLM information is stored in documents (drawings/Excel files) and is not intended to be shared downstream with full quality.
Read also: Why PLM is the forgotten domain in digital transformation.

If a company has already implemented an end-to-end Digital Twin to deliver the solution as described in the previous section, we can understand the data has been entered somewhere during the design and delivery process and, thanks to digital continuity, it is there.

How many companies have done this already? For sure not the companies that have been in business for a long time, as their current silos and legacy processes do not cater for digital continuity. By appointing a Chief Digital Officer, the journey might start; the biggest risk is that the Chief Digital Officer will be running yet another silo in the organization.

So where does PLM support the concept of the Digital Twin operating in the field?

For me, the IoT part of the Digital Twin is not the core of PLM. Defining the right sensors, controls and software is the first area where IoT is used to define the measurable/controllable behavior of a Digital Twin. This topic has been discussed in the previous section.

The second part where PLM gets involved is twofold:

  • Processing data from an individual twin
  • Processing data from a collection of similar twins

Processing data from an individual twin

Data collected from an individual twin or a collection of twins can be analyzed to extract or discover failure patterns. An R&D organization is interested in learning what is happening in the field with their products. These analyses lead to better and more competitive solutions.

Predictive maintenance is not necessarily part of that. When you know that certain parts will fail between 10,000 and 20,000 operating hours, you want to optimize the moment of providing service to reduce downtime of the process, and you do not want to replace parts way too early.


The R&D part related to predictive maintenance could be that R&D develops sensors inside this serviceable part that signal the need for maintenance in a much smaller time frame – maintenance needed within 100 hours instead of a bandwidth of 10,000 hours. Or R&D could develop new parts that need less service and guarantee a longer up-time.
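
As a back-of-the-envelope sketch of why the narrower prediction window matters (the numbers mirror the example above and are purely illustrative):

```python
# Back-of-the-envelope sketch: the value of narrowing the failure-prediction window.
# The numbers mirror the example in the text and are illustrative only.

# Without a condition sensor: the part fails somewhere between 10,000 and 20,000
# operating hours, so safe planning means servicing at 10,000 hours.
earliest_failure, latest_failure = 10_000, 20_000
expected_life = (earliest_failure + latest_failure) / 2   # ~15,000 hours on average
wasted_without_sensor = expected_life - earliest_failure  # useful life thrown away

# With an embedded sensor: maintenance is signalled within a 100-hour window,
# so on average only about 100 hours of useful life are lost per part.
wasted_with_sensor = 100

print(f"Average useful life wasted without sensor: {wasted_without_sensor:,.0f} hours")
print(f"Average useful life wasted with sensor:    {wasted_with_sensor:,} hours")
```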

For an R&D department, the information from an individual Digital Twin might only be relevant if the Physical Twin is complex to repair and the downtime for each individual asset is too high. Imagine a jet engine, a turbine in a power plant or similar. Here a Digital Twin allows service and R&D to prepare maintenance and to simulate and optimize the actions before executing them in the physical world.

The five potential platforms of a digital enterprise

The second part R&D will be interested in is the behavior of similar products/systems in the field, combined with their environmental conditions. In this way, R&D can discover improvement points for the whole range and drive incremental innovation. The challenge for the R&D organization is to find a logical placeholder in their PLM environment to collect commonalities related to the individual modules or components. This is not an ERP or MES domain.
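
A minimal sketch of the kind of aggregation meant here: grouping field data from similar twins by environmental condition to spot which modules are improvement candidates. All records and names are invented for illustration.

```python
# Minimal sketch: aggregating field data from a fleet of similar twins
# to find improvement points per module. All records are invented.
from collections import defaultdict
from statistics import mean

# (serial number, climate, module that failed, operating hours at failure)
field_data = [
    ("T-001", "desert",  "cooling_unit", 6_200),
    ("T-002", "desert",  "cooling_unit", 5_900),
    ("T-003", "coastal", "bearing",      14_500),
    ("T-004", "desert",  "bearing",      13_800),
    ("T-005", "coastal", "cooling_unit", 12_100),
]

life_hours = defaultdict(list)
for _, climate, module, operating_hours in field_data:
    life_hours[(climate, module)].append(operating_hours)

# Modules that fail early under a specific condition are candidates for R&D improvement
for (climate, module), values in sorted(life_hours.items(), key=lambda kv: mean(kv[1])):
    print(f"{module:<13} in {climate:<8}: mean life {mean(values):>8,.0f} h "
          f"({len(values)} failures)")
```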

Concepts of a logical product structure are already known in the oil & gas, process and nuclear industries. In 2017 I wrote about PLM for Owners/Operators, mentioning that Bjorn Fidjeland has always been active in this domain; you can find his concepts at plmPartner here or as an eLearning course at SharePLM.

To conclude:

  • This post is way too long (sorry)
  • PLM is not dead – it evolves into one of the crucial platforms for the future – The Product Innovation Platform
  • The current BOM-centric approach within PLM is blocking progress towards a full digital thread

More to come after the holidays (a European habit), with additional topics related to the digital enterprise.

 

This is almost my last planned post related to the concepts of model-based. After having discussed Model-Based Systems Engineering (needed to develop complex products/systems, including hardware and software) and Model-Based Definition (creating an efficient connection between engineering and manufacturing), my last post will be related to the most over-hyped topic: the Digital Twin.

There are several reasons why the Digital Twin is over-hyped. One of the reasons is that the Digital Twin is not necessarily considered a PLM-related topic. Other vendors, like SAP (the network of digital twins), Oracle (Digital Twins for IoT applications) and GE with their Predix platform, also contributed to the hype related to the digital twin. The other reason is that the concept of the Digital Twin is a great idea for marketers to shine above the clouds. A recent comment from Monica Schnitger says it all in her post 5 quick takeaways from Siemens Automation summit. Monica’s takeaway related to the Digital Twin:

The whole digital twin concept is just starting to gain traction with automation users. In many cases, they don’t have a digital representation of the equipment on their lines; they may have some data from the equipment OEM or their automation contractors but it’s inconsistent and probably incomplete. The consensus seemed to be that this is a great idea but out of many attendees’ immediate reach. [But it is important to start down this path: model something critical, gather all the data you can, prove benefit then move on to a bigger project.]

Monica is aiming at the same point I have been making several times: there is no digital representation, and the existing data is inconsistent. Don’t wait: The importance of accurate data – act now!

What is a digital twin?

I think there are various definitions of the digital twin, and I do not want to go into a definition debate like we had before with the acronyms MBD/MBE (Model-Based Definition/Enterprise – the confusion) or even the acronym PLM (classical PLM or digital PLM?). Let’s agree on the following high-level statements:

  • A digital twin is a virtual representation of a physical product
  • The virtual part of the digital twin is defined by what you want to analyze, simulate or predict related to the physical product
  • One physical product can have multiple digital twins; only in the ideal world is there potentially a unique digital twin for every physical product in the world (a minimal data sketch of these statements follows below this list)
  • When a product interacts with its environment, based on inputs and outputs, we normally call it a system. When I use Product, it will most of the time be a System, in particular in the context of a digital twin
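
To make these statements a bit more tangible, here is a minimal, hypothetical data sketch in Python. The class and field names are my own illustration, not any vendor's data model: one physical product can carry several digital twins, each defined by its purpose and the models it uses.

```python
# Minimal data sketch of the statements above: one physical product,
# several digital twins, each defined by what you want to analyze or predict.
# All class and field names are hypothetical and for illustration only.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    purpose: str                        # what you analyze / simulate / predict
    models: list[str]                   # e.g. ["3D", "thermal", "control software"]
    parameters: dict = field(default_factory=dict)

@dataclass
class PhysicalProduct:
    serial_number: str
    twins: list = field(default_factory=list)   # multiple twins per physical product

turbine = PhysicalProduct("TRB-2018-042")
turbine.twins.append(DigitalTwin("performance monitoring", ["behavior"], {"rated_mw": 3.5}))
turbine.twins.append(DigitalTwin("maintenance simulation", ["3D", "kinematics"]))

for twin in turbine.twins:
    print(f"{turbine.serial_number}: twin for '{twin.purpose}' using {twin.models}")
```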

Given the above statements, I will give some examples of digital twin concepts:

As a cyclist I am active on platforms like Garmin and Strava, using a tracking device, a heart monitor and a power meter. During every ride my device plus the sensors measure my performance, and all the data is uploaded to the platform, providing me with a report of where I rode, how fast, and my heartbeat, cadence and power during the ride. On Strava I can see the Flybys (other digital twins that crossed my path and their performances), and I can see per segment how I performed compared to others, filtering by age, by level, etc.

This is the easiest part of a digital twin. Every individual can monitor and analyze their personal behavior and discover trends. Additionally, the platform owner has all the intelligence about all cyclists around the world, how they perform and what would be the best performance per location. And based on their Premium offering (where you pay) they can give you advanced advice on how you can improve. This is the Strava business model: bringing value to the individual while learning from the behavior of thousands. Note that in this scenario there is no 3D involved.
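
A tiny sketch of this non-3D digital twin: a ride is just a stream of sensor samples, and the report is data derived from it. The numbers below are invented, and this is not the actual Garmin or Strava API.

```python
# Tiny sketch of the cycling example: a digital twin without any 3D geometry.
# A ride is a stream of sensor samples; the report is derived from that data.
# All numbers are invented; this is not the Garmin or Strava API.
from statistics import mean

ride_samples = [
    # (seconds, heart_rate_bpm, power_watt, speed_kmh)
    (0,   110, 180, 28.0),
    (60,  132, 220, 31.5),
    (120, 145, 260, 33.0),
    (180, 150, 240, 32.0),
]

report = {
    "duration_min":   ride_samples[-1][0] / 60,
    "avg_heart_rate": round(mean(s[1] for s in ride_samples)),
    "avg_power_watt": round(mean(s[2] for s in ride_samples)),
    "avg_speed_kmh":  round(mean(s[3] for s in ride_samples), 1),
}
print(report)
```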

Another known digital twin story is related to plants in operation. In the past 10 years I have been advocating for Plant Lifecycle Management (PLM for Owners/Operators), describing the value of a virtual plant model using PLM capabilities combined with Maintenance, Repair and Overhaul (MRO) in order to reduce downtime. In a nuclear environment the use of 3D verification, simulation and even control software in a virtual environment can bring great benefit, because the physical twin is not always accessible and downtime can cost up to several million per week.

The above examples provide two types of digital twins. I will discuss some characteristics in the next paragraphs.

Digital Twin – performance focus

Companies like GE and SAP focus a lot on the digital twin in relation to asset performance: measuring the performance of assets and comparing it with other similar assets. Based on the performance characteristics, the collector of the data can sell predictive maintenance analysis, performance optimization guidance and potentially other value offerings to their customers.

Small improvements in the range of a few percent can have a big impact on the overall net results. The digital twin is crucial in this business model to build up knowledge, collect and analyze it, and sell the knowledge again. This type of scenario is the easiest one: you need products with sensors, and you need an infrastructure to collect the data and to extract and process information in such a manner that it can be linked to a behavior model with parameters that influence the model.

Image: SAP blogs

This is the model-based part of the digital twin. For a single product there can be different models, related to the parameters driving your business: e.g. performance parameters for output, parameters for optimal up-time (preventive maintenance – usage optimization) or parameters related to environmental impact, etc. Building and selling the results of such a model is an add-on business, creating more value for your customer combined with creating more loyalty. Using the digital twin with a performance focus does not require a company to totally change the way they are working. Yes, you need new skills, data collection and analysis, and more sensor technology, but a lot of the product development activities can remain the same (for the moment).
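
As a rough sketch of such a parameter-driven behavior model, and of why a few percent matter at fleet scale, consider the example below; the model, parameters and numbers are purely illustrative.

```python
# Rough sketch: a behavior model whose parameters are tuned from field data,
# and why a few percent of improvement matters. All values are invented.

def asset_output_mwh(load_factor: float, efficiency: float, hours: float) -> float:
    """Very simplified yearly output of one asset (in MWh)."""
    rated_power_mw = 3.0
    return rated_power_mw * load_factor * efficiency * hours

baseline = asset_output_mwh(load_factor=0.85, efficiency=0.92, hours=8_000)
tuned    = asset_output_mwh(load_factor=0.85, efficiency=0.94, hours=8_000)  # +2 points

fleet_size = 200
extra_mwh = (tuned - baseline) * fleet_size
print(f"Two points of efficiency gain across {fleet_size} assets: "
      f"{extra_mwh:,.0f} extra MWh per year")
```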

As a conclusion for this type of digital twin I would state: yes, there is some PLM involved, but the main focus is on business execution.

As I have already reached more than 1000 words, I will focus in my next post on the most relevant digital twin for PLM. Here all disciplines come together: the 3D mechanical model, the behavior models, the embedded and control software, (manufacturing) simulation and more, all to create an almost perfect virtual copy of a real product or system in the physical world. And there we will see that this is not as easy, as the concepts depend on accurate data and reliable models, which is currently not the case in most companies' engineering environments.

 

Conclusion

The Digital Twin is a marketing hype; however, when you focus only on performance monitoring and tuning, it becomes a reality, as it does not require a company to align in a digital manner across the whole lifecycle. Still, this is just the beginning of a real digital twin.

Where are you in your company with the digital twin journey?
