
In my last post in this series, The road to model-based and connected PLM, I mentioned that perhaps it is time to talk about SLM instead of PLM when discussing popular TLA's for our domain of expertise. So far, there have not been many encouraging statements for SLM.

For me, SLM could mean Solution Lifecycle Management, considering that companies' offerings are more and more a mix of products and services. Or SLM could mean System Lifecycle Management, pushing the idea that more and more products interact with the outside world and therefore could be considered systems. Products are (almost) dead.

In addition, I mentioned that the typical product lifecycle and related configuration management concepts need to change in the SLM domain, as there is hardware and software with different lifecycles and change processes.

It is a topic I want to explore further. I am curious to learn more from Martijn Dullaart, who will be lecturing at the PLM Road Map and PDT 2021 fall conference in November. I hope my expectations are not too high, knowing it is a topic of interest for Martijn. Feel free to join this discussion.

In this post, it is time to follow up on my third statement related to what data-driven implies:

Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.

On this topic, I have a list of points to consider; let’s go through them.

The dataset

In this post, I will often use the term dataset (you are also allowed to write data set, I understood).

A dataset is a predefined set of attributes and values that logically belong together. Datasets should be defined based on their purpose and, if possible, designed for a single goal. In this way, they can be stored in a database.

Combined with other datasets, they can result in relevant business information. Note that a dataset is not only transactional data; a dataset could also describe geometry.

Identify the dataset

In the document-based world, a lot of information could be stored in a single file. In a data-driven world, we should define a dataset that contains a specific piece of information that logically belongs together. If we are more precise, a part would have various related datasets that make up its definition. These definitions could be:

  • Core identification attributes like ID, Name, Type and Status
  • The Type could define a set of linked information. For example, a valve would have different characteristics than a resistor. Through classification, we can link datasets to the core definition of a part.
  • The part can have engineering-specific data (CAD and metadata), manufacturing-specific data, supplier-specific data, and service-specific data. Each of these datasets needs to be defined as a unique element in a data-driven environment (see the sketch further below).
  • CAD is a particular case, as most current CAD systems don't treat geometry as a single dataset. In a file-based world, many other datasets are stored in the file (e.g., engineering or manufacturing details). In a data-driven environment, we want the CAD definition to be treated as a dataset. Dassault Systèmes with their CATIA V6 and 3DEXPERIENCE platform, or PTC with Onshape, are examples of this approach. Having CAD as separate datasets makes sharing and collaboration so much easier, as we can see from these solutions. The concept of CAD stored in a database is not new, and this approach has been used in various disciplines. Mechanical CAD was always a challenge.

Thanks to Moore's Law (approximately every two years, processor power doubles – click on the image for the details) and higher network connection speeds, it starts to make sense to have mechanical CAD also stored in a database instead of a file.
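To make the idea of granular datasets more tangible, here is a minimal sketch in Python. All class, attribute and value names are hypothetical illustrations, not a vendor data model:

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """A predefined set of attributes and values that logically belong together."""
    purpose: str                      # single goal, e.g. "engineering" or "supplier"
    attributes: dict = field(default_factory=dict)

@dataclass
class Part:
    """Core identification, plus purpose-specific datasets linked by classification."""
    part_id: str
    name: str
    part_type: str                    # e.g. "valve" - drives which datasets are linked
    status: str
    datasets: dict = field(default_factory=dict)   # purpose -> Dataset

# A valve and a resistor share the same core definition;
# classification links different characteristic datasets to each.
valve = Part("P-1001", "Inlet valve", "valve", "Released")
valve.datasets["engineering"] = Dataset("engineering", {"flow_coefficient": 4.2})
valve.datasets["supplier"] = Dataset("supplier", {"preferred_supplier": "ACME"})
```

Each dataset remains a uniquely identifiable element, so it can be consumed on its own, without retrieving a complete document.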

An important point to consider is a kind of standardization of datasets. In theory, there should be a minimum agreed collection of datasets. Industry standards provide these collections in their dictionaries. Whenever you optimize your data model for a connected enterprise, make sure you look first into the standards that apply to your industry.

They might not be perfect or complete, but inventing your own new standard is a guarantee for legacy issues in the future. This remark is also valid for the software vendors in this domain. A proprietary data model might give you a competitive advantage; still, in the long term, there is always the need to connect with outside stakeholders.

 

Identify the RACI

To ensure a dataset is complete and well maintained, the concept of RACI could be used. RACI is the abbreviation for Responsible, Accountable, Consulted and Informed, and a simplification of the RASCI model – see also responsibility assignment matrix.

In a data-driven environment, there is no data ownership anymore, like you have for documents. The main reason data ownership can no longer be used is that datasets can be consumed by anyone in the ecosystem – no longer only your department, or the manufacturing or service department.

Datasets in a data-driven environment bring value when connected with other datasets in applications or dashboards.

A dataset describing the specification attributes of a part could be used in a spare part app and a service app. Of course, the dataset will be used in a different context – still, we need to ensure we can trust the data.

Therefore, each identified dataset should be governed by a kind of RACI concept. The RACI concept is a way to break the silos in an organization.
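As a thought experiment of what "RACI per dataset" could look like in practice, here is a minimal sketch in Python. The role and dataset names are hypothetical; the point is that accountability is assigned per dataset and per role, not per document owner:

```python
from dataclasses import dataclass

@dataclass
class RaciAssignment:
    """RACI accountability for one dataset, assigned to roles, not persons."""
    dataset: str
    responsible: str        # role that creates and maintains the data
    accountable: str        # role that answers for data quality
    consulted: list         # roles that provide input
    informed: list          # roles (or apps) that consume the data downstream

spec_raci = RaciAssignment(
    dataset="part-specification",
    responsible="design-engineer",
    accountable="engineering-manager",
    consulted=["manufacturing-engineer"],
    informed=["service-planner", "spare-parts-app"],  # consumers across silos
)
```

The Informed list makes explicit that a spare part app and a service app may both rely on the same trusted dataset.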

Identify Inside / outside

There is a lot of fear that a connected, data-driven environment will expose Intellectual Property (IP); it came up in recent discussions. If you like storytelling and technology, read the post by my old SmarTeam colleague Alex Bruskin: The Bilbo Baggins Threat to PLM Assets. Alex has written some "poetry" with a deep technical message behind it.

It is true that if your dataset is too big, you have the challenge of exposing IP when connecting this dataset with others. Therefore, when building a data model, you should make it possible to have datasets purely for internal usage and datasets for sharing.

When you use the concept of RACI, the difference should be defined by the I (Informed) – is it PLM data or PIM data, for example?

Tracking relations

Suppose we follow up on the concept of datasets. In that case, it becomes clear that the relations between datasets are as crucial as the datasets themselves. In traditional PLM applications, these relations are often predefined as part of the core data model.

For example, the EBOM parts have relationships between themselves and specification data – see image.

The MBOM parts have links with the supplier data or the manufacturing process.

The relations predefined in a PLM system allow companies to implement the system relatively quickly, mapping their approaches to this taxonomy.

However, traditional PLM systems are based on a document-based (or file-based) taxonomy combined with related metadata. In a model-based and connected environment, we have to get rid of the document-based type of data.

Therefore, the datasets will be more granular, and there is a need to manage exponentially more relations between datasets.

This is why you see the graph database coming up as a needed infrastructure for modern connected applications. If you haven't heard of a graph database yet, you are probably far from the technology hypes. To understand the principles of a graph database, you can read this article from neo4j: Graph Databases for Beginners: Why graph technology is the future
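To illustrate the principle, here is a minimal in-memory sketch in Python of datasets as nodes and typed relations as edges. The node and relation names are hypothetical; a real implementation would rely on a graph database such as neo4j:

```python
# Datasets are nodes; each edge carries a relation type.
graph = {
    ("EBOM-part:P-1001", "defined_by"):  ["spec:S-77", "cad:C-12"],
    ("MBOM-part:M-2001", "realizes"):    ["EBOM-part:P-1001"],
    ("MBOM-part:M-2001", "supplied_by"): ["supplier:ACME"],
}

def related(node: str) -> list:
    """Follow every typed relation starting at the given node."""
    return [(relation, target)
            for (source, relation), targets in graph.items() if source == node
            for target in targets]

print(related("MBOM-part:M-2001"))
# [('realizes', 'EBOM-part:P-1001'), ('supplied_by', 'supplier:ACME')]
```

A graph database does precisely this kind of traversal at scale, which is why it fits an environment with exponentially more relations than a fixed, predefined data model.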

As you can see from the 2020 Gartner Hype Cycle for Artificial Intelligence, this technology is at the top of the hype and conceptually the way to manage a connected enterprise. The discussion in this post also demonstrates that, besides technology, a lot of additional conceptual thinking is needed before it can be implemented.

Although software vendors might handle the relations and datasets within their platform, the ultimate challenge will be sharing datasets with other platforms to get a connected ecosystem.

For example, the digital web picture shown above, introduced by Marc Halpern at the 2018 PDT conference, illustrates this concept. Recently, CIMdata discussed this topic in a similar manner: The Digital Thread is Really a Web, with the Engineering Bill of Materials at Its Center.
(Note: I am not sure if CIMdata has published a recording of this webinar – if so, I will update the link.)

Anyway, these are signs that we have started to find the right visuals to imagine new concepts. The traditional digital thread pictures, like the one below, are, for me, impressions of the past, as they are too rigid and focused on particular value streams.

From a distance, it looks like a connected enterprise should work like our brain. We store information on different abstraction levels. We keep incredibly many relations between information elements. As the brain is a biological organ, connections degrade or get lost. Or the opposite: other relations become so strong that we cannot change them anymore ("I know I am always right").

Interestingly, the brain does not use the “single source of truth”-concept – there can be various “truths” inside a brain. This makes us human beings with all the good and the harmful effects of that.

As long as we realize there is no single source of truth.

In business and our technological world, we sometimes need the undisputed truth. Blockchain could be the basis for securing the right connections between datasets to guarantee the result is valid. I am curious if blockchain can scale to complex connected situations, although Moore's Law might ultimately help us here too (if it still holds).
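The tamper-evidence idea behind blockchain can be shown without a full distributed ledger. Here is a minimal hash-chain sketch in Python, purely as an illustration of the principle, with hypothetical relation names:

```python
import hashlib
import json

def link_hash(previous_hash: str, relation: dict) -> str:
    """Hash a dataset relation together with the previous link, so any
    later modification of an earlier relation invalidates the chain."""
    payload = json.dumps(relation, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode()).hexdigest()

chain = ["0" * 64]  # genesis value
for relation in [
    {"from": "spec:S-77", "to": "cad:C-12", "type": "defines"},
    {"from": "cad:C-12", "to": "mbom:M-2001", "type": "realized_by"},
]:
    chain.append(link_hash(chain[-1], relation))

# Re-verifying recomputes every hash; a tampered relation no longer matches.
```

Whether this scales to millions of connected datasets is exactly the open question raised above.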

The topic is not new – in 2014, I wrote a post with the title PLM is doomed unless …., where I introduced the topic of owning and sharing in the context of the human brain. In the post, I refer to the book On Intelligence by Jeff Hawkins, who tries to analyze what human-based intelligence is and how we could apply it to our technology concepts. Still a fascinating book, worth reading if you have the time and opportunity.

 

Conclusion

A data-driven approach requires a more granular definition of information, leading to the concepts of datasets and managing the relations between datasets. This is a fundamental difference compared to the past, where we operated systems containing information. Now we are heading towards connected platforms that provide a filtered set of real-time data to act upon.

I am curious to learn more about how people have solved the connected challenges and in what kind of granularity. Let us know!

 

 

In my last post, I zoomed in on a preferred technical architecture for the future digital enterprise, drawing the conclusion that it is mission impossible to aim for a single connected environment. Instead, information will be stored in different platforms, both domain-oriented (PLM, ERP, CRM, MES, IoT) and value chain-oriented (OEM, Supplier, Marketplace, Supply Chain hub).

In part 3, I posted seven statements that I will be discussing in this series. In this post, I will zoom in on point 2:

Data-driven does not mean we do not need any documents anymore. Read electronic files for documents. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.

 

System of Record and System of Engagement

In the image below, a slide from 2016, I show a simplified view when discussing the difference between the current, coordinated approach and the future, connected approach. This picture might create the wrong impression that there are two different worlds – either you are document-driven, or you are data-driven.

In the follow-up of this presentation, I explained that companies need both environments in the future. The most efficient way of working for operations will be the infrastructure on the right side, the platform-based approach using connected information.

For traceability and disconnected information exchanges, the left side will be there for many years to come. Systems of Record are needed for data exchange with disconnected suppliers and disconnected regulatory bodies, and are probably crucial for configuration management.

The System of Record will probably remain as a capability in every platform or cross-section of platform information. The Systems of Engagement will be the configured real-time environment for anyone involved in active company processes – not only ERP or MES, but all execution.
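One way to picture the coexistence of the two sides is that the connected platform (System of Engagement) generates a frozen document set – the configuration baseline from the statement above – for a disconnected stakeholder. Here is a minimal sketch in Python, with hypothetical dataset names:

```python
import json
from datetime import date

# Connected side: granular, real-time datasets (hypothetical content).
live_datasets = {
    "part:P-1001/spec": {"flow_coefficient": 4.2, "status": "Released"},
    "part:P-1001/cad":  {"model_ref": "C-12", "revision": "B"},
}

def export_baseline(dataset_ids: list) -> str:
    """Freeze selected datasets into a document-like package: the
    configuration baseline for a disconnected supplier or regulator."""
    baseline = {
        "baseline_date": date.today().isoformat(),
        "content": {ds: live_datasets[ds] for ds in dataset_ids},
    }
    return json.dumps(baseline, indent=2)

print(export_baseline(["part:P-1001/spec", "part:P-1001/cad"]))
```

The baseline is a snapshot for traceability; the real-time data continues to live on the connected side.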

Introducing SysLM and SLM

This summer, I received a copy of Martin Eigner's System Lifecycle Management book, which I am reading at this moment in my spare time. I always enjoyed Martin's presentations, and in many ways, we share similar ideas. Martin, from his profession, has spent more time on the academic aspects of product and system lifecycle than I have. On the other hand, I have always been in the field, observing and trying to make sense of what I see and learn in a coherent approach. I am halfway through the book now, and for sure, I will come back to the book when I have finished it.

A first impression: A great and interesting book for all. Martin and I share the same history of data management. Read all about this in his second chapter: Forty Years of Product Data Management

From PDM via PLM to SysLM is a chapter that everyone should read if you haven't lived it yourself. It helps you understand the past (learning from the past to understand the future). When I finish this series about the model-based and connected approach for products and systems, Martin's book will be highly complementary, given the content he describes.

There is one point on which I am looking forward to feedback from the readers of this blog:

Should we, in our everyday language, better differentiate between Product Lifecycle Management (PLM) and System Lifecycle Management (SysLM)?

In some customer situations, I talk on purpose about System Lifecycle Management to create the awareness that the company's offering is more than an electro/mechanical product. Or ultimately, in a more circular economy, should we use the term Solution Lifecycle Management, as not only hardware and software might be part of the value proposition?

Martin consistently uses the abbreviation SysLM, where I would prefer the TLA SLM. The problem we both have is that neither abbreviation is unique or explicit enough. SysLM creates confusion with SysML (for dyslexic people or fast readers). SLM already has so many less valuable meanings: Simulation Lifecycle Management, Service Lifecycle Management or Software Lifecycle Management.

For the moment, I will use the abbreviation SLM, leaving it in the middle if it is System Lifecycle Management or Solution Lifecycle Management.

 

How to implement both approaches?

In the long term, I predict that more than 80 percent of the activities related to SLM will take place in a data-driven, model-based environment due to the changing content of the solutions offered by companies.

A solution will be based on hardware, the solid part of the solution, for which we could apply a BOM-centric approach. We can see the BOM-centric approach in most current PLM implementations. It is the logical result of optimizing the product lifecycle management processes in a coordinated manner.

However, the most dynamic part of the solution will be covered by software and services. Changing software or services related to a solution has completely different dynamics than a hardware product.

Software and services implementations are associated with a data-driven, model-based approach.

The management of solutions, therefore, needs to be done in a connected manner. Using the BOM-centric approach to manage software and services would create a Kafkaesque overhead.

Depending on your company's value proposition to the market, the challenge will be to find the right balance. For example, when you keep on selling disconnected hardware, there is probably no need to change your internal PLM processes that much.

However, when you are moving to a connected business model providing solutions (connected systems / Outcome-based services), you need to introduce new ways of working with a different go-to-market mindset. No longer linear, but iterative.

A McKinsey concept that I have promoted several times illustrates a potential path – note the article was written not with a PLM mindset but with a business mindset.

What about Configuration Management?

The different datasets defining a solution also challenge traditional configuration management processes. Configuration Management (CM) is well established in the aerospace & defense industry. In theory, proper configuration management should be the target of every industry to guarantee appropriate performance, reduced risk and a lower cost of fixing issues.

The challenge, however, is that configuration management processes are not designed to manage systems or solutions, where dynamic updates can be applied, whether or not initiated by the customer.

This is a topic to solve for the modern Connected Car (system) or Connected Car Sharing (solution).

For that reason, I am inquisitive to learn more from Martijn Dullaart's presentation at the upcoming PLM Road Map/PDT conference. The title of his session: The next disruption please …

In his abstract for this session, Martijn writes:

From Paper to Digital Files brought many benefits but did not fundamentally impact how Configuration Management was and still is done. The process to go digital was accelerated because of the Covid-19 Pandemic. Forced to work remotely was the disruption that was needed to push everyone to go digital. But a bigger disruption to CM has already arrived. Going model-based will require us to reexamine why we need CM and how to apply it in a model-based environment. Where, from a Configuration Management perspective, a digital file still in many ways behaves like a paper document, a model is something different. What is the deliverable? How do you manage change in models? How do you manage ownership? How should CM adopt MBx, and what requirements to support CM should be considered in the successful implementation of MBx? It’s time to start unraveling these questions in search of answers.

One of the ideas I am currently exploring is that we need a new layer on top of the current configuration management processes, extending the validation to software and services. For example, instead of describing every validated configuration, a company might implement the regular configuration management processes only for its hardware.

Next, the systems or solutions in the field will report (or validate) their configuration against validation rules – a topic that requires a longer discussion than this blog post, potentially a full conference.
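As an illustration of that idea, a fielded system could report its as-running configuration and have it checked against validation rules, instead of the company enumerating every approved configuration upfront. Here is a minimal sketch in Python, with hypothetical rule and component names:

```python
# Validation rules instead of an enumerated list of approved configurations.
rules = [
    ("software", lambda version: version >= (2, 4)),  # minimum software version
    ("firmware", lambda version: version == (1, 0)),  # exact firmware required
]

def validate(reported: dict) -> list:
    """Check a configuration reported from the field against the rules;
    return the names of the items that violate them."""
    return [name for name, rule_ok in rules
            if name in reported and not rule_ok(reported[name])]

# A connected car reporting its current configuration:
print(validate({"software": (2, 3), "firmware": (1, 0)}))  # ['software']
```

The hardware keeps its regular CM baseline; the dynamic software and service configuration is validated by rules like these.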

Therefore, I am looking forward to participating in the CIMdata/PDT fall conference and picking up the discussions towards a data-driven, model-based future with the attendees. Besides CM, there are several other topics of great interest for the future. Have a look at the agenda here.

 

Conclusion

A data-driven and model-based infrastructure still needs to be combined with a coordinated, document-driven infrastructure. Where the focus will be depends on your company's value proposition.

If we discuss hardware products, we should think PLM. When you deliver systems, you should perhaps talk SysLM (or SLM). And maybe it is time to define Solution Lifecycle Management as the term for the future.

Please, share your thoughts in the comments.

 

After the first article discussing "The Future of PLM," here is again a post in the category of PLM and complementary practices/domains, on a topic that has been on the radar for a long time: Model-Based Definition. I am glad to catch up with Jennifer Herron, founder of Action Engineering, who is one of the thought leaders related to Model-Based Definition (MBD) and Model-Based Enterprise (MBE).

In 2016, I spoke with Jennifer after reading her book: "Re-Use Your CAD – The Model-Based CAD Handbook". At that time, the discussion was initiated through two articles on Engineering.com. Seven years later, Action Engineering introduced OSCAR as the next step towards learning and understanding the benefits of Model-Based Definition.

Therefore, it is a perfect moment to catch up with Jennifer. Let’s start.

 

Model-Based Definition

Jennifer, first of all, can you bring some clarity to the terminology? When I discussed the various model-based approaches, the first response I got was that model-based is all about 3D models and that a lot of the TLA's are just marketing terminology.
Can you clarify which parts of the model-based enterprise you focus on, and with the proper TLA's?

Model-Based means many things to many different viewpoints and systems of interest. All these perspectives lead us down many rabbit holes, and we are often left confused when first exposed to the big concepts of model-based.

At Action Engineering, we focus on Model-Based Definition (MBD), which uses and re-uses 3D data (CAD models) in design, fabrication, and inspection.

There are other model-based approaches, and the use of the word “model” is always a challenge to define within the proper context.

For MBD, a model is 3D CAD data that comes in both native and neutral formats.

Another model-based approach is Model-Based Systems Engineering (MBSE). "Model-based" in this context refers to a formalized application of modeling to support system requirements, design, analysis, verification and validation activities, beginning in the conceptual design phase and continuing throughout development and later lifecycle phases.

<Jos> I will come back to Model-Based Systems Engineering in future posts

Sometimes MBSE is about designing widgets, and often it is about representing the entire system and the business operations. For MBD, we often focus our education on the ASME Y14.47 definition that MBD is an annotated model and associated data elements that define the product without a drawing.

Model-Based Definition for Everybody?

I believe it took many years until 3D CAD design became a commodity; however, I still see the disconnected 2D drawing used to specify a product or part for manufacturing or suppliers. What are the benefits of Model-Based Definition?
Are there companies that will not benefit from Model-Based Definition?

There’s no question that the manufacturing industry is addicted to their drawings. There are many reasons why, and yet mostly the problem is lack of awareness of how 3D CAD data can make design, fabrication, and inspection work easier.

For most, the person doing an inspection in the shipping and receiving department doesn’t have exposure to 3D data, and the only thing they have is a tabulated ERP database and maybe a drawing to read. If you plop down a 3D viewable that they can spin and zoom, they may not know how that relates to their job or what you want them to do differently.

Today’s approach of engineering championing MBD alone doesn’t work. To evolve information from the 2D drawing onto the 3D CAD model without engaging the stakeholders (machinists, assembly technicians, and inspectors) never yields a return on investment.

Organizations that succeed in transitioning to MBD are considering and incorporating all departments that touch the drawing today.

Incorporating all departments requires a vision from the management. Can you give some examples of companies that have transitioned to MBD, and what were the benefits they noticed?

I’ll give you an example of a small company with no First Article Inspection (FAI) regulatory requirements and a huge company with very rigorous FAI requirements.

 

Note: click on the images below to enjoy the details.

The small company instituted a system of CAD modeling discipline that allowed them to push 3D viewable information directly to the factory floor. The assembly technicians instantly understood engineering’s requirements faster and better.

The positive MBD messages for these use cases are 3D navigation, CAD Re-Use, and better control of their revisions on the factory floor.

 

The large company has added inspection requirements directly onto their engineering models and created a Bill of Characteristics (BOC) for the suppliers and internal manufacturers. They are removing engineering ambiguity, resulting in direct digital information exchange between the engineering, manufacturing, and quality silos.

These practices have reduced error and reduced time to market.

The positive MBD messages for these use cases are unambiguous requirements capture by Engineering, Quality Traceability, and Model-Based PMI (Product and Manufacturing Information).

Model-Based Definition and PLM?

How do you see the relation between Model-Based Definition and PLM? Is a PLM system a complication or aid to implement a Model-Based Definition? And do you see a difference between the old and new PLM Vendors?

Model-Based Definition data is complex and rich in connected information, and we want it to be. With that amount of connected data, a data management system (beyond upload/download of documents) must keep all that data straight.

Depending on the size and function of an organization, a PLM system may not be needed. However, a way to manage changes and collaboration amongst those using 3D data is necessary. Sometimes that results in a less sophisticated Product Data Management (PDM) system. Large organizations often require PLM.

There is significant resistance to doing MBD and PLM implementations simultaneously because PLM is always over budget and behind schedule. However, doing just MBD or just PLM without the other doesn’t work either. I think you should be brave and do both at once.

I think we can debate why PLM is always over budget and behind schedule. I hear the same about ERP implementations. Perhaps it has to do with the fact that enterprise applications have to satisfy many users?

I believe that working with model versions and file versions can get mixed up in larger organizations, so there is a need for PDM or PLM. Have you seen successful implementations of both interacting together?

Yes, the only successful MBD implementations are those that already have a matured PDM/PLM (scaled best to the individual business).

 

Model-Based Definition and Digital Transformation

In the previous question, we already touched on the challenge of old and modern PLM. How do you see the introduction of Model-Based Definition addressing the dreams of Industry 4.0, the Digital Twin and other digital concepts?

I just gave a presentation at the ASME Digital Twin Summit discussing the importance of MBD for the Digital Twin. MBD is a foundational element that allows engineering to compare their design requirements to the quality inspection results of digital twin data.

The feedback loop between Engineering and Quality is fraught with labor-intensive efforts in most businesses today.

Leveraging the combination of MBD and Digital Twin allows automation possibilities to speed up and increase the accuracy of the engineering to inspection feedback loop. That capability helps organizations realize the vision of Industry 4.0.

And then there is OSCAR.

I noticed you announced OSCAR. At first, I thought OSCAR was a virtual aid for Model-Based Definition, and I liked the launch page HERE. Can you tell us more about what makes OSCAR unique?

One thing that is hard with MBD implementation is that there is so much to know. Our MBDers at Action Engineering have been involved with MBD for many years and with many companies. We are embedded in real-life transitions from using drawings to using models.

Suppose you start down the model-based path for digital manufacturing. In that case, there are significant investments in time to learn how to get to the right set of capabilities and the right implementation plan guided by a strategic focus. OSCAR reduces that ramp-up time with educational resources and provides vetted and repeatable methods for an MBD implementation.

OSCAR combines decades of Action Engineering expertise and lessons learned into a multi-media textbook of sorts. To kickstart an individual or an organization’s MBD journey, it includes asynchronous learning, downloadable resources, and CAD examples available in Creo, NX, and SOLIDWORKS formats.

CAD users can access how-to training and downloadable resources such as the latest edition of Re-Use Your CAD (RUYC). OSCAR enables process improvement champions to make their case to start the MBD journey. We add content regularly and post what’s new. Free trials are available to check out the online platform.

Learn more about what OSCAR is here:

Want to learn more?

In this post, I believe we only touched the tip of the iceberg. There is so much to learn and understand. What would you recommend to a reader of this blog who got interested?

 

RUYC (Re-Use Your CAD) is an excellent place to start, but if you need more audio-visual content and want to see real-life examples of MBD in action, get a Training subscription to OSCAR to get rooted in the vocabulary and benefits of MBD within a Model-Based Enterprise. Watch the videos multiple times! That's what they are for. We love to work with European companies and would love to support you with a kickstart coaching package to get started.

What I learned

First of all, I learned that Jennifer is a very pragmatic person. Her company (Action Engineering) and her experience are a perfect pivot point for those who want to learn and understand more about Model-Based Definition, in particular in the US, given her strong involvement in the American Society of Mechanical Engineers (ASME).

I am still curious if European or Asian counterparts exist to introduce and explain the benefits and usage of Model-Based Definition to their customers.  Feel free to comment.

Next, and an important observation too, is the fact that Jennifer also describes the tension between Model-Based Definition and PLM. Current PLM systems might be too rigid to support end-to-end scenarios taking benefit of Model-Based Definition.

I have to agree here. PLM vendors mainly support their own MBD (Model-Based Definition), where the ultimate purpose is to efficiently share all product-related information, using various models as the main information carriers.

This is a topic we have to study and solve in the PLM domain, as I described in my technical highlights from the PLM Road Map & PDT Spring 2021 conference.

There is work to do!

Conclusion

Model-Based Definition is, for me, one of the must-do steps of a company to understand the model-based future. A model-based future sometimes incorporates Model-Based Systems Engineering, a real Digital Thread and one or more Digital Twins (depending on your company’s products).

It is a must-do activity because companies must transform themselves to depend on digital processes and digital continuity of data to remain competitive. Document-driven processes relying on the interpretation of a person are not sustainable.

 


Last week, I wrote about day 1 of the recent PLM Road Map & PDT Spring 2021 conference, focusing mainly on technology. There were also interesting sessions related to exploring future methodologies for a digital enterprise. On day 2, we started with two sessions related to people and methodology, indispensable when discussing PLM topics.

Designing and Keeping Great Teams

This keynote speech from Noshir Contractor, Professor of Behavioral Sciences in the McCormick School of Engineering & Applied Science, intrigued me, as the subtitle states: Lessons from Preparing for Mars. What Can PLM Professionals Learn from This?

You might ask yourself: is a PLM implementation as difficult and as complex as a mission to Mars? I hoped so, and I followed Noshir's presentation with great interest.

Noshir started by mentioning that many disruptive technologies have emerged in recent years, like Teams, Slack, Yammer and many more.

The interesting question he asked in the context of PLM is:

As the domain of PLM is all about trying to optimize effective collaboration, this is a fair question.

Structural Signatures

Noshir shared with us that the crucial point is not people's individual skills, but rather whom they know.
Measuring whom they work with is more important than who they are.

Based on this statement, Noshir showed some network patterns of different types of networks.

Click on the image to see the enlarged picture.

It is clear from these patterns how organizations communicate internally and/or externally. It would be an interesting exercise to perform this analysis in a company and see if the results match the perceived reality.

Noshir’s research was used by NASA to analyze and predict the right teams for a mission to Mars.

Noshir went further by proposing what PLM can learn from teams that are going into space. And here, I was not sure about the parallel. Is a PLM project comparable to a mission to Mars? I hope not! I have always advocated that a PLM implementation is a journey. Still, I never imagined that it could be a journey into the remote unknown.

Noshir explained that they had built tools based on their scientific model to describe and predict how teams could evolve over time. He believes that society can also benefit from these learnings. Many inventions from the past were driven by innovations coming from space programs.

I believe Noshir’s approach related to team analysis is much more critical for organizations with a mission. How do you build multidisciplinary teams?

The proposed methodology is probably best suited for a holacracy-based organization. Holacracy is an interesting concept for companies to get their employees committed; however, it also demands a type of involvement that not every person can deliver. For me, coming back to PLM as a strategy to enable collaboration, the effectiveness of collaboration depends very much on the organizational culture and the structure created.

DISRUPTION – EXTINCTION or still EVOLUTION?

We talk a lot about disruption because disruption is a painful process that you do not want to happen to yourself or your company. In the context of this conference's theme, I discussed the awareness that disruptive technologies will be changing the PLM value equation.

However, disruptive technologies alone are not sufficient. In PLM, we have to deal with legacy data, legacy processes, legacy organization structures, and often legacy people.

A disruption like the switch from mini-computers to PCs (killed DEC) or from Symbian to iOS (killed Nokia) is therefore not likely to happen that fast. Still, there is a need to take benefit from these new disruptive technologies.

My presentation was focusing on describing the path of evolution and focus areas for the PLM community. Doing nothing means extinction; experimenting and learning towards the future will provide an evolutionary way.

Starting from acknowledging that there is an incompatibility between the data produced most of the time now and the data needed in the future, I explained my theme: From Coordinated to Connected. As a PLM community, we should spend more time together in focus groups and conferences, describing and verifying methodology and best practices.

Nigel Shaw (EuroStep) and Mark Williams (Boeing) hinted in this direction during this conference (see day 1). Erik Herzog (SAAB Aeronautics) brought this topic to last year's conference (see day 3). Outside this conference, I have comparable touchpoints with Martijn Dullaert when discussing the future of Configuration Management in relation to PLM.

In addition, this decade will probably be the most disruptive decade we have known in humanity, due to external forces that push companies to change. Sustainability regulations from governments (the Paris Agreement) and the implementation of circular economy concepts, combined with the focus on a positive and high Total Shareholder Return, will push companies to adapt themselves more radically than before.

What is clear is that disruptive technologies and concepts, like Industry 4.0, Digital Thread and Digital Twin, can serve a purpose when implemented efficiently, ensuring the business becomes sustainable.

Due to the lack of end-to-end experience, we need focus groups and conferences to share progress and lessons learned. And we do not need to hear the isolated vendor success stories here as a reference, as they are often siloed again, leading to proprietary environments.

You can see my full presentation on SlideShare: DISRUPTION – EXTINCTION or still EVOLUTION?

 

Building a profitable Digital T(win) business

Beatrice Gasser, Technical, Innovation, and Sustainable Development Director of the Egis group, gave an exciting presentation related to the vision and implementation of digital twins in the construction industry.

The Egis group serves both as a consultancy firm and as an asset management organization. You can see a wide variety of activities on their website or have a look at their perspectives.

Historically, the construction industry has been lagging behind, with low productivity due to fragmentation, risk aversion and, more recently, the lack of digital talent. In addition, some construction companies make their money from claims instead of having a smooth and profitable business model.

Without innovation in the construction industry, companies working the traditional way would lose market share and investor-focused attention, as we can see from the BCG diagram I discussed in my session.

The digital twin of construction is an ideal concept for the future. It can be built in the design phase to align all stakeholders, validate and integrate solutions, and simulate the building's operational scenarios at almost zero material cost. Egis estimates that by using a digital twin during construction, the engineering and construction costs of a building can be reduced by 15 to 25 %.

More importantly, the digital twin can also be used to first simulate operations and optimize energy consumption. The connected digital twin of an existing building can serve as a new common data environment for future building stakeholders. This could be the asset owner, service companies, and even the regulatory authorities needing to validate the building’s safety and environmental impact.

Beatrice ended with five principles essential to establishing a digital twin (see image).

I think the construction industry has a vast potential to disrupt itself, faster than the traditional manufacturing industries, due to its current need to work in a best-connected manner.

Next, there is almost no legacy data to deal with for these companies. Every new construction or building is a unique project on its own. The key differentiators will be experience and efficient ways of working.

It is about the belief, the guts and the skilled people that can make it work – all for a more efficient and sustainable future.

 

 

Leveraging PLM and Cloud Technology for Market Success

Stan Przybylinski, Vice President of CIMdata, reported on their global survey related to the cloud, completed in early 2021. Stan also characterized Industry 4.0 as a connected vision, with cloud and digital thread as enablers to implement this vision.

The companies interviewed showed a lot of goodwill to make progress – click on the image to see the details. CIMdata is also working with PLM vendors to learn about and better describe the areas of benefit. I remain curious about who comes up with a realization and business case that is future-proof. This will define our new PLM value equation.

 

Conclusion

These were two exciting days with plenty of mentions of disruptive technologies. Our challenge in the PLM domain will be to give them a purpose. A purpose is likely driven by external factors related to the need for a sustainable future. Efficiency and effectiveness must come from learning to work in connected environments (digital twin, digital thread, Industry 4.0, Model-Based (Systems) Engineering).

Note: You might have seen the image below already – a nice link between sustainability and the mission to Mars.

One of my favorite conferences is the PLM Road Map & PDT conference, probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate further on our experiences.

Last year's fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings – sessions related to the Multiview BOM research, Global Collaboration, and several aspects of model-based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

All topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM value equation. The first day of the conference focused on technology.

On the second day, we looked in addition at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata's President & CEO, kicked off the conference by providing CIMdata's observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and the followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter’s final slide – we have to actively rethink and reshape PLM – not by calling it different but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform at the US Army. Raj walked us through some of the US Army's challenges, and he gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and aware of it.

Where we would say "data is the new oil", Raj Iyer said: "Data is the ammunition of the future fight – as fights will more and more take place in cyberspace."

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army's current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw's (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, explaining that we live in a world of HTTP(S) linking, he showed that we create new ways of connecting information. The link becomes an essential artifact in our information model.

While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they have a debate with one of their customers on whether to replace or to enhance the existing PLM tools. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, had already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark's slide below says it all.

Mark stressed the importance of managing model information in context, which has become a challenge.

Mark mentioned that 20 years ago, IDC (International Data Corporation) measured Boeing's performance and estimated that each employee spent 2½ hours per day searching for information. In 2018, IDC estimated that this number had grown to 30% of an employee's time and could go up to 50% when adding the effort of reusing and duplicating data.

The consequence would be that a full-service enterprise, with engineering, manufacturing and services connected, probably loses 70% of its information because it cannot be found – an impressive number, calling for "clever" ways to find the correct information in context.

It is not just about a fully indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending with the recommendation to use the ISO 10303-243 – MoSSEC standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context; its purpose is to manage and connect the relationships between models.
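I cannot reproduce the actual ISO 10303-243 data model here, but the core idea – describing a model's context and its relationships to other models – can be sketched as follows. Note this is a simplified illustration with invented field names, not the MoSSEC schema itself:

    # Simplified illustration of MoSSEC-style model metadata.
    # NOT the actual ISO 10303-243 schema - all field names are invented.
    model_record = {
        "model_id": "AERO-LOADS-V3",
        "discipline": "aerodynamics",
        "purpose": "wing load estimation for early trade-off studies",
        "maturity": "preliminary",
        "assumptions": ["rigid wing", "cruise conditions only"],
        "relationships": [
            {"type": "derived_from", "target": "GEOMETRY-V12"},
            {"type": "provides_input_to", "target": "STRUCT-SIZING-V5"},
        ],
    }

    # With such metadata, a search can return models *in context*:
    # which inputs they used, what they feed into, and how mature they are.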

MoSSEC and its implications for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM vendors and tools will support and enable the standard for future interoperability and collaboration.

Additive Manufacturing
– not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the topic of new technologies: Additive Manufacturing, or in common language, 3D printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases once the first excitement of the hype is over.

Material knowledge is one of the important topics to study when applying additive manufacturing. Understanding material behavior and properties in an Additive Manufacturing process is probably a new area for most companies.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company's management has realistic expectations and makes the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. In my opinion, the middle part, related to the data management concepts needed for a digital enterprise, contained the most exciting topics to follow up on.

Next week, I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).

 

Last Friday, several members of the PLM Global Green Alliance and I discussed the book "How to Avoid a Climate Disaster" by Bill Gates. I was happy to moderate the session between Klaus Brettschneider, Rich McFall, Lionel Grealou, Ilan Madjar and Patrick Hillberg. From their LinkedIn profiles, you can see we are all active in the domain of PLM. All of them had read the book before the discussion.

I think the book addresses climate change in a tangible manner. Bill Gates brings structure to addressing climate change and encourages you to be active – what you can do as an individual and as a citizen. My only comment on the book would be that, as a typical nerd, Bill Gates does not pay much attention to human behavior, to the emotions that might lead people to non-logical behavior.

When you browse the book's reviews, for example, on Goodreads, you see the extremes, with ratings from 1 to 5. Some people believe that Bill Gates, due to his wealth and way of living, has no right to write this book. Others like the transparent and pragmatic approach to the themes discussed in the book.

Our perspective

Klaus, Rich, Lio, Ilan and Patrick did not have extreme points of view – so don’t watch the recording if you are looking for anxiety. They reviewed How to Avoid a Climate Disaster from their perspective and how it could be relevant for PLM practitioners.  It became a well-balanced dialogue. You can watch or listen to the recording following this link:

Book discussion: How to avoid a climate disaster written by Bill Gates

Note: we will consolidate all content on our PLMGreenAlliances website to ensure nothing is lost – feel free to comment/discuss further.

More on sustainability

If you want to learn more about all sorts of disruption, not only disruption caused by climate change, have a look at the upcoming conference this week: DISRUPTION—the PLM Professionals’ Exploration of Emerging Technologies that Will Reshape the PLM Value Equation.

My contribution will be on day 2, where I combine disruptive technology with the need to become really sustainable in our businesses.

It will be a call for action from our PLM community. In the coming nine years, we have to change our business, become sustainable and use the relevant new technologies. This requires systems thinking – will mankind be able to deal with so many different parameters?

Conclusion

Start the dialogue with us, the PLM Global Green Alliance, by watching and reading content from the website. Or become an active member, participating in discussion sessions related to any topic relevant to our alliance. More to come at the end of May – will you join?

Regularly, (young) individuals approach me looking for advice on starting or boosting their PLM career – one of the questions the PLM Doctor is IN could answer quickly.

Before going further on this topic, there is also the observation that many outspoken PLM experts are "old." Meanwhile, all kinds of new disruptive technologies are coming up.

Can these old guys still follow and advise on all trends/hypes?

My consultant’s answer is: “Yes and No” or “It depends”.

The answer illustrates the typical nature of a consultant. It is almost impossible to give a binary answer; still, many of my clients are looking for binary answers. Generalizing further, you could claim: “Human beings like binary answers”, and then you understand what is happening now in the world.

The challenge for everyone in the PLM domain is to keep an open mindset and avoid becoming binary. Staying non-binary means spending time to digest what you see, read or hear. Always ask yourself the question: is it really that simple? Try to imagine how the content you read fits the famous paradigm: People, Processes and Tools. You should consider all these aspects.

Learning by reading

I was positively surprised by Helena Gutierrez's post on LinkedIn: The 8 Best PLM blogs to follow. First of all, by Helena's endorsement, explaining the value of having non-academic PLM information available as a foundation for her learning in PLM.

And indeed, perhaps I should have written a book about PLM. However, it would be a book about the past. Currently, PLM is not stable; we are learning every day to use new technologies and new ways of working – for example, the impact and meaning of the model-based enterprise.

However, the big positive surprise came from the number of likes within a few days, showing how valuable this information is for many others on their PLM journey. I am aware there are more great blogs out in the field, sometimes with the disadvantage that they are not in English and therefore have a limited audience.

Readers of this post, look at the list of 8 PLM blogs and add your recommended blog(s) in the comments.

Learning by reading (non-binary) is a first step in becoming or staying up to date.

Learning by listening

General PLM conferences have been an excellent way to listen to other people’s experiences in the past. Depending on the type of conference, you would be able to narrow your learning scope.

This week I started my preparation for the upcoming PLM Roadmap and PDT conference. Here various speakers will provide their insight related to “disruption,” all in the context of disruptive technologies for PLM.

Good news: people and business aspects will also be part of the conference.

Click on the image for the agenda and registration

In my presentation, titled DISRUPTION – EXTINCTION or still EVOLUTION?, I will address all these aspects. We have entered a decisive decade to prove we can disrupt our old habits to save the planet for future generations.

It is challenging to be as interactive as a physical conference; it is mainly a conference to get inspired or guided in your thinking about new PLM technologies and potential disruption.

Learning by listening and storing the content in your brain is the second step in becoming or staying up to date.

Learning by discussing

One of the best learnings comes from having honest discussions with other people who all have different backgrounds. To be part of such a discussion, you need to have at least some basic knowledge about the topic. This avoids social media-like discussions where millions of “experts” have an opinion behind the keyboard. (The Dunning-Kruger effect)

There are two upcoming discussions I want to highlight here.

1. Book review: How to Avoid a Climate Disaster.

On Thursday, May 13th, I will moderate a PLM Global Green Alliance panel discussion on Zoom to discuss Bill Gates' book "How to Avoid a Climate Disaster". As you can imagine, Bill Gates is not known as a climate expert, more as a philanthropist and technology geek. However, the reviews are good.

What can we learn from the book as relevant for our PLM Global Green Alliance?

If you want to participate, read all the details on our PGGA website.

The PGGA core team members, Klaus Brettschneider, Lionel Grealou, Richard McFall, Ilan Madjar and Hannes Lindfred, have read the book.

 

2. The Modular Way Questions & Answers

In my post PLM and Modularity, I announced the option for readers of "The Modular Way" to ask the authors (Björn Eriksson & Daniel Strandhammar) questions or provide feedback on the book together with a small audience. This session is also planned to take place in May and will be scheduled based on the participants' availability. At this moment, there are still a few open places. Therefore, if you have read the book and want to participate, send an email to tacit@planet.nl or info@brickstrategy.com.

Learning by discussing is the best way to enrich your skills, particularly if you have Active Listening skills – crucial to have for a good discussion.

 

Conclusion

No matter where you are in your career, in the world of PLM, learning never stops. Twenty years of experience have no value if you haven’t seen the impact of digitalization coming. Make sure you learn by reading, by listening and by discussing.

After the series about "Learning from the past," it is time to start looking towards the future. I learned from several discussions that I am probably working most of the time with advanced companies. I believe this should motivate companies that lag behind to look into the future even more.

If you look into the future for your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin as such. A company might want to reduce its time to market or improve collaboration between all stakeholders. These objectives can be realized by different ways of working and an IT infrastructure that allows these processes to become digital and connected.

That is the "game". Coming back to the future of PLM: we do not need a discussion about definitions; I leave that to the academics and vendors. We will see the same applies to the concept of a Digital Twin.

My statement: the digital twin is not new. Everybody can have their own digital twin, as long as everyone is allowed to interpret the definition differently. Does this sound like the PLM definition?

The definition

I like to follow the Gartner definition:

A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.

As you see, not a narrow definition. Now we will look at the different types of interpretations.

Single-purpose siloed Digital Twins

  1. Simple – data only

One of the most straightforward applications of a digital twin is, for example, my Garmin Connect environment. When cycling, my device registers performance parameters (speed, cadence, power, heart rate, location). After every trip, I can analyze my performance, see changes in my overall performance, and compare it with others in my category (weight, age, sex).

Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.

On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel's presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at the Technical University in Eindhoven, the "smart city" in the Netherlands.
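Coming back to my Garmin example: in code, such a data-only twin is little more than recorded measurements plus some analysis. A minimal sketch in Python, with hypothetical field names – this is not the Garmin API:

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class RideSample:
        speed_kmh: float
        cadence_rpm: float
        power_w: float
        heart_rate_bpm: float

    def average_power(trip: list) -> float:
        # The "twin" is nothing more than recorded data plus analysis:
        # compare this trip's average power against earlier trips.
        return mean(s.power_w for s in trip)

    trip = [RideSample(32.5, 90, 210, 145), RideSample(30.1, 85, 195, 150)]
    print(f"Average power this trip: {average_power(trip):.0f} W")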

  2. Medium – data and operating models

Many connected devices in the world use the same principle. An airplane engine, an industrial robot, a wind turbine, a medical device, a train carriage: all track their performance based on this connection between physical and virtual, through some sort of digital connectivity.

The business case here is also about monitoring performance, predicting maintenance, and upgrading the product when needed.

This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintenance and overhaul has to be found. Repairs are disruptive and can be extremely costly. A manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
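The principle can be sketched in a few lines: the virtual side holds the operating model (reduced here to simple thresholds), while the physical side streams sensor readings. All names and values are invented for illustration:

    # Hypothetical condition-monitoring rule for a connected asset.
    OPERATING_MODEL = {"max_bearing_temp_c": 85.0, "max_vibration_mm_s": 7.1}

    def assess(reading: dict) -> str:
        # Compare live sensor data from the physical asset
        # against the limits defined in the virtual operating model.
        if reading["bearing_temp_c"] > OPERATING_MODEL["max_bearing_temp_c"]:
            return "schedule maintenance: bearing temperature out of range"
        if reading["vibration_mm_s"] > OPERATING_MODEL["max_vibration_mm_s"]:
            return "schedule maintenance: vibration level out of range"
        return "nominal"

    print(assess({"bearing_temp_c": 88.2, "vibration_mm_s": 5.4}))

Real asset lifecycle management adds financial models and degradation curves on top, but the pattern – a virtual model plus live data – stays the same.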

  3. Advanced – data and connected 3D model

The digital twin we see the most in marketing videos is a virtual twin, using a 3D representation for understanding and navigation. The 3D representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation takes place.

Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.

The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying the physical airplane – the behavior model behind the simulator should match the real behavior as closely as possible. However, as it is a model, it will never be 100% reality and requires updates when new findings or product changes appear.

A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs). In the physical world, there is no place or time to conduct such training. Here the complexity might be lower. There is a 3D Model; however, serious updates can only be expected after a major maintenance or overhaul activity.

These practices are not new either and are used in places where the physical training cannot be done.

More challenging is the Augmented Reality (AR) use case. Here the virtual model, most of the time a lightweight 3D model, connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR environment might project actual data from the machine and indicate service points and service procedures.

The positive side of the business case for such an opportunity is clear: ensuring service engineers always work with the right information in a real-time context. The main obstacles to implementing AR in reality are access to the data, the presentation of the data, and keeping the data in the AR environment matching reality.

And although there are 3D models in use, they are, to my knowledge, always created in silos, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.

Several of the cases mentioned above will be discussed there. The conference's target is to share real cases, each concluded by a Q&A session – crucial for a virtual event.

Connected Virtual Twins along the product lifecycle

So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle, the manufacturing twin, and the development twin. This image from Siemens illustrates the concept:

On slides, they imagine a completely integrated framework, which is the future vision. Let us first zoom in on the individual connected twins.

The digital production twin

This is the area of virtual manufacturing: creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic; DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.

At that time, the business case was based on the fact that defining a manufacturing plant and process virtually allows you to optimize the plant before investing in physical assets.

This saves money, as there is no costly prototype phase needed to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and for many companies, still the current situation).

With the need to be more flexible in manufacturing to address individual customer orders without increasing the overhead of delivering these customer-specific solutions, there is a need for a configurable plant that can produce these individual products (batch size 1).

This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.

This is partly what Industry 4.0 is about: using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory into a dynamic, "smart" factory.
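Conceptually, the production twin turns a static routing into a function of the individual order. A toy sketch, with invented stations and rules, just to illustrate the principle:

    # Toy illustration of a configurable routing: the virtual plant model
    # proposes a production sequence per individual order (batch size 1).
    def plan_routing(order: dict) -> list:
        routing = ["MILL-1" if order["precision"] == "high" else "MILL-2"]
        if order.get("coated"):
            routing.append("COAT-1")
        routing.append("ASSY-1" if order["variant"] == "left" else "ASSY-2")
        return routing

    print(plan_routing({"precision": "high", "coated": True, "variant": "left"}))
    # -> ['MILL-1', 'COAT-1', 'ASSY-1']: derived from the order, not hard-coded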

In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner; therefore, manual rework is required. This is the area of Model-Based Definition, and I have written about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing

The business case for this type of digital twin, of course, is to be able to deliver customer-specific products with extremely competitive speed and at reduced cost compared to standard products. It could be your company's survival strategy. Although it is hard to predict the future, as we see from COVID-19, it remains crucial to anticipate the future instead of waiting.

The digital development twin

Before a product gets manufactured, there is a product development process. In the past, this was purely mechanical, with some electronic components. Nowadays, many companies are actually manufacturing systems, as the software controlling the product plays a significant role. In this context, the model-based systems engineering approach is the upcoming approach to defining and testing a system virtually before committing to the physical world.

Model-Based Systems Engineering can define a single complex product and perform all kinds of analysis on the system, even before a physical system is in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that a model-based systems engineering environment combined with modularity (do not confuse modularity with model-based) is a solid foundation for dealing with unique custom products. Solutions can be configured and validated against their requirements already during the engineering phase.
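To illustrate the principle – configuring a solution from modules and validating it against requirements before anything physical exists – here is a toy sketch, with invented modules and numbers:

    # Toy sketch: validate a configured solution against its requirements
    # during the engineering phase, long before a physical prototype exists.
    MODULES = {
        "pump_small": {"flow_l_min": 40, "power_w": 300},
        "pump_large": {"flow_l_min": 90, "power_w": 700},
        "controller": {"flow_l_min": 0, "power_w": 50},
    }

    def validate(config: list, requirements: dict) -> bool:
        total_power = sum(MODULES[m]["power_w"] for m in config)
        peak_flow = max(MODULES[m]["flow_l_min"] for m in config)
        return (peak_flow >= requirements["min_flow_l_min"]
                and total_power <= requirements["max_power_w"])

    print(validate(["pump_large", "controller"],
                   {"min_flow_l_min": 60, "max_power_w": 800}))  # True

A real MBSE environment does this with behavior models and simulations rather than simple sums, but the idea of early, virtual validation is the same.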

The business case for the digital development twin is easy to make: shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results, for sure, you need to change your ways of working and the tools you are using. So it won't be that easy!

For those interested in Industry 4.0 and the Model-Based Systems Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, there is a lot of attention on the Digital Twin and Model-Based approaches.

Three digital half-days with hopefully a lot to learn while keeping our feet on the ground. In particular, I am looking forward to Marc Halpern's keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True

Conclusion

It has been very noisy on the internet related to product features and technologies, probably due to COVID-19 and the disrupted interactions between all of us – vendors, implementers and companies trying to adjust their future. The Digital Twin concept is an excellent framing for a concept that everyone can relate to. Choose your business case and then look for the best matching twin.

I usually write a post after participating in a PLM conference. Last week, I participated in TECHNIA's PLM Innovation Forum, a 100% virtual event with over 1500 registered participants from 58 countries. These numbers show the power of a virtual conference during these difficult times. It is an excellent option for a sustainable future – less travel needed to be there.

The additional beauty of this event is that, although the live sessions are over, all the content will be available until May 31st. You can still join!

It was (and is) a well-organized and massive event with over 70 sessions, the majority pre-recorded. As you can imagine, 70 live sessions in two days would be too much to grasp. Today, the Friday after the event, I have been watching other sessions that have my interest, and it felt like another conference day.

TECHNIA, globally the largest Dassault Systèmes (DS) implementer after DS itself, as Jonas Geyer, TECHNIA's CEO, mentioned in his introduction speech, illustrated the breadth of their industry and technology skills, complementary to or based on the 3DEXPERIENCE platform.

TECHNIA was supported by Dassault Systèmes Execs and subject experts. In addition, a larger group of companies and interest groups supported the conference, even our humble PLM Green Alliance as you can see in the image above.

I followed the full two live days in real-time, meanwhile manning my virtual booth to chat with virtual visitors. To my surprise, the excitement during the conference felt like a physical conference – you get energized.

The positive point for me: no finger food or standing lunch, and decent coffee when needed. A point to enhance and learn from for this type of event is to make the booth a little more human – perhaps supported by video?

In the end, a great event, and if you are interested in the combined Dassault Systèmes/TECHNIA offering, supported by customer stories, take the chance until the end of May to register and browse the rich content.

 

Now I will share some of my picks from the live event. Another post will come based on my additional discoveries and networking discussions.

The B.CONNECT project

Fabien Hoefer and Philip Haller, both from B.Braun, a medical device and pharmaceutical company with a wide range of products, presented their massive PLM-project (approx. sixty persons involved). It was driven by the fact that every product has so much related data stored in different silos that it becomes impossible to have a correct understanding of its status and to maintain it over the product and service lifecycle of, on average, 10 – 15 years.

Their target is a real PLM-platform implementation connecting the people, the processes, the data, and the systems. Their aim is really the "connected" approach, a characteristic of a digital company.

As you can still watch the presentation, look at the following topics discussed:

  • focus on product archetypes instead of division (portfolio management)
  • data templates based on classification, global and specific data sets (data governance) – see the sketch after this list
  • the need to have a Master Data Management in place (data governance)
  • the unique product identifier (remember the FFF-discussion in my blog)
  • data-driven documentation (a perfect example of a digital PLM implementation)
  • platform strategy (one application for one capability in a heterogeneous systems environment)
  • Ownership of the PLM implementation at board level (it is not an engineering tool)
  • in the Q&A – the mix of waterfall & agile – the hybrid approach (as in the medical world the validation of the system is required – a point we missed in the SmarTeam FDA toolkit – validation of a system is needed when the system/processes change)
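The second bullet – data templates based on classification – is worth a small illustration. In a data-driven setup, the classification of an item determines which attribute sets apply. A simplified sketch, with invented classes and attributes (not B.Braun's actual templates):

    # Simplified sketch of a classification-driven data template:
    # the classification of an item determines which attribute sets apply.
    TEMPLATES = {
        "pump":   {"global": ["id", "name", "status"],
                   "specific": ["flow_rate_l_min", "pressure_class"]},
        "sensor": {"global": ["id", "name", "status"],
                   "specific": ["measuring_range", "accuracy_pct"]},
    }

    def template_for(classification: str) -> list:
        t = TEMPLATES[classification]
        return t["global"] + t["specific"]

    print(template_for("pump"))
    # -> ['id', 'name', 'status', 'flow_rate_l_min', 'pressure_class']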

In the Q&A session, it was clear that the big elephant in the room, the migration, has been identified, but there are no answers yet. See my presentation to understand the reference to the elephant. I am curious about B.Braun's approach, given my experience with PLM digital transformations. Will it be entirely digital or hybrid?

Looking forward to learning more from Fabien or Philip.

Business drivers for Sustainable Manufacturing

This session, presented by Hannes Lindfred from TECHNIA, was one of my favorite presentations,  as it links tightly to what we want to achieve with the PLM Green Alliance.

The subtitle of the presentation says it all: “How PLM can support Supply chain transparency, Circular economy, and System oriented product development”.

In a relaxed and entertaining manner, he explained the concepts and the needs of a circular economy, combined with examples from reality. In particular, I liked his closing statement linking the potential of digitization, modern PLM, and the circular economy. We have to learn to think and act circular. Highly recommended to watch!

Leading PLM Trends & Potential Disruptors

A PLM conference would not be a PLM conference without Peter Bilello from CIMdata speaking. We share a lot of insights related to digital transformation and the understanding that it requires the involvement of PLM. However, it is not the traditional PLM that is needed.

PLM needs to be rethought; think about the concept of a Product Innovation Platform. A digital platform is required if we want end-to-end digitalization; otherwise, we keep working in optimized silos.

Peter shared some survey results (see below) from early this year. They illustrate that most companies currently invest in traditional PDM aspects – restating the need for our PLM communities to learn, to educate, to rethink aspects of PLM, and to learn to communicate them.

Remarkably similar to some of the aspects I explained in my From Coordinated to Connected presentation: changing to data, a changing workforce, and changing processes – meaning systems thinking. Another plea for everyone to invest in learning. See his concluding remarks:

The closing Q&A session was interesting, addressing additive manufacturing, the graph database, and potential PLM disruptors coming from outside the traditional PLM space.

I recommend paying attention to the closing questions – so many good points that put PLM in perspective.

From Coordinated to Connected & Sustainable

Of course, I recommend you watch my presentation. It is one of the few opportunities to hear in a short time all the thoughts and concepts that I developed over the past 5 – 6 years. It saves you reading all my blog posts, which are less structured than this presentation.

I recommend watching this presentation in the context of Peter Bilello's presentation, as there are a lot of similarities, told in different words.

After my presentation, I appreciated the Q&A part, as it allowed me to point to some more related topics: legacy CAD issues – the incompatibility of past and future data – management vision, and the perception of ROI.

 

Professional PLM
Raise your standards and your horizons

An interesting presentation to watch, after seeing Peter Bilello's presentation and mine, is the one given by Roger Tempest. Roger is another veteran in the PLM world and co-founder of the PLM Interest Group. For many years, Roger has been striving to get the PLM professional recognized and certified. We both share the experience that being a PLM consultant is not a profession in which to become wealthy.

One of the reasons might be that the scope of PLM, and the skill level it requires, is not precisely defined: PLM is considered an engineering tool, and PLM has so many diverse definitions.

The challenge of Roger's approach is that it tries to capture people within a standardized PLM framework, which became apparent in the Q&A session. Currently, he is in the stage of building a steering group, "looking for companies that are fairly committed to PLM". So which companies are interested enough in PLM to commit time and resources to building a professional PLM body? Probably only academic people and PLM vendors/implementers – and the last group will most likely not agree on standardization.

The question about different industries and company maturity levels also came with an unsatisfactory answer. Roger talks about "absolute" PLM and sees no need to compare PLM across industries. Here I believe there is a fundamental difference in the meaning of PLM for traditional manufacturing companies as compared to high-tech/software-driven industries. I inserted here Marc Halpern's maturity/technology diagram, which I have been referencing in my presentation too.

The final question about vendors joining the PLM standardization group seems to be a utopia. As I expressed in my presentation, referring to Marc Halpern’s business maturity diagram, the vendors show us the vision of various business aspects related to PLM.

Marc already indicated this is the phase of the Product Innovation Platform.

As long as the professional PLM organization focuses on defining the standard, I foresee the outside world will move faster and be more diverse than a single PLM expert can handle – a typical issue with many other standards, as you can see below.

What’s Next

I hope to see and participate in more virtual PLM conferences, as they allow much larger audiences to connect compared to traditional conferences. However, there are things to improve, and therefore I want to propose some enhancements:

Make sure that during the "live" sessions there is the experience of "being live and connected". Even when streaming a pre-recorded lecture, always follow up immediately with a live Q&A session. I found the Q&A sessions very educational, as they clarify or put the presentation in a broader context.

The current virtual booth, being only a chat room, is too primitive – it reminded me of the early days of internet communication: discussion groups in ASCII-terminal mode through CompuServe (remember?). A booth could become a virtual meeting space of its own – all, of course, depending on the amount of bandwidth available. The feeling of "The Doctor is in".

It is great that the content is available for 30 days, and I agree there is a need for a time limit on the content; otherwise, the conference becomes more of a library. What I would like to see after the "live" days is still some kind of place for sharing: what are your favorite presentations, and why should others look at them?

 

Conclusion

A great event and learning experience for me. Virtual conferences are the future for sure, and I encourage others to develop this type of conference related to PLM further. It is a way to share knowledge and discuss topics in a sustainable manner. In the upcoming 30 days, I will come back to the conference one more time, based on interesting topics discovered or discussions related to the content.

Meanwhile, I encourage you too – if you are still in lockdown and if there is time to study – this is one of these unique opportunities.
