You are currently browsing the category archive for the ‘Education’ category.

My previous post, introducing the concept of connected platforms, generated positive feedback and some interesting questions. For example, this question from Maxime Gravel:

Thank you, Jos, for the great blog. Where do you see Change Management tool fit in this new Platform ecosystem?

is one of the questions I am trying to answer too. You can find my short reaction in the comments there. However, while discussing with other experts in the CM domain, we should paint the path forward, because if we cannot solve this type of question, the value of connected platforms becomes disputable.

It is essential to realize that a digital transformation in the PLM domain is challenging. No company or vendor has the perfect blueprint available to provide an end-to-end answer for a connected enterprise. In addition, I assume it will take 10 – 20 years before we are familiar with the concepts.

It took a generation to move from drawings to 3D CAD. It will take another generation to move from a document-driven, linear process to data-driven, real-time collaboration in an iterative manner. Perhaps we can move faster; as the Automotive, Aerospace & Defense, and Industrial Equipment industries are not the most innovative industries at this time, other industries or startups might lead us into the future faster.

Although I prefer discussing methodology, I believe I first need to clarify some technical points before moving into that area. My apologies for writing it in such a simple manner; this information should be accessible to the majority of readers.

What does data-driven mean?

I often mention a data-driven environment, but what do I mean precisely by that? For me, a data-driven environment means that all information is stored in datasets, each containing a single aspect of information in a standardized manner, so it becomes accessible to outside tools.

A document is not a dataset, as it often contains a collection of datasets. Most of the time, the information it exposes is not standardized in such a manner that a tool can read and interpret the exact content. We will see that a dataset needs an identifier, a classification, and a status:

  • An identifier, to be able to create connections between datasets – traceability or, in modern words, a digital thread.
  • A classification, as the classification will determine the type of information the dataset contains and potentially a set of mandatory attributes.
  • A status, to understand whether the dataset is stable or still in work.
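
To make this tangible, below is a minimal sketch in Python of what such a dataset could look like. All names are hypothetical, chosen only for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class Dataset:
        identifier: str      # unique ID - the anchor for digital-thread connections
        classification: str  # determines the information type and mandatory attributes
        status: str          # e.g., "in work", "released", "obsolete"
        attributes: dict = field(default_factory=dict)  # standardized content per classification

    # a released part dataset, connectable through its identifier
    bracket = Dataset("PRT-10345", "mechanical part", "released", {"material": "AlMg3"})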

Examples of a data-driven approach – the item

The most common dataset in the PLM world is probably the item (or part) in a Bill of Material. The identifier is the item number (ID + revision if revisions are used). Next, the classification will tell you the type of part it is.

Part classification can be a topic on its own, and every industry has its own taxonomy.

Finally, the status is used to identify if the dataset is shareable in the context of other information (released, in work, obsolete), allowing tools to expose only relevant information.

In a data-driven manner, a part can occur in several Bills of Material – an example of a single definition consumed in several places.

When the part information changes, the accountable person has to analyze the relations to the part, which is easy in a data-driven environment. It is normal to find this functionality in a PDM or ERP system.

When a part changes in a document-driven environment, the effort is much higher.

First, all documents where this part occurs need to be identified. Then the impact of the change needs to be managed in document versions, which will lead to other related changes if you want to keep the information correct.
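
To illustrate the difference, here is a simplified sketch in Python (hypothetical data, not a real PDM or ERP system) of a where-used analysis in a data-driven environment – a simple query over the BOM relations:

    # each BOM line refers to the part by its identifier - one definition, consumed in many places
    boms = {
        "BOM-PUMP-A": ["PRT-10345", "PRT-20001"],
        "BOM-PUMP-B": ["PRT-10345", "PRT-30002"],
    }

    def where_used(part_id: str) -> list[str]:
        """Return all BOMs in which the part occurs."""
        return [bom for bom, parts in boms.items() if part_id in parts]

    print(where_used("PRT-10345"))  # ['BOM-PUMP-A', 'BOM-PUMP-B']

In a document-driven environment, answering the same question means opening and scanning every document.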

Examples of a data-driven approach – the requirement

Another example illustrating the benefits of a data-driven approach is requirements management, where requirements become individual datasets. A product specification can often contain hundreds of requirements, addressing the needs of different stakeholders.

In addition, several combinations of requirements need to be handled by different disciplines – mechanical, electrical, software, quality and legal, for example.

As requirements need to be analyzed and ranked, a specification document would never be frozen. Trade-off analysis might lead to dropping or changing a single requirement. It is almost impossible to manage this all in a document, although many companies use Excel. The disadvantages of Excel are known, in particular in a dynamic environment.

The advantage of managing requirements as datasets is that they can be grouped. So, for example, they can be pushed to a supplier (as a specification).

Or requirements could be linked to test criteria and test cases, without the need to manage documents and make sure everyone works with the last updated version.

As you can see, requirements also need an identifier (to manage digital relations), a classification (to allow grouping) and a status (in work / released / dropped).
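
A simplified sketch in Python (hypothetical names, not a specific requirements tool) shows how grouping and linking become straightforward once requirements are datasets:

    requirements = [
        {"id": "REQ-001", "classification": "mechanical", "status": "released"},
        {"id": "REQ-002", "classification": "software",   "status": "in work"},
        {"id": "REQ-003", "classification": "mechanical", "status": "dropped"},
    ]

    # group the released mechanical requirements into a supplier specification
    supplier_spec = [r for r in requirements
                     if r["classification"] == "mechanical" and r["status"] == "released"]

    # link requirements to test cases by identifier instead of by document version
    test_cases = {"TC-100": ["REQ-001"], "TC-101": ["REQ-001", "REQ-002"]}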

Data-driven and Models – the 3D CAD model

3D PDF Model

When I launched my series related to the model-based approach in 2018, the first comments I got came from people who believed that model-based equals the usage of 3D CAD models – see Model-based – the confusion. 3D Models are indeed an essential part of a model-based infrastructure, as the 3D model provides an unambiguous definition of the physical product. Just look at how most vendors depict the aspects of a virtual product using 3D (wireframe) models.

Although we use a 3D representation at each product lifecycle stage, most companies do not have a digital continuity for the 3D representation. Design models are often too heavy for visualization and field services support. The connection between engineering and manufacturing is usually based on drawings instead of annotated models.

I wrote about modern PLM and Model-Based Definition, supported by Jennifer Herron from Action Engineering – read the post PLM and Model-Based Definition here.

If your company wants to master a data-driven approach, this is one of the most accessible learning areas. You will discover that connecting engineering and manufacturing requires new technology, new ways of working and much more coordination between stakeholders.

Implementing Model-Based Definition is not an easy process. However, it is probably one of the best steps to get your digital transformation moving. The benefits of connected information between engineering and manufacturing are discussed in the blog post PLM and Model-Based Definition.

It is essential to realize that all these exciting capabilities linked to Industry 4.0 require a data-driven, model-based connection between engineering and manufacturing.

If this is not the case, the projected game-changers will not occur as they become too costly.

Data-driven and mathematical models

To manage complexity, we have learned that we have to describe behavior in models to make logical decisions. This can be done in an abstract model, purely based on mathematical equations and relations. For example, if you look at climate models, weather models or COVID infection models, you see they all lead to discussions with so-called experts who believe a model should be 100% correct and that any exception shows the model is wrong.

It is not that the model is wrong; the expectations are false.

For less complex systems and products, we also use models in the engineering domain. For example, logical models and behavior models are all descriptive models that allow people to analyze the behavior of a product.

For example, how software code impacts the product’s behavior. Usually, we speak about systems when software is involved, as the software will interact with the outside world.

There can be many models related to a product, and if you want to get an impression, look at this page from the SEBoK wiki: Types of Models. The current challenge is to keep the relations between these models by sharing parameters.

The sharable parameters should then again be datasets in a data-driven environment. Using standardized notations, like SysML or UML, enables the objects used in the diagrams to become datasets.

I will not dive further into the modeling details as I want to remain at a high level.

It is essential to realize that digital models should connect to a data-driven infrastructure by sharing relevant datasets.
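
To make the idea of shared parameters concrete, here is a minimal sketch (hypothetical names): two models reference the same parameter dataset by its identifier, so a change in the parameter is immediately visible in every consuming model:

    # a parameter as a dataset: one definition, shared by several models
    mass_param = {"id": "PAR-MASS-01", "status": "released", "value": 12.5, "unit": "kg"}

    # both models reference the parameter by identifier instead of copying its value
    behavior_model   = {"id": "MDL-BEH-07", "parameters": ["PAR-MASS-01"]}
    simulation_model = {"id": "MDL-SIM-03", "parameters": ["PAR-MASS-01"]}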

What does data-driven imply?


I want to conclude this time with some statements to elaborate on further in upcoming posts and discussions:

  1. Data-driven does not imply there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner (see the sketch after this list). It is no longer about owning the data; it is about access to reliable data.
  2. Data-driven does not mean we do not need any documents anymore. Read "documents" here also as electronic files. Likely, document sets will still be the interface to non-connected entities, suppliers, and regulatory bodies. These document sets can be considered a configuration baseline.
  3. Data-driven means that we need to manage data in a much more granular manner. We have to look differently at data ownership. It becomes more about data accountability per role, as the data can be used and consumed throughout the product lifecycle.
  4. Data-driven means that you need an enterprise architecture, data governance and a master data management (MDM) approach. So far, the traditional PLM vendors have not been active in the MDM domain as they believe their proprietary data model is leading. Read also this interesting McKinsey article: How enterprise architects need to evolve to survive in a digital world.
  5. A model-based approach with connected datasets seems to be the way forward. Managing data in documents will become inefficient as documents cannot contribute to any digital accelerator, like applying algorithms. Artificial Intelligence relies on direct access to qualified data.
  6. I don't believe in Low-Code platforms that provide ad-hoc solutions on demand. The ultimate result after several years might again be a new type of spaghetti. On the other hand, standardized interfaces and protocols will probably deliver higher, long-term benefits. Remember: Low code: A promising trend or a Pandora's Box?
  7. Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them in a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics of software changes. An interesting starting point for discussion can be found here: Configuration management of industrial products in PDM/PLM.
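
As an illustration of point 1, connected datasets in a federated landscape can be as simple as typed links between identifiers living in different systems. A sketch with hypothetical names, not a product:

    # each dataset stays in its own system; the digital thread is the set of typed links
    links = [
        {"from": "PLM:PRT-10345", "type": "defined_by",  "to": "CAD:MODEL-889"},
        {"from": "PLM:PRT-10345", "type": "produced_as", "to": "ERP:MAT-556677"},
        {"from": "PLM:REQ-001",   "type": "verified_by", "to": "ALM:TC-100"},
    ]

    def trace(node: str) -> list[dict]:
        """Follow the digital thread from one dataset - access instead of ownership."""
        return [link for link in links if node in (link["from"], link["to"])]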


Conclusion

Again, a long post, slowly moving into the future with many questions and points to discuss. Each of the seven points above could be a topic for another blog post, a further discussion and debate.

After my summer holiday break in August, I will follow up. I hope you will join me in this journey by commenting and contributing with your experiences and knowledge.


One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.

Last year's fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings: sessions related to the Multiview BOM research, Global Collaboration, and several aspects of Model-Based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

These are all topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.

On the second day, we looked in addition at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata's President & CEO, kicked off the conference by providing CIMdata's observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and the followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter’s final slide – we have to actively rethink and reshape PLM – not by calling it different but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform from the US Army. Raj walked us through some of the US Army's challenges, and he gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and be aware of the competition.

Where we would say "data is the new oil", Raj Iyer said: "Data is the ammunition of the future fight – as fights will more and more take place in cyberspace."

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army's current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw's (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, explaining that we live in a world of HTTP(S) linking, he showed that we are creating new ways of connecting information. The link becomes an essential artifact in our information model.

While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they are having a debate with one of their customers on whether to replace or enhance the existing PLM tools. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, had already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark's slide below says it all.

Mark stressed the importance of managing model information in context, which has become a challenge.

Mark mentioned that 20 years ago, the IDC (International Data Corporation) measured Boeing's performance and estimated that each employee spent 2 ½ hours per day searching for information. In 2018, the IDC estimated that this number had grown to 30% of the employee's time and could go up to 50% when adding the effort of reusing and duplicating data.

The consequence would be that a full-service enterprise, having engineering, manufacturing and services connected, probably loses 70% of its information because people cannot find it – an impressive number, asking for "clever" ways to find the correct information in context.

It is not just about a fully indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending up with the recommendation to use the ISO 10303-243 – MoSSEC standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context and is meant to manage and connect the relationships between models.
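
In spirit (a generic sketch of my own, not the actual MoSSEC schema), such standardized metadata describes each model and its relationships so that it can be found and interpreted in context:

    # illustrative metadata record for a simulation model
    model_record = {
        "id": "SIM-4711",
        "discipline": "structural analysis",
        "describes": "PRT-10345",        # the product element the model relates to
        "derived_from": ["MDL-BEH-07"],  # relationships to other models
        "maturity": "verified",
    }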

MoSSEC and its implications for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM vendors and tools will support and enable the standard for future interoperability and collaboration.

Additive Manufacturing
– not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the topic of new technologies: Additive Manufacturing or, in common language, 3D printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases once the first excitement of the hype is over.

Material knowledge is one of the important topics to study when applying Additive Manufacturing. Understanding material behaviors and properties in an Additive Manufacturing process is probably a new area for most companies.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company's management has realistic expectations and the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. The middle part, related to the data management concepts needed for a digital enterprise, contained in my opinion the most exciting topics to follow up on.

Next week, I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).


Last Friday, we discussed the book "How to Avoid a Climate Disaster" by Bill Gates with several members of the PLM Global Green Alliance. I was happy to moderate the session between Klaus Brettschneider, Rich McFall, Lionel Grealou, Ilan Madjar and Patrick Hillberg. From their LinkedIn profiles, you can see we are all active in the domain of PLM, and they had all read the book before the discussion.

I think the book addresses climate change in a tangible manner. Bill Gates brings structure into addressing climate change and encourages you to become active, both as an individual and as a citizen. My only comment on the book would be that, as a typical nerd, Bill Gates pays little attention to human behavior: people's emotions can lead to non-logical behavior.

When you browse through the book's reviews, for example, on Goodreads, you see the extremes, with ratings from 1 to 5. Some people believe that Bill Gates, due to his wealth and way of living, is not allowed to write this book. Others like the transparent and pragmatic approach to discussing the related themes in the book.

Our perspective

Klaus, Rich, Lio, Ilan and Patrick did not have extreme points of view – so don't watch the recording if you are looking for anxiety. They reviewed How to Avoid a Climate Disaster from their perspective and discussed how it could be relevant for PLM practitioners. It became a well-balanced dialogue. You can watch or listen to the recording following this link:

Book discussion: How to avoid a climate disaster written by Bill Gates

Note: we will consolidate all content on our PLMGreenAlliances website to ensure nothing is lost – feel free to comment/discuss further.

More on sustainability

If you want to learn more about all sorts of disruption, not only disruption caused by climate change, have a look at the upcoming conference this week: DISRUPTION—the PLM Professionals’ Exploration of Emerging Technologies that Will Reshape the PLM Value Equation.

My contribution will be on day 2, where I combine disruptive technology with the need to become really sustainable in our businesses.

It will be a call for action for our PLM community. In the coming nine years, we have to change our business, become sustainable and use the relevant new technologies. This requires systems thinking – will mankind be able to deal with so many different parameters?

Conclusion

Start the dialogue with us, the PLM Global Green Alliance, by watching and reading content from the website, or become an active member participating in discussion sessions related to any topic relevant to our alliance. More to come at the end of May – will you join us?


Regularly, (young) individuals approach me looking for advice to start or boost their PLM career. It is one of the questions "The PLM Doctor is IN" could quickly answer.

Before going further on this topic, there is also the observation that many outspoken PLM experts are "old." Meanwhile, all kinds of new disruptive technologies are coming up.

Can these old guys still follow and advise on all trends/hypes?

My consultant’s answer is: “Yes and No” or “It depends”.

The answer illustrates the typical nature of a consultant. It is almost impossible to give a binary answer; still, many of my clients are looking for binary answers. Generalizing further, you could claim: “Human beings like binary answers”, and then you understand what is happening now in the world.

The challenge for everyone in the PLM domain is to keep an open mindset and avoid becoming binary. Staying non-binary means spending time to digest what you see, what you read or what you hear. Always ask yourself the question: is it really that simple? Try to imagine how the content you read fits in the famous paradigm: People, Processes and Tools. You should consider all these aspects.

Learning by reading

I was positively surprised by Helena Gutierrez's post on LinkedIn: The 8 Best PLM blogs to follow. First of all, by Helena's endorsement, explaining the value of having non-academic PLM information available as a foundation for her learning in PLM.

And indeed, perhaps I should have written a book about PLM. However, it would be a book about the past. Currently, PLM is not stable; we are learning every day to use new technologies and new ways of working. For example, the impact and meaning of model-based enterprise.

However, the big positive surprise came from the number of likes within a few days, showing how valuable this information is for many others on their PLM journey. I am aware there are more great blogs out in the field, sometimes with the disadvantage that they are not in English and therefore have a limited audience.

Readers of this post, look at the list of 8 PLM blogs and add your recommended blog(s) in the comments.

Learning by reading (non-binary) is a first step in becoming or staying up to date.

Learning by listening

General PLM conferences have been an excellent way to listen to other people’s experiences in the past. Depending on the type of conference, you would be able to narrow your learning scope.

This week I started my preparation for the upcoming PLM Roadmap and PDT conference. Here various speakers will provide their insight related to “disruption,” all in the context of disruptive technologies for PLM.

Good news: people and business aspects will also be part of the conference.

Click on the image for the agenda and registration

In my presentation, titled DISRUPTION – EXTINCTION or still EVOLUTION?, I will address all these aspects. We have entered a decisive decade to prove we can disrupt our old habits to save the planet for future generations.

It is challenging for a virtual conference to be as interactive as a physical one; it is mainly a conference to get inspired or guided in your thinking about new PLM technologies and potential disruption.

Learning by listening and storing the content in your brain is the second step in becoming or staying up to date.

Learning by discussing

One of the best learnings comes from having honest discussions with other people who all have different backgrounds. To be part of such a discussion, you need to have at least some basic knowledge about the topic. This avoids social media-like discussions where millions of “experts” have an opinion behind the keyboard. (The Dunning-Kruger effect)

There are two upcoming discussions I want to highlight here.

1. Book review: How to Avoid a Climate Disaster.

On Thursday, May 13th, I will moderate a PLM Global Green Alliance panel discussion on Zoom to discuss Bill Gates' book "How to Avoid a Climate Disaster". As you can imagine, Bill Gates is not known as a climate expert, more as a philanthropist and technology geek. However, the reviews are good.

What can we learn from the book as relevant for our PLM Global Green Alliance?

If you want to participate, read all the details on our PGGA website.

The PGGA core team members, Klaus Brettschneider, Lionel Grealou, Richard McFall, Ilan Madjar and Hannes Lindfred, have read the book.


2. The Modular Way Questions & Answers

In my post PLM and Modularity, I announced the option for readers of "The Modular Way" to ask the authors (Björn Eriksson & Daniel Strandhammar) questions or provide feedback on the book together with a small audience. This session is also planned to take place in May and will be scheduled based on the participants' availability. At this moment, there are still a few open places. Therefore, if you have read the book and want to participate, send an email to tacit@planet.nl or info@brickstrategy.com.

Learning by discussing is the best way to enrich your skills, particularly if you have active listening skills – crucial for a good discussion.


Conclusion

No matter where you are in your career in the world of PLM, learning never stops. Twenty years of experience have no value if you haven't seen the impact of digitalization coming. Make sure you learn by reading, by listening and by discussing.

For a year now, we have been used to virtual events. PI PLMx 2020 in London was my last real event where I met people. Rereading my post about this event (The weekend after PI PLMx), I noticed that it was not a technology festival; many presentations were about business change and how to engage people in an organization.

The networking discussions during the event and evenings were the most valuable parts of the conference.

And then came COVID-19. ☹

Shortly after, in April 2020, I participated in the TECHNIA Innovation Forum, the first virtual conference with a setup like a physical conference: a main stage with live sessions, virtual booths, and many prerecorded sessions related to various PLM topics.

You can read my experiences related to the conference in two posts: The weekend after PLMIF and My four picks from PLMIF. A lot of content was available for 30 days. However, I was missing the social interaction, the people.

My favourite conference of 2020 was the CIMdata PLM Roadmap / PDT Fall 2020 conference in November. The PLM Roadmap/PDT conferences are not conferences for a novice audience; you have to be skilled in the domain of PLM, and there is a strong presence from Aerospace and Defense companies.

The Fall 2020 theme: “Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality” might sound like a marketing term.

We hear the words Digital Thread and Digital Twin so many times. However, this conference featured speakers who are active practitioners from the field. I wrote about this conference in two posts: The weekend after PLM Roadmap / PDT 2020 – Part 1 and Part 2. I enjoyed the conference; however, I was missing the social interaction.

The Digital Twin

Beyond the marketing hype, there is still a lot to learn and discuss from each other. First of all, it is not about realizing a digital twin; a business need should be the driver to investigate the possibility of a digital twin.

I am preparing a longer blog post on this topic to share learnings from people in the field. For example, in November 2020, I participated in a Digital Twin Conference in the Netherlands, focusing on real-life cases.

Companies shared their visions and successes. It was clear that we are all learning to solve pieces of the big puzzle; there are small successes. Without marketing language, this type of event becomes extremely helpful for further discussion and follow-up.

Recently, I enjoyed the panel discussions during the PI DX Spotlight session: Digital Twin-Driven Design. The PI DX Spotlight sessions are a collection of deep dives in various themes – have a look for the upcoming schedule here.

In the Digital Twin-Driven Design session, I enjoyed the session: What does a Digital Twin mean to your Business and Defining Requirements?

The discussion was moderated by Peter Bilello, with three interesting panellists with different industrial backgrounds. (Click on the image for the details). I have to re-watch some of the Spotlight sessions (the beauty of a virtual event) to see how they fit in the planned Digital Twin post.


The Cenit/Keonys Innovation day

On March 23rd (this Tuesday), Cenit & Keonys launch their virtual Innovation Day, another event that, before COVID-19, would have been a real people event. I mention this event in particular, as I was allowed to interview fifteen of their customers about their day-to-day work, PLM-related plans, and activities.

All these interviews have been recorded and processed in such a manner that within 5 to 8 minutes, you get an understanding of what people are doing.

To prepare for these interviews, I spoke with each of them before the interview. I wanted to understand the passion for their work and where our interests overlap.

I will not mention the individual interviews in this post, as I do not want to spoil the event. I talked with various startups (do they need PLM?) and established companies that have started a PLM journey. I spoke with simulation experts (the future) and dimensional management experts (listen to these interviews to understand what it means). And ultimately, I interviewed a traditional family porcelain brand using 3D printing and 3D design, and at the other end, the German CIO of the Year 2020.

(if you Google a little, you will easily find the companies involved here)

The most common topics discussed were:

  • What was the business value of your PLM-related activity?
  • Did COVID-19 impact your business?
  • What about a cloud-based solution, and how do people align?
  • If relevant, what are your experiences with a Model-Based Definition?
  • What about sustainability?

I hope you will take the opportunity to register and watch these interviews as, for me, they were an excellent opportunity to be in touch with the reality in the field. As always, we keep on learning.

The Modular Way

Talking about learning. This week, I finished the book The Modular Way, written by Bjorn Eriksson & Daniel Strandhammar.  During the lockdown last year, Bjorn & Daniel, founders of the Brick Strategy, decided to write down their experiences with mainly Scandinavian companies into a coherent framework to achieve modularization.

Modularity is a popular topic in many board meetings. How often have you heard: "We want to move from Engineering To Order to more Configure To Order"? Or another related incentive: "We need to be cleverer with our product offering and reduce the number of different parts".

Next, the company buys a product that supports modularity, and management believes the work has been done. Of course not. Modularity requires a thoughtful strategy.

Illustration from the book: The Modular Way

The book can be a catalyst for companies that want to invest in modularity but do not know where and how to start. The book is not written academically; it is more a story taking you along the steps needed to define, implement, and maintain modularity. Every step is illustrated by actual cases, their business motivation and, where possible, the achieved benefits. I plan to come back, together with Bjorn and Daniel, in a dedicated post related to PLM and Modularity.

Conclusion

Virtual Events are probably part of our new future. A significant advantage is the global reach of such events. Everyone can join from anywhere connected around the world. Besides the larger events, I look forward to discovering more small and targeted discussion events like PI DX Spotlights. The main challenge for all – keep it interactive and social.

Let us know your favourite virtual event !!

After "The Doctor is IN," now again a written post in the category of PLM and complementary practices/domains. In January, I discussed the complementary value of PLM and CLM (Configuration Lifecycle Management) together with Henrik Hulgaard from Configit. For me, CLM is a synonym for Product Configuration Management.

PLM and Complementary Practices (feedback)

As expected, readers were asking the question:

"What is the difference between CLM (Configuration Lifecycle Management) and CM (Configuration Management)?"

Good question.

As the complementary role of CM is also a part of the topics to discuss, I am happy to share this blog today with Martijn Dullaart. You probably know Martijn if you are actively following topics on PLM and CM.

Martijn has his own blog mdux.net, and you might have seen him recently in Jenifer Moore’s PLM TV-episode: Why CM2 for Faster Change and Better Documentation. Martijn is the Lead Architect for Enterprise Configuration Management at ASML (Our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress. Let us start.

Configuration Management and CM2

Martijn, first of all, can you bring some clarity into the terminology? When discussing Configuration Management, what is the pure definition, what is CM2 as a practice, and what is IpX's role? And please explain where you fit in this picture.

Classical CM focuses mainly on the product, the product definition, and actual configurations like as-built and as-maintained of the product. CM2 extends the focus to the entire enterprise, e.g., the processes and procedures (ways of working) of a company, including the IT and facilities, to support the company’s value stream.

CM2 expands the scope to all information that could impact safety, security, quality, schedule, cost, profit, the environment, corporate reputation, or brand recognition.

Basically, CM2 shifts the focus to Integrated Process Excellence and promotes continual improvement.

Next to this, CM2 provides the WHAT and the HOW, something most standards lack. My main focus is still on the product, while promoting the use of CM outside the product domain.

For all CM-related documentation, we are already doing this.

Configuration Management and PLM

People claim that if you implement PLM as an enterprise backbone, not as an engineering tool, you can do Configuration Management with your PLM environment.

What is your opinion?

Yes, I think this is possible, provided that the PLM tool has the right capabilities. Though the question should be: is this the best way to go about it? For instance, some parts of Configuration Management are more transaction-oriented, e.g., registering the parts you build into or out of a product.

Other parts of CM are more iterative in nature, e.g., doing impact analysis and making an implementation plan. I am not saying this cannot be done in a PLM tool as an enterprise backbone. Still, the nature of most PLM tools is to support iterative types of work rather than a transactional type of work.

I think you need some kind of enterprise backbone that manages the configuration as an As-Planned/As-Released baseline. A baseline that shows not only the released information but also all planned changes to the configuration.

Because the sources of information in such a baseline come from different tools, you need an overarching tool to connect everything. For most companies, this means that they require an overarching system on top of their current set of enterprise applications.

Preferably, I would like to use the data directly from the sources. Still, connectivity and performance are not yet at a level where we can do this. Cloud and modern application and database architectures are very promising in this respect.


Configuration Management for Everybody?

I can imagine companies in the Aerospace industry need to have proper configuration management for safety reasons. Also, I can imagine that proper configuration management can be relevant for other industries. Do they need to be regulated, or are there other reasons for a company to start implementing CM processes?

I will focus the first part of my answer within the context of CM for products only.

Basically, all products are regulated to some degree. Aerospace & Defense and Medical Device and Pharma are highly regulated for obvious reasons. Other industries are also regulated, for example, through environmental regulations like REACH, RoHS, WEEE or safety-related regulations like the CE marking or FCC marking.

Customers can also be an essential driver for the need for CM. If, as a customer, you buy expensive equipment, you expect that the supplier of that equipment can deliver per commitment. The supplier can also maintain and upgrade the equipment efficiently with as few disruptions to your operations as possible.

Not just customers but also consumers are critical about the traceability of the product and all its components.

Even if you are sitting on a rollercoaster, you presume the product is well designed and maintained. In other words, there is often a case to be made to apply proper configuration management in any company. Still, the extent to which you need to implement it may vary based on your needs.


The need for Enterprise Configuration Management is even more significant because one of the hardest things is to change the way an organization works and operates.

Often, there are different ways of doing the same thing. There is a lot of tribal knowledge, and ways of working are not documented in a way that people can easily find them, let alone structured and linked so that you can do an impact analysis when you want to introduce a change in your organization.


CM and Digital Transformation

One of the topics that we both try to understand better is how CM will evolve in the future when moving to a more model-based approach. In the CM-terminology, we still talk about documents as information objects to be managed. What is your idea of CM and a model-based future?

It is indeed a topic where new or changed methodology is probably required, and I have already started describing CM topics in several posts on my MDUX blog. Some of the relevant posts in this context are:

First, let me say that model-based is the future, although, at the same time, the CM aspects are often overlooked.

When managing changes, too much detail makes estimating the cost and effort for a business case more challenging, and planning information that is too granular is not desirable. Therefore, CM2 looks at datasets. Datasets should be as small as possible but not smaller. Datasets are sets of information that need to be released as a whole, yet they can be released independently from other datasets. For example, in a bill of materials, a single BOM line item is not a dataset, but the complete set of BOM line items that make up the BOM of an assembly is considered a dataset. I can release a BOM independently from a test plan.
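
A small illustration of this granularity (my own sketch, not CM2 notation): the BOM of an assembly is one dataset, the test plan another, and each can be released on its own:

    # two datasets of the same assembly, each released as a whole but independently
    bom_dataset = {
        "id": "BOM-ASSY-100", "status": "released",
        "lines": ["PRT-10345", "PRT-20001", "PRT-30002"],  # the complete set of BOM lines
    }
    test_plan_dataset = {"id": "TP-ASSY-100", "status": "in work"}  # still being edited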

Data models need to facilitate this. However, today, in many PLM systems, the BOM and the metadata of a part share the same revision. This means that to change the metadata, I need to revise the BOM, while the BOM itself might not change. Some changes to metadata might not be relevant for a supplier, and communicating such changes to your supplier could create confusion.
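
A sketch of what Martijn describes (a hypothetical data model, not a specific PLM system): give the part metadata and the BOM their own revisions, so one can change without forcing a revision of the other:

    # each dataset of the part carries its own revision
    part = {
        "id": "PRT-10345",
        "metadata": {"revision": "C", "weight_kg": 1.20},
        "bom":      {"revision": "A", "lines": ["PRT-20001", "PRT-30002"]},
    }

    # a metadata-only change bumps only the metadata revision;
    # the supplier, who consumes the BOM dataset, sees no irrelevant change
    part["metadata"]["weight_kg"] = 1.25
    part["metadata"]["revision"] = "D"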

I know some people think this is about document- vs. model-centric, but it is not. A part is identified in the 'physical world' by its part ID. Even if you allow revisions in the supply chain, by including the revision in the part ID you create a new identifier, and every new revision will end up in a different stock location. Is that what we want?

In any case, we are still in the early days, and the thinking about this topic has just begun and needs to take shape in the coming year(s).


CM and/or CLM?

As in my shared blog post with Henrik Hulgaard related to CLM, can you make a clear differentiation between the two domains for the readers?


Configuration Lifecycle Management (CLM)  is mainly positioned towards Configurable Products and the configurable level of the product.


I think this even though Configit's CLM declaration states that "Configuration Lifecycle Management (CLM) is the management of all product configuration definitions and configurations across all involved business processes applied throughout the lifecycle of a product." It also states:

  • "CLM differs from other Enterprise Business Disciplines because it focuses on cross-functional use of configurable products."
  • "Provides a Single Source of Truth for Configurable Data."
  • "Handles the ever-increasing complexity of Configurable Products."

I find that Configuration Lifecycle Management is a core Configuration Management practice you need to have in place for configurable products. The dependencies you need to manage are enormously complex: software parameters that depend on specific hardware, hardware-to-hardware dependencies, commercial variants, and options.

Want to learn more?

In this post, we have just touched the surface of PLM and Configuration Management. Where can an interested reader find more information related to CM for their company?


To become trained in CM2, people can reach out to the Institute for Process Excellence, a company that focuses on consultancy and methodology for many aspects of a modern, digital enterprise, including Configuration Management.

And there is more out there, e.g.:

Conclusion

Thanks, Martijn, for your clear explanations. People working seriously in the PLM domain managing the full product lifecycle should also learn and consider Configuration Management best practices. I look forward to a future discussion on how to perform Configuration Management in a model-based environment.

PLM, CLM, and CM – mind the overlap


First of all, thank you for the overwhelming response to the survey I promoted last week: PLM 2021 – your goals? It gave me enough inspiration and content to fill the upcoming months.

The first question of the survey dealt with complementary practices or systems related to a traditional PLM infrastructure.

As you can see, most of you are curious about Digital Twin management: 68% (it is hype). Second best are Configuration Management, Product Configuration Management and Supplier Collaboration Management, all with 58% of the votes. Click on the image to see the details. Note: you could vote for more than one topic.

Product Configuration Management

Therefore, I am happy to share this blog space with Configit’s CTO, Henrik Hulgaard. Configit is a company specialized in Product Configuration Management, or as they call it, Configuration Lifecycle Management (CLM).

Recently, Henrik wrote an interesting article on LinkedIn: How to achieve End-To-End Configuration, addressing a question that I have heard several times from my clients: how to align the selling and delivery of configurable products, including sales, engineering and manufacturing?

Configit – the company / the mission

Henrik, thanks for helping me explain the complementary value of end-to-end Product Configuration Management to traditional PLM systems. First of all, can you give a short introduction to Configit as a company and the unique value you offer to your clients?

Hi Jos, thank you for having me. Configit has worked on configuration challenges for the last 20 years. We are approximately 200 people, have offices in Denmark, Germany, India, and the US (Atlanta and Detroit), and work with some of the world's largest manufacturing companies.

We are founded on a patented technology called Virtual Tabulation. The YouTube movie below explains the term Virtual Tabulation.

Virtual Tabulation compiles EVERY possible configuration scenario and then compresses that data into a very small file so that it can be used by everyone in your team.

Virtual Tabulation enables important capabilities such as:

  • Consolidation of all configuration data, both Engineering and Sales related, into a single source of truth.
  • Effortless maintenance of complicated rule data.
  • A fast and error-free configuration engine that provides perfect guidance to the customer across multiple platforms and channels.

As the only vendor, Configit provides a configuration platform that fully supports end-to-end configuration processes, from early design and engineering, through sales and manufacturing, to the support and service of configurable products.

This is what we understand by Configuration Lifecycle Management (CLM).
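
To give a feel for the problem (a toy sketch of my own, in Python, unrelated to how Configit's Virtual Tabulation is actually implemented), a configurator must only offer combinations that satisfy all rules, which is why pre-computing the valid configuration space is so powerful:

    from itertools import product

    features = {"engine": ["petrol", "electric"], "trailer_hitch": [True, False]}

    def valid(cfg: dict) -> bool:
        # example rule: the electric engine cannot be combined with a trailer hitch
        return not (cfg["engine"] == "electric" and cfg["trailer_hitch"])

    # pre-compute all valid configurations - what a compiled configuration table enables
    valid_configs = [dict(zip(features, combo))
                     for combo in product(*features.values())
                     if valid(dict(zip(features, combo)))]
    print(len(valid_configs))  # 3 of the 4 combinations are valid

In a real product, the feature space is far too large to enumerate naively; that is exactly the complexity such a platform has to handle.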

Why Configuration Lifecycle Management?

You have introduced the term Configuration Lifecycle Management – another TLA (Three Letter Acronym), and easy to pronounce. However, why would a company be interested in implementing Configuration Lifecycle Management (CLM)?

CLM is a way to break down the siloed systems traditionally found in manufacturing companies, where products are defined in a PLM system, sold using a CRM/CPQ system, manufactured using an ERP system and serviced by typically ad-hoc and home-grown systems. A CLM system feeds these existing systems with an aligned and consistent view of which variants of a configurable product are available.

Organizations obtain several benefits when they align across functions on the product variants they offer:

  • Engineering: faster time-to-market, optimized variability, and the assurance to only engineer products that are sold
  • Sales: reducing errors, making sure that what gets quoted is accurate, and reducing the time to close the deal. The configurator provides current, up-to-date, and accurate information.
  • Manufacturing: reducing errors and production stoppages due to mis-builds
  • Service: accurate information about the product’s configuration. The service technician knows precisely what capabilities to expect on the particular product to be serviced.

For example, one of our customers experienced a 95% reduction – from a year to two weeks – in the time it took them to create the configuration models needed to build and sell their products. This reduction meant a significantly shorter time to market and allowed additional product lines to be introduced.

CLM for everybody?

I can imagine that companies with products organized for mass production still want the mindset of being as flexible as possible on the sales side. What type of companies would benefit the most from a CLM approach?

Any company that offers customized or configurable products or services will need to ensure that what is engineered is aligned with what is sold and serviced. Our customers typically have relatively high complexity with hundreds to thousands of configuration parameters.

CLM is not just for automotive companies that have high volume and high complexity. Many of our customers are in industrial components and machinery, offering complex systems and services. A couple of examples:

Philips Healthcare sells advanced scanners to hospitals and uses CLM to ensure that what is sold is aligned with what can be offered. They would also like to move to selling scanners as a service, where the hospital may pay per MR scan.

Thyssenkrupp Elevators sell elevators that are highly customizable based on the needs and environment. The engineering rules start in the CAD environment. They are combined with commercial rules to provide guidance to the customer about valid options.

CLM and Digital Transformation

For me, CLM is an excellent example of what modern, digital enterprises need to do: having product data available along the whole lifecycle to make real-time decisions. CLM is a connecting layer that allows companies to break the silos between marketing, sales, engineering and operations. The C-level gets excited by that idea, as they can see the business value.

Now, what would you recommend to realize this idea?

  • The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities.

This requires that an organization establishes a common feature "language" (sometimes called a master feature dictionary) that is shared across the different functions – see the sketch after these steps.

As the feature codes are essential in the communication between the functions, the creation and updating of the feature language must be carefully managed by putting people and processes in place to manage them.

  • The next step is typically to make information about valid configurations available in a central place, sometimes referred to as the single source of truth for configuration.

We offer services to expose this information and integrate it into existing enterprise systems such as PLM, ERP and CRM/CPQ. The configuration models may still be maintained in legacy systems but are imported and brought together in the CLM system.

Once the consuming systems all share a single configuration engine, the organization may move on to improve the rule authoring and replace the existing legacy rule-authoring applications found in PLM and ERP systems with more modern applications such as Configit Ace.
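
A minimal sketch (hypothetical names) of such a master feature dictionary: the shared feature codes, not part numbers, form the communication layer between the functions:

    # one shared feature "language" across sales, engineering and manufacturing
    feature_dictionary = {
        "ENG-EL": {"description": "Electric engine", "owner": "Engineering"},
        "TOW-01": {"description": "Trailer hitch",   "owner": "Sales"},
    }

    # each function maps the shared feature codes to its own data
    engineering_view = {"ENG-EL": ["PRT-55001", "PRT-55002"]}  # parts implementing the feature
    sales_view       = {"ENG-EL": {"price": 4500, "market": "EU"}}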

Customer Example: Connecting Sales, R&D and ERP

As can be seen from above, these steps all go across the functional silos. Thus, it is essential that the CLM journey has top-level management support, typically from the CIO.

COVID-19?

Related to COVID-19, I believe companies have realized that they have to reconsider their supply chains to limit dependencies on critical suppliers. Is this an area where Configit would contribute too?

The digital transformation that many manufacturing companies have worked on for years clearly has been accelerated by the COVID-19 situation, and indeed they might now start to encode information about the critical suppliers in the rules.

We saw this happen in 2011 with the tsunami in Japan, when suddenly suppliers could not provide certain parts anymore. The organization then has to quickly adapt the rules so that the options requiring those parts are no longer available to order.

Therefore, the CLM vision also includes suppliers, as configuration knowledge has to be shared across organizations to ensure that what is ordered can also be delivered.

Learning more?

It is clear that CLM is a complementary layer to a standard PLM infrastructure, and complementary to CRM and ERP: a great example of what is possible in a modern, digital enterprise. Where can readers find more information?

Configit offers several resources on Configuration Lifecycle Management on our website, including our blog,  webinars and YouTube videos, e.g., Tech Chat on Manufacturing and Configuration Lifecycle Management (CLM)

Besides these continuously growing resources, there is the whitepaper "Accelerating Digital Transformation in Manufacturing with Configuration Lifecycle Management (CLM)", available here among other whitepapers.

What I have learned

  • Configuration Lifecycle Management is relevant for companies that want to streamline their business functions, i.e., sales, engineering, manufacturing, and service. CLM will reduce the number of iterations in the process, reduce costly fixing when trying to align to customer demands, and ultimately create more service offerings by knowing customers' existing configurations.
  • The technology to implement CLM is there. Configit has shown in various industries that it is possible. It is an example of adding value on top of a digital information infrastructure (CRM, PLM, and ERP).
  • The challenge will be aligning the different functions to agree on one standard configuration authority. Therefore, responsibility should lie at the top level of an organization, likely with the modern CIO or CDO.
  • I was glad to learn that Henrik stated:

    “The first step is to move away from talking about parts and instead talk about features when communicating about product capabilities”.

    A topic I will discuss soon when talking about Product & Portfolio Management with PLM.

Conclusion

It was a pleasure to work with Configit, in particular, Henrik Hulgaard, learning more about Configuration Lifecycle Management or whatever you may name it. More important, I hope you find this post insightful for your understanding if and where it applies to your business.

Always feel free to ask more questions related to the complementary value of PLM and Product Configuration Management (CLM).

It is 2021, and after two weeks of time-out and reflection, it is time to look forward. Many people have said that 2020 was a "lost year" and that they are looking forward to a fresh restart, back to the new normal. For me, 2020 was the contrary of a lost year: it was a year where I had to change my ways of working. Communication has changed, digitization has progressed, and new trends have become apparent.

If you are interested in some of the details, watch the conversation I had with Rob Ferrone from QuickRelease, just before Christmas: Two Santas looking back to 2020.

It was an experiment with video, and you can see there is a lot for me to learn. I agree with Ilan Madjar's comment that it is hard to watch two people talking for 20 minutes. I prefer written text that I can read at my own pace, short videos (max 5 min), or long podcasts that I can listen to when cycling or walking around.

So let me share with you some of the plans I have for 2021, and I am eager to learn from you where we can align.

PLM understanding

I plan a series of blog posts in which I want to share PLM-related topics that are not necessarily directly implemented in a PLM system or considered in PLM implementations, as they require inputs from multiple sources. Topics in this context are: Configuration Management, Product Configuration Management, Product Information Management, Supplier Collaboration Management, Digital Twin Management, and probably more.

For each of these posts, I will discuss the topic with a subject matter expert, potentially a vendor or a consultant in that specific domain, and explore the complementary role to traditional PLM. Besides a blog post, a topic might also be explained more in-depth in a podcast.

The PLM Doctor is in

Most of you might know Lucy from the Charlie Brown cartoon as the doctor giving advice for 5¢. As an experiment, I want to set up a similar approach, however, for free.

These are my conditions:

  • Only one question at a time.
  • The question and answer will be published in a 2-3 minute video.
  • The question is about solving a pain.

If you have such a question related to PLM, please contact me through a personal message on LinkedIn, and I will follow up.

PLM and Sustainability

A year ago, I started the PLM Green Global Alliance together with Rich McFall. Our purpose is to bring together people who want to learn and share PLM-related practices, solutions, and ideas contributing to a greener and more sustainable planet.

We do not want to compete or overlap with larger global or local organizations, like the Ellen MacArthur Foundation or the European Green Deal.

We want to bring people together to dive into the niche of PLM and its related practices. We announced the group on LinkedIn; however, to ensure a persistent reference for all information and interactions, we have launched the website plmgreenaliance.com.

Here I will moderate and focus on PLM and Sustainability topics. I am looking forward to interacting with many of you.

PLM and digitization

For the last two years, I have been speaking and writing about the gap between current PLM practices, based on shareable documents and files, and the potential future based on shareable data: the Model-Based Enterprise.

Last year, I wrote a series of posts giving insight into how we reached the current PLM practices, sometimes discovering inconsistencies and issues due to old habits or technology changes. I grouped these posts on a single blog page with the title: Learning from the past.

This year, I will create a collection of posts focusing on the transition towards a Model-Based Enterprise. The summary page, currently in private mode, will probably be called: Working towards the future.

Your feedback

I am always curious about your feedback – to understand in which kind of environment your PLM activities take place. Which topics are unclear? What am I missing in my experience?

Therefore, I created a small anonymous survey for those who want to interact with me. On purpose, the link is at the bottom of the post, so when you answer the survey, you get my double appreciation: first for reaching the end of this post and second for answering the survey.

Take the survey here.

Conclusion

Most of us will have a challenging year ahead of us. Sharing and discussing challenges and experiences will help us all to be better in what we are doing. I look forward to our 2021 journey.

For those living in the Northern Hemisphere: This week, we had the shortest day, or if you like the dark, the longest night. This period has always been a moment of reflection. What have we done this year?

Rob Ferrone (Quick Release), the Santa on the left (the leftist), and Jos Voskuil (TacIT), the Santa on the right (the rightist), share their highlights from 2020 in a dialogue.

Wishing you all a great moment of reflection and a smooth path into a Corona-proof future.

It will be different; let’s make it better.

 

Last week, I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having now digested most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.

This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the multi-view BOM, supply chain collaboration, and MBSE data interoperability.

These sessions were nicely wrapped up with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management, and Jeff Plant (Boeing), sharing Boeing’s Model-Based Engineering strategy.

I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Isn’t there an easier way to achieve the same results?

This concern was nicely formulated by Ilan Madjar in a comment to my first post.

Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.

This does not mean that these topics should not be studied. If your business is in a small, simple ecosystem and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.

However, if you work in a global industry with an extensive network of partners, suppliers, and customers, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.

This process of standardization is crucial if you want a sustainable, connected enterprise. In the end, the push from these large companies will lead to standards, allowing the smaller companies to adhere or connect to them.

The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!

Global Collaboration – Defining a baseline for data exchange processes and standards

Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase of looking towards the future.

Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will misunderstand each other in communication.

This happens even more in smaller businesses that sometimes just pick up (buzz) terms without fully understanding them.

Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared that, in reality, they were attempting configuration management. Here the confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.

The group has been analyzing interoperability standards and standards for long-term archival and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to collaborative business relationship management systems.

In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.

Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.

The move to a standardized Technical Data Package based on Model-Based Definition is one of the initiatives in this domain to reduce these types of errors.
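
To illustrate the kind of check that becomes cheap in a data-driven environment, here is a minimal sketch with invented dataset identifiers and revisions; it is not the working group’s implementation. It flags every consumer that still references a superseded revision of a shared dataset.

```python
# Minimal sketch (invented data, not a working-group deliverable) of detecting
# the "working with the wrong version" risk: each shared dataset carries an
# identifier, a revision, and a status; consumers record which revision they use.

from dataclasses import dataclass

@dataclass(frozen=True)
class DatasetVersion:
    identifier: str   # stable ID, the anchor of the digital thread
    revision: str     # e.g. "A", "B", ...
    status: str       # "released", "in work", "obsolete"

# Hypothetical latest released revisions, as a configuration authority knows them.
LATEST_RELEASED = {
    "DS-1001": DatasetVersion("DS-1001", "C", "released"),
    "DS-2002": DatasetVersion("DS-2002", "B", "released"),
}

# Hypothetical usage records: which revision each discipline/supplier consumes.
IN_USE = [
    ("engineering", DatasetVersion("DS-1001", "C", "released")),
    ("supplier-42", DatasetVersion("DS-1001", "A", "obsolete")),   # stale!
    ("manufacturing", DatasetVersion("DS-2002", "B", "released")),
]

def stale_usages():
    """Report every consumer that references a superseded revision."""
    for consumer, used in IN_USE:
        latest = LATEST_RELEASED[used.identifier]
        if used.revision != latest.revision:
            yield consumer, used, latest

for consumer, used, latest in stale_usages():
    print(f"{consumer} uses {used.identifier} rev {used.revision}, "
          f"latest released is rev {latest.revision}")
```

In a document-driven environment, the same check means chasing files and emails across thousands of suppliers; in a data-driven environment, it is a query.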

You can find the proceedings from the Global Collaboration working group here.

 

Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?

I loved Alberto Ferrari’s (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.

Click on the image and discover how business objectives, processes, and models go together, supported by a federated infrastructure.

Alberto’s presentation was a kind of mind map of how I imagine the future, and it is a pity if you have not had the chance to see his session.

Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management; for Alberto, they are essential to implement digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.

Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.

More A&D Action Groups

There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.

Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.

As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving.  I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?

More information about their progress can be found here.

Mark Williams (Boeing) reported the first findings of the A&D Model-Based Systems Engineering action group related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution. An interesting topic to follow, as the promise of MBSE is connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult – the sketch below hints at why. I will not dive into more details, respecting the audience of this blog.

For those interested in their progress, more information can be found here.
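
Without diving into the action group’s specifics, a minimal sketch (with invented structures, not their solution) can hint at why architecture models are harder to exchange than requirements: a requirement serializes as a mostly self-contained record, while an architecture model is a graph whose cross-references must all survive the exchange.

```python
# Minimal sketch (invented structures) contrasting the two exchange problems.

# A requirement is essentially a self-contained record: easy to export/import.
requirement = {
    "id": "REQ-001",
    "text": "The pump shall deliver 5 l/min at 2 bar.",
    "status": "released",
}

# An architecture model is a graph: blocks, ports, and connections that only
# make sense together. Exporting a block without its context breaks the model.
architecture = {
    "blocks": {
        "B1": {"name": "Pump", "ports": ["P1"]},
        "B2": {"name": "Controller", "ports": ["P2"]},
    },
    "connections": [
        {"from": ("B1", "P1"), "to": ("B2", "P2"), "type": "signal"},
    ],
}

def dangling_references(model):
    """The exchange problem in one function: every cross-reference in the
    graph must resolve on the receiving side, or the model is corrupt."""
    for conn in model["connections"]:
        for block_id, port_id in (conn["from"], conn["to"]):
            block = model["blocks"].get(block_id)
            if block is None or port_id not in block["ports"]:
                yield conn

print(list(dangling_references(architecture)))  # [] -> a consistent graph
```

Two tools must agree not only on the record format but on the semantics of every reference type, which is where architecture model exchange gets difficult.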

Model-Based Engineering @ Boeing

In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.

In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can provide manufacturing, services, or technology partnerships. Therefore, as you can imagine, and as discussed by others during the conference, streamlined collaboration and traceability are crucial.

The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world, based on the systems engineering approach. Like Katheryn Bell in her session on Global Collaboration, Jeff started by explaining the importance of a common language and taxonomy, needed if you want to standardize processes.

Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.

I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. Certainly a standard to follow, as it brings standardization on top of existing standards.

As Jeff noted, it is a practical standard for implementation in a company of any size – in my opinion, mandatory for a sustainable, connected infrastructure.

Jeff presented the slide below, showing their standardization internally around federated platforms.

This slide strongly resembles the future platform vision I have been sharing since 2017 when discussing PLM’s future at PLM conferences and explaining the differences between Coordinated and Connected – see also my presentation here on Slideshare.

You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.

In reality, the flow of information through feedback loops will be there too.

The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff urged other companies, academics, and vendors to invest in and converge on industry standards for Model-Based Systems Engineering practices. The time to act in this domain is now.
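
To make the idea of federation on standards more concrete, here is a minimal sketch; the platforms, identifiers, and links are invented and do not represent Boeing’s architecture. Each platform owns its own data, and a shared identifier scheme lets a thin layer traverse the digital thread across platform boundaries.

```python
# Minimal sketch (invented platforms and IDs) of federated platforms on a
# shared identifier standard: each platform owns its data, but a thin layer
# can walk the digital thread across platform boundaries.

# Hypothetical platform registries, each keyed by a globally unique dataset ID.
PLATFORMS = {
    "PLM": {"DS-1001": {"type": "design model", "links": ["DS-2002"]}},
    "ERP": {"DS-2002": {"type": "manufacturing BOM", "links": ["DS-3003"]}},
    "MRO": {"DS-3003": {"type": "service record", "links": []}},
}

def locate(dataset_id):
    """Resolve an ID to (platform, record); the shared standard makes this possible."""
    for platform, registry in PLATFORMS.items():
        if dataset_id in registry:
            return platform, registry[dataset_id]
    raise KeyError(dataset_id)

def digital_thread(start_id):
    """Walk linked datasets across platforms, depth-first, without copying data."""
    seen, stack = set(), [start_id]
    while stack:
        current = stack.pop()
        if current in seen:
            continue
        seen.add(current)
        platform, record = locate(current)
        yield platform, current, record["type"]
        stack.extend(record["links"])

for platform, ds_id, ds_type in digital_thread("DS-1001"):
    print(f"{platform}: {ds_id} ({ds_type})")
```

Without an agreed identifier and link standard, every arrow in such a diagram becomes a custom interface between two vendors – exactly the lock-in risk discussed next.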

It reminded me again of Marc Halpern’s message mentioned in my previous post (part 1): we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.

Therefore, don’t watch from the sidelines; it is the voice (and effort) of these companies that can drive standards.

Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.

The team is confident that the business case for such an investment is firm and stable. However, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.

The virtual fireside chat

The conference ended with a virtual fireside chat, from which I picked up an interesting point that Marc Halpern brought in. Marc mentioned a survey Gartner had done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy and product development; they did not see as much reduction in time to market or cost. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices: here, neither the lead times nor the number of changes improved.

Marc believes that this topic will really show benefits in the future with cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2% of the companies interviewed were investing in a digital supply chain, while the expected benefits in this area would have the most significant ROI.

The good news: there is consistency, and we know where to focus for early results.

Conclusion

It was a great conference, as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneak preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members of the CIMdata A&D action groups, as they provide the groundwork for all of us – sooner or later.
