
Once in a while, the discussion pops up whether, given the changes in technology and business scope, we should still talk about PLM. John Stark and others have been making the point that PLM should become a profession.

In a way, I like the vagueness of the definition and the fact that the PLM profession is not written in stone. There is an ongoing change, and who wants to be certified for the past or framed by the past?

However, most people, particularly at the C-level, consider PLM as something complex, costly, and related to engineering. Partly this has to do with the early introduction of PLM, which was only a little more advanced than PDM.

The focus and capabilities were meant to make engineering teams happy by giving them more access to their data. Unfortunately, that did not work, as engineers are not looking for more control.

Old (current) PLM

Therefore, I would like to suggest that when we talk about PLM, we frame it as Product Lifecycle Data Management (the definition). A PLM infrastructure or system should be considered the System of Record, ensuring product data is archived to be used for manufacturing, service, and proving compliance with regulations.

In a modern way, the digital thread results from building such an infrastructure with related artifacts. The digital thread is a relatively slow-moving environment, connecting the various as-xxx structures (As-Designed, As-Planned, As-Manufactured, etc.). Looking at the different PLM vendor images (the Aras example above), I consider the digital thread a fancy name for traceability.

I discussed the topic of Digital Thread in 2018:  Document Management or Digital Thread. One of the observations was that few people talk about the quality of the relations when providing traceability between artifacts.

The quality of traceability is relevant for traditional Configuration Management (CM). Traditional CM has been framed, like PLM, to be engineering-centric.

Both PLM and CM need to become enterprise activities – perhaps unified.

Read my blog post and see the discussion with Martijn Dullaart, Lisa Fenwick and Maxime Gravel on the future of Configuration Management.

New digital PLM

In my posts, I talked about modern PLM. I described it as data-driven, often in relation to a model-based approach. As a result of the data-driven approach, a digital PLM environment can be connected to processes outside the engineering domain. I wrote a series of posts related to the potential of such a new PLM infrastructure (The road to model-based and connected PLM).

Digital PLM, if implemented correctly, could serve people along the full product lifecycle, from marketing/portfolio management through service and, if relevant, decommissioning. An even bigger challenge is connecting ecosystems to the same infrastructure, in particular suppliers and partners, but also customers. This is the new platform paradigm.

Some years ago, people stated IoT is the new PLM  (IoT is the new PLM – PTC 2017). Or MBSE is the foundation for a new PLM (Will MBSE be the new PLM instead of IoT? A discussion @ PLM Roadmap conference 2018).

Even Digital Transformation was mentioned at that time. I don’t believe Digital Transformation points to a domain; it is more an ongoing process that most companies have to go through. And because it is so commonly used, it becomes too vague for the specifics of our domain. I liked Monica Schnitger‘s LinkedIn post: Digital Transformation? Let’s talk. There is enough to talk about; we have to learn and be more specific.


What is the difference?

The challenge is that we need more in-depth thinking about what a “digital transformed” company would look like. What would impact their business, their IT infrastructure, and their organization and people? As I discussed with Oleg Shilovitsky, a data-driven approach does not necessarily mean simplification.

While writing this post, I just finished recording a podcast with Nina Dar. She is, even more than me, active in the domain of PLM and strategic leadership toward a digital and sustainable future. You can find the pre-announcement of our podcast here (it was great fun to talk), and I will share the result here later too.

What is clear to me is that a new, future data-driven environment will become a System of Engagement. In this environment, you can simulate assumptions and verify and qualify trade-offs in real-time. And not only product behavior; you can also simulate and analyze behaviors all along the lifecycle, supporting business decisions.

This is where I position the digital twin. Modern PLM infrastructures are connected to the business in real-time. Still, PLM will have its system of record needs; however, the real value will come from real-time collaboration.

The traditional PLM consultant should transform into a business consultant who understands technology. Historically, it was the opposite, which created friction in companies.

Starting from the business needs

In my interactions with customers, the focus is no longer on traditional PLM; we discuss business scenarios where the company will benefit from a data-driven approach. You will not obtain significant benefits if you just implement your serial processes again in a digital PLM infrastructure.

Efficiency gains are often single-digit, whereas new ways of working can result in double-digit benefits or new opportunities.

Besides the traditional pressure on companies to remain competitive, there is now an additional driver that I discussed in my previous post, the Innovation Dilemma. To survive on our planet, we, and therefore also companies, need to switch to sustainable products and business models.

This is a push for innovation; however, it requires a coordinated, end-to-end change within companies.

Be the change

When do you decide to change your business model from pushing products to the market into a business model of Product as a Service? When do you choose to create repairable and upgradeable products? It is a business need. Sustainability does not start with the engineer. It must be part of the (new) DNA of a company.

Interesting to read is this article from Jan Bosch that I read this morning: Resistance to Change. Read the article as it makes so much sense, but we need more than sense – we need people to get involved. My favorite quote from the article:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man”.

Conclusion

PLM consultants should retrain themselves in System Thinking and start from the business. PLM technology alone is no longer enough to support companies in their (digital/sustainable) transformation. Therefore, I would like to introduce BLM (Business Lifecycle Management) as the new TLA.

However, BLM has already been framed as Black Lives Matter. I agree with that, extending it to ALM (All Lives Matter).

What do you think – should we leave the comfortable term PLM behind us for a new frame?

In February, the PLM Green Global Alliance published our first interview discussing the relationship between PLM and Sustainability with the main vendors. We talked with Darren West from SAP.

You can find the interview here: PLM and Sustainability: talking with SAP. We spoke with Darren about SAP’s Responsible Design and Production module, which allows companies to understand their environmental and economic impact by calculating fees and taxes, and to implement measures to reduce regulatory costs. The high reliance on accurate data was one of the topics in our discussion.

In March, we interviewed Zoé Bezpalko and Jon den Hartog from Autodesk. Besides Autodesk’s impressive sustainability program, we discussed Autodesk’s BIM technology helping the construction industry to become greener, and their Generative Design solution supporting the designer in making better decisions on material usage or reuse.

The discussion ended with the Life Cycle Assessment tools that support the engineer in making sustainable decisions.

In my last blog post, the Innovation Dilemma, I explored the challenges of a Life Cycle Assessment. As it appears, it is not just about installing a tool. The concepts of a data-driven PLM infrastructure and digital twins, combined with the Inner Development Goals (IDGs), are strong prerequisites for this transformation.

The IDGs describe the human attitudes needed alongside the Sustainable Development Goals.

Therefore, we were happy to talk last week with Florence Verzelen, Executive Vice President Industry, Marketing & Sustainability, and Xavier Adam, Worldwide Sustainability Senior Manager, from Dassault Systèmes. We discussed Dassault Systèmes’ business sustainability goals and product offerings based on the 3DEXPERIENCE platform.

Have a look at the discussion below:


The slides shown in the recording can be found HERE.

What I learned

For many years, Dassault Systèmes’ purpose has been to help their customers imagine sustainable innovations capable of harmonizing product, nature, and life. A statement that is now slowly bubbling up in other companies too. Dassault Systèmes has set a clear and interesting target for themselves for 2025: in that year, two-thirds of their sales should come from solutions that make their customers more sustainable.

Their Eco-design solution is one of the first offerings to reach this objective. Their Life Cycle Assessment solution can govern your (virtual) product design on multiple criteria, not only greenhouse gas emissions.  It will be interesting to follow up on this topic to see how companies make the change internally by relying on data and virtual twins of a product or a manufacturing process.


Conclusion

80% of the environmental impact of a product is determined during the design phase. A Life Cycle Assessment solution combined with a virtual product model, the virtual design twin, allows you to decide on trade-offs in the virtual space before committing to the physical solution. Creating a data-driven, closed loop between design, engineering, manufacturing and operations based on accurate data is the envisioned infrastructure for a sustainable future.
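To make this concrete, below is a minimal sketch in Python of how such a trade-off could be evaluated on a virtual design twin. The material names, emission factors and use-phase numbers are invented for illustration only; a real LCA solution works with certified reference databases and many more impact categories.

```python
# Minimal LCA trade-off sketch. All numbers below are invented
# placeholders, not real reference values.

# kg CO2-equivalent per kg of material, cradle-to-gate (illustrative only)
EMISSION_FACTORS = {"steel": 1.9, "aluminium": 8.2, "recycled_aluminium": 0.6}

def footprint(materials: dict, use_phase_kg_co2e: float) -> float:
    """Sum material impacts (mass x factor) plus the estimated use-phase impact."""
    material_impact = sum(mass * EMISSION_FACTORS[mat] for mat, mass in materials.items())
    return material_impact + use_phase_kg_co2e

# Two design alternatives for the same function: heavier steel vs lighter aluminium.
# The lighter variant scores better in the use phase (e.g. transport energy).
steel_design = {"steel": 12.0}            # 12 kg of steel
alu_design = {"recycled_aluminium": 5.0}  # 5 kg of recycled aluminium

print(f"Steel variant:     {footprint(steel_design, use_phase_kg_co2e=40.0):.1f} kg CO2e")
print(f"Aluminium variant: {footprint(alu_design, use_phase_kg_co2e=25.0):.1f} kg CO2e")
```

As long as the design only exists virtually, changing a material or a supplier is a cheap experiment; after production has started, it is an expensive one.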

Yes, it is not a typo. Clayton Christensen’s famous book, published in 1997, discussed the Innovator’s Dilemma: when new technologies cause great firms to fail. This was the challenge two decades ago. Existing prominent companies could quickly become obsolete as they were bypassed by new technologies.

The examples are well known. To mention a few: DEC (Digital Equipment Corporation), Kodak, and Nokia.

Why the innovation dilemma?

This decade, the challenge has become different. All companies are forced to become more sustainable in the next ten years, either pushed by global regulations or by their customers’ demands. Besides the priority of reducing greenhouse gas emissions, there is also the need to transform our society from a linear, continuous-growth economy into a circular doughnut economy.

The circular economy makes the creation, the usage and the reuse of our products more complex as the challenge is to reduce the need for raw materials and avoid landfills.

The circular economy concept – the regular product lifecycle in the middle

The doughnut economy makes the values of an economy more complex, as it is not only about money and growth; human and environmental factors should also be considered.

Doughnut Economics: Trying to stay within the green boundaries

To manage this complexity, I wrote SYSTEMS THINKING – a must-have skill in the 21st century, focusing on the logical part of the brain. In my follow-up post, Systems Thinking: a second thought, I looked at the human challenge. Our brain is not rational and wants to think fast to solve direct threats. Therefore, we have to overcome our old brains to make progress.

An interesting and thought-provoking video was shared by Nina Dar in this discussion. The 17 Sustainable Development Goals (SDGs) describe what needs to be done. However, we also need the Inner Development Goals (IDGs) and the human side to connect. Watch the movie:

Our society needs to change and innovate; however, we cannot. The Innovation Dilemma.

The future is data-driven and digital.

What is clear to me is that companies developing products and services have only one way to move forward: becoming data-driven and digital.

Why data-driven and digital?

Let’s look at something companies might already practice: REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals). This European regulation, introduced in 2007, aims to protect human health and the environment by communicating information on chemicals up and down the supply chain. This ensures that manufacturers, importers, and their customers are aware of information relating to the health and safety of the products supplied.

The regulation still suffers in execution, as most of the reporting and evaluation of chemicals is done manually. Suppliers report their chemicals in documents, companies aggregate the chemicals in their summary reports, and finally, authorities have to go through these reports.

Even where the scale of REACH is limited, the manual effort for end-to-end reporting is relatively high. In addition, skilled workers are needed to do the job because reporting is done in a document-based manner.
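The difference between a document-based and a data-driven approach becomes tangible here. Below is a minimal sketch, with hypothetical substance declarations and a hypothetical reporting threshold (not the actual REACH rules), of how machine-readable supplier data could be aggregated automatically instead of manually:

```python
# Hypothetical, machine-readable supplier declarations:
# substance mass fraction per purchased part (illustrative values).
declarations = {
    "part-A": {"lead": 0.002, "cadmium": 0.0},
    "part-B": {"lead": 0.0005},
}

# Simple BOM: part number -> mass (grams) of that part in the product.
bom = {"part-A": 150.0, "part-B": 40.0}

# Illustrative reporting threshold in grams per product - not a real REACH limit.
THRESHOLD_G = 0.2

def substance_totals(bom: dict, declarations: dict) -> dict:
    """Aggregate substance masses over all parts in the BOM."""
    totals = {}
    for part, mass_g in bom.items():
        for substance, fraction in declarations.get(part, {}).items():
            totals[substance] = totals.get(substance, 0.0) + mass_g * fraction
    return totals

for substance, grams in substance_totals(bom, declarations).items():
    flag = "REPORT" if grams > THRESHOLD_G else "ok"
    print(f"{substance}: {grams:.3f} g [{flag}]")
```

When the declarations arrive as PDFs instead of data, every step above becomes a manual, error-prone task for a skilled worker.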

Life Cycle Assessments (LCA)

Where you might think REACH is relatively simple, the real new challenge for companies is the need to perform Life Cycle Assessments for their products. The Wikipedia definition of LCA says:

Life cycle assessment or LCA (also known as life cycle analysis) is a methodology for assessing environmental impacts associated with all the stages of the life cycle of a commercial product, process, or service. For instance, in the case of a manufactured product, environmental impacts are assessed from raw material extraction and processing (cradle), through the product’s manufacture, distribution and use, to the recycling or final disposal of the materials composing it (grave)

This will be a shift in the way companies need to define products. Much more thinking and analysis are required in the early design phases. Before committing to a physical solution, engineers and manufacturing engineers need to simulate and calculate the impact of their design decisions in the virtual world.

This is where the digital twin of the design and the digital twin of the manufacturing process become relevant. And remember: digital twins do not run on documents – you need connected data and various types of models to calculate and estimate the environmental impact.

LCA done in a document-based manner will make your company too slow and expensive.

I described this needed transformation in my series from last year: The road to model-based and connected PLM – nine posts exploring the technology and concept of a model-based, data-driven PLM infrastructure.

Digital Product Passport (DPP)

The European Commission has published an action plan for the circular economy, one of the most important building blocks of the European Green Deal. One of the defined measures is the gradual introduction of a Digital Product Passport (DPP). As the quality of an LCA depends on reliable and trustworthy information about products and materials, the DPP aims to make circular economy metrics reliable.

This will be a long journey. If you want to catch a glimpse of the complexity, read this Medium article: The digital product passport and its technical implementation related to the DPP for batteries.

The innovation dilemma

Suppose you agree with my conclusion that companies need to change their current product or service development into a data-driven and model-based manner. In that case, the question will come up: where to start?

Becoming data-driven and model-based is, of course, not the business driver. However, this change is needed to perform Life Cycle Assessments and comply with current and future regulations while remaining competitive.

A document-driven approach is a dead-end.

Now let’s look at the real dilemmas by comparing a startup (clean sheet / no legacy) and an existing enterprise (experience with the past/legacy). Is there a winning approach?

The Startup

Having lived in Israel – the nation where almost everyone is a startup – and having worked with startups in the past 10 years, I am always inspired by the energy of the people in startup companies. Most of the time, they have a unique value proposition, and they want to be visible in the market as soon as possible.

This approach is the opposite of systems thinking. It is often a very linear process to deliver this value proposition without exploring the side effects of such an approach.

For example, the new “green” transportation hype. Many cities have been flooded with “green” scooters and electric bikes to promote transportation as a service. The idea behind this concept is that citizens no longer need to own polluting motorbikes or cars, as means of transportation will be shared. Therefore, the city will be cleaner and greener.

However, these “green” vehicles are often designed in the traditional linear way. Is there a repair plan or a plan to recycle the batteries? Is there a plan to reuse materials? Most of the time, not. Please, if you have examples contradicting my observations, let me know. I would like to hear good news.

When startup companies start to scale, they need experts to help them grow the company. Often these experts are seasoned people, perhaps close to retirement. They will share their experience and what they know best from the past:  traditional linear thinking.

As a result, even though startup companies can start with a clean sheet, their focus on delivering the product or service blocks further thinking. Instead, the seasoned experts will drive the company towards ways of working they know from the past.

Out of curiosity: Do you know or work in a startup that has started with a data-driven and model-based vision from scratch?  Please add the name of this company in the comments, and let’s learn how they did it.

The Existing company

Working in an established company is like being on board a big tanker. Changing its direction takes a clear eye on the target and the navigation skills to get there. Unfortunately, these changes most of the time take years, as it is impossible to switch the PLM infrastructure and the people’s skills within a short time.

From the bimodal approach in 2015 to the hybrid approach for companies, inspired by the 2017 McKinsey article Toward an integrated technology operating model, I have come to see this as probably the best approach to ensure a change will happen. In this approach – see the image – the organization keeps running on its document-driven PLM infrastructure. This type of infrastructure becomes the system of record, nothing different from what PLM currently is in most companies.

In parallel, you have to start with small groups of people who focus on a new product or a new service. Using the model-based approach, they work in a data-driven manner, completely independent of the big enterprise. Their environment can be considered the future system of engagement.

The data-driven approach allows all disciplines to work in a connected, real-time manner. Mastering the new ways of working is usually the task of younger employees who are digital natives. These teams can be complemented by experienced workers acting as coaches. The coaches will not work in the new environment; they bring business knowledge to the team.

People cannot work in two modes, but organizations can. As you can see from the McKinsey chart, the digital teams will get bigger and more important for the core business over time. In parallel, when their data usage grows, more and more data integration will occur between the two operation modes. Therefore, the old PLM infrastructure can remain a System of Record and serve as a support backbone for the new systems of engagement.

The Innovation Dilemma conclusion

The upcoming ten years will push organizations to innovate their ways of working to become sustainable and competitive. As discussed before, they must learn to work in a data-driven, connected manner. Both startups and existing enterprises have challenges – they need to overcome the “thinking fast and acting slow” mindset. Do you see the change in your company?


Note: Before publishing this post, I read this interesting and complementary post from Jan Bosch Boost your digitalization: instrumentation.

It is in the air – grab it.


After two quiet weeks of spending time with my family in slow motion, it is time to start the year.

First of all, I wish you all a happy, healthy, and positive outcome for 2022, as we need energy and positivism together. Then, of course, a good start is always cleaning up your desk and only leaving the relevant things for work on the desk.

Still, I have some books at arm’s length, either physical or on my e-reader, that I want to share with you – first, the non-obvious ones:

The Innovator’s Dilemma

A must-read book by Clayton Christensen, explaining how new technologies can overthrow established big companies within a very short period. The term Disruptive Innovation comes up here. Companies need to remain aware of what is happening outside and be ready to adapt their business. There are many recent examples where big established brands disappeared or diminished in a short period.

In his book, he wrote about DEC (Digital Equipment Corporation), the market leader in minicomputers, which did not see the threat of the PC. Later examples are Blockbuster (from video rental to streaming), Kodak (from analog photography to digital imaging) or, as a double example, NOKIA (from paper to market leader in mobile phones, killed by the smartphone).

The book has always inspired me to stay alert for new technologies, however simple they might look, as simplicity is often the answer in the end. I wrote about it in 2012: The Innovator’s Dilemma and PLM, where I believed cloud, search-based applications and Facebook-like environments could disrupt the PLM world. None of this happened as a disruption; these technologies have mostly been integrated by the major vendors, whose businesses were not really disrupted. Newcomers still have a hard time conquering market space.

In 2015, I wrote about this book again in The Innovator’s dilemma and Generation change – image above. At that time, I understood that disruption would not happen in the PLM domain. Instead, I predicted a more evolutionary process, which I would later call: From Coordinated to Connected.

The future ways of working address the new skills that are needed. You need to become a digital native, as COVID-19 pushed many organizations in that direction. But being a digital native alone does not bring success. We need new ways of working, which are more difficult to implement.

Sapiens

The book Sapiens by Yuval Harari made me realize the importance of storytelling in the domain of PLM and business transformation. In short, Yuval Harari explains that the human race became so dominant because we were able to align large groups around an abstract theme. The abstract theme can be related to religion, the power of a race or nation, the value of money, or even a brand’s image.

The myth (read: a simplified and abstract story) hides complexity and inconsistencies. It allows everyone to get motivated to work towards one common goal. As Yuval says: “Fiction is far more powerful because reality is too complex”.

Too often, I have seen well-analyzed PLM projects that were “killed” by management because they were considered too complex. I wrote about this in 2019 in PLM – measurable or a myth?, claiming that the real benefits of PLM are hard to predict and that we should not look at PLM in isolation.

My 2020 follow-up post, The PLM ROI Myth, alludes to that topic. However, even if you have a sound business case at the management level, the myth might still be decisive in justifying the investment.

That’s why PLM vendors are always working on their myths: the most cost-effective solution, the most visionary solution, the solution most used by your peers and many other messages to influence your emotions, not your factual thinking. So just read the myths on their websites.

If you have no time to read the book, look at the 2015 TED talk above to grasp the concept and use it with a PLM-twisted mind.

Re-use your CAD

In 2015, I read this book during a summer holiday (meanwhile, there is a second edition). Although it was not a PLM book, it helped me understand the transition effort from a classical document-driven enterprise towards a model-based enterprise.

Jennifer Herron‘s book helps companies to understand how to break down the (information) wall between engineering and manufacturing.

At that time, I contacted Jennifer to see if others, like her and Action Engineering, could explain Model-Based Definition comprehensively, for example, in Europe – with no success.

The Model-Based Enterprise is more and more becoming the apparent future for companies that want to be competitive or benefit from the various digital twin concepts. For that reason, I contacted Jennifer again last year in my post: PLM and Model-Based Definition.

As you can read, the world has improved, there is a new version of the book, and there is more and more information to share about the benefits of a model-based approach.

I am still referencing Action Engineering and their OSCAR learning environment for my customers. Unfortunately, many small and medium enterprises do not have the resources and skills to implement a model-based environment.

Instead, these companies stay at their customers’ lowest common denominator: the 2D drawing. For me, Model-Based Definition is one of the first steps to master if your company wants to provide digital continuity of design and engineering information towards manufacturing and operations. Digital twins do not run on documents; they require model-based environments.

The book is still on my desk, and all the time, I am working on finding the best PLM practices related to a Model-Based enterprise.

It is a learning journey to deal with a data-driven, model-based environment, not only for PLM but also for CM experts, as you might have seen from my recent dialogue with CM experts: The future of Configuration Management.

Products2019

This book was an interesting novelty published by John Stark in 2020. John is known for his academic and educational books related to PLM. However, during the early days of the COVID-pandemic, John decided to write a novel. The novel describes the learning journey of Jane from Somerset, who, as part of her MBA studies, is performing a research project for the Josef Mayer Maschinenfabrik. Her mission is to report to the newly appointed CEO what happens with the company’s products all along the lifecycle.

Although it is not directly a PLM book, it illustrates the complexity of PLM. It is about people and culture, and about many different processes, often disconnected. Everyone focuses on their particular discipline as the center of importance. If you believe PLM is only about the best technology, read this book and learn how many other aspects are also relevant.

If you want to read more details, I wrote about the book in 2020: Products2019 – a must-read if you are new to PLM. An important point to pick up from this book is that it is not about PLM but about doing business.

PLM is not a magical product. Instead, it is a strategy to support and improve your business.

System Lifecycle Management

Another book, published a little later and motivated by the extra time we all got during the COVID-19 pandemic, was Martin Eigner‘s book System Lifecycle Management.

A 281-page journey from the early days of data management towards what Martin calls System Lifecycle Management (SysLM). He was one of the first to talk about System Lifecycle Management instead of PLM.

I always enjoyed Martin’s presentations at the various PLM conferences where we met. In many ways, we share similar ideas. In addition, during his time as a professor at the University of Kaiserslautern (2003-2017), he explored new concepts with his students.

I briefly mentioned the book in my series The road to model-based and connected PLM (Part 5) when discussing SLM or SysLM. His academic research and analysis make this book very valuable. It takes you in a very structured way through the era when mechatronics became important, and then the era when systems (hardware and software) became important.

We discussed in 2015 the applicability of the bimodal approach for PLM. However, as many enterprises are locked in their highly customized PDM/PLM environments, their legacy blocks the introduction of modern model-based and connected approaches.

Where John Stark’s book might miss the PLM details, Martin’s book brings you everything in detail and with all its references.

It is an interesting book if you want to catch up with what has happened in the past 20 years.

More Books …..

There are more books on my desk that have helped me understand the past or shape the future. As this is a blog post, I will not discuss more books this time, as I am reaching my 1500 words.

Still books worthwhile to read – click on their images to learn more:

I discussed this book twice last year: an introduction in PLM and Modularity, and a discussion with the authors and some readers of the book in The Modular Way – a follow-up discussion.


A book I read this summer contributed to a better understanding of sustainability. I mentioned it in my presentation for the Swedish CATIA Forum in October last year – slide 29 of The Challenges of model-based and traditional plm. You could see it as an introduction to System Thinking from an economic point of view.

System Thinking becomes crucial for a sustainable future, as I addressed in my post PLM and Sustainability.

Sustainability is my area of interest at the PLM Green Global Alliance, an international community of professionals working with Product Lifecycle Management (PLM) enabling technologies and collaborating for a more sustainable decarbonized circular economy.

Conclusion

There is a lot to learn. Tell us something about your PLM bookshelf – which books would you recommend? In the upcoming posts, I will further focus on PLM education. So stay tuned and keep on learning.

This week I attended the PLM Roadmap & PDT Fall 2021 conference with great expectations based on my enthusiasm last year. Unfortunately, the excitement was less this time, and I will explain why in my conclusions. It was unfortunately again a virtual event, which makes it hard to be interactive – something I realize I miss a lot.

Over two hundred attendees connected for the two days, and you can find the agenda here. Typically I would discuss the relevant sessions; now, I want to group some of them related to a theme, as there was complementary information in these sessions.

Disruption

Again, like in the spring, the theme focused on DISRUPTION. The word disruption can give you an uncomfortable feeling when you are not in power. It is more fun to disrupt than to be disrupted, as I mentioned in my spring presentation. Read The week after PLM Roadmap & PDT Spring 2021.

In his keynote speech, Peter Bilello (CIMdata) kicked off with The Critical Dozen: 12 familiar, evolving trends and enablers of digital transformation that you cannot or should not live without.

You can see them on the slide below:

I believe many of them should be familiar to you, as these themes have been “in the air” for quite some time already. Vendors come first, and companies slowly start to investigate them when relevant. You will find many of them back in my recent series The road to model-based and connected PLM, where I explored the topics that will cross your path on that journey.

As Peter said: “For most of the topics you cannot pick and choose as they are all connected.”

Another interesting observation was that we are moving away from the concept of related structures (the digital thread) towards connected datasets (the digital web). Marc Halpern first introduced this topic at the 2020 conference, and it has become an excellent image to frame what we should imagine in a connected world.

The digital web also has to do with the rise of the graph database, mentioned by Peter Bilello as a potentially disruptive technology during the fireside chat. Relational databases can be seen as rigid, associated with PLM structures. Graph databases, on the other hand, can be associated with flexible relations between different types of data – the image of the digital web.
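To illustrate why graph databases fit this digital web image, here is a minimal sketch using Python and the networkx library. The artifact names and relation types are invented for illustration; real PLM graphs are obviously much richer:

```python
import networkx as nx

# A tiny "digital web": nodes are artifacts, edges are typed relations.
g = nx.DiGraph()
g.add_edge("REQ-001", "FUNC-010", relation="satisfied-by")
g.add_edge("FUNC-010", "PART-123", relation="realized-by")
g.add_edge("PART-123", "TEST-042", relation="verified-by")
g.add_edge("PART-123", "MFG-PLAN-7", relation="planned-in")

# Traceability becomes a path query: how does REQ-001 reach TEST-042?
print(" -> ".join(nx.shortest_path(g, "REQ-001", "TEST-042")))

# Impact analysis: everything downstream of a changed requirement.
print(sorted(nx.descendants(g, "REQ-001")))
```

In a relational schema, the same questions require joins over a fixed set of relationship tables; in a graph, adding a new relation type needs no schema change – exactly the flexibility the digital web image suggests.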

Where Peter was mainly telling WHAT was happening, two presentations caught my attention because of the HOW.

First of all, Dr. Rodney Ewing’s (Cummins) session A Balanced Strategy to Reap Continuous Business Value from Digital PLM was a great story of a transformational project. It showed how to keep continuous delivery of business value in mind while moving to the connected enterprise.

As Rodney mentioned, the contribution of TCS was crucial here, which I can imagine. It is hard for a company to understand what is happening in the outside (PLM) world and how to apply it to your own company. Their transformation roadmap is an excellent example of keeping the long-term vision in mind while delivering value during the transformation.

Talking about the right partner and synergy, the second presentation I liked in this context of disruption was Ian Quest’s presentation (Quick Release): Open-source Disruption in Support of Audacious Goals. As a sponsor of the conference, they had ten minutes to pitch their area of expertise.

After Ian’s presentation, focused on audacious goals (for non-native English speakers: “brave” goals), there was only one word that stuck in my mind: pragmatic.

Instead of discussions about the complexity, Ian gave examples of where a pragmatic data-centric approach could lead to great benefits, as you can see from one of the illustrated benefits below:

Standards

A characteristic topic of this conference is that we always talk about standards. Torbjörn Holm (Eurostep) gave an excellent overview of where standards have led to significant benefits. For example, the containerization of goods has dramatically improved transportation of goods (we all benefit) while killing proprietary means of transport (trains, type of ships, type of unloading).  See the image below:

Torbjörn rightfully expanded this story to the current situation in the construction industry and the challenges for asset operators. Unfortunately, in these practices, many content suppliers remain focused on their unique capabilities, thereby neglecting the demand for interoperability along the whole value chain.

It is a topic Marc Halpern also mentioned last year as an outcome of their Gartner PLM benefits survey. Gartner’s findings:

Time to Market is not so much improved by using PLM, as the inefficient interaction with suppliers is the real impediment.

Like transport before containerization, the exchange of information is not standardized and designed for digital exchange. Torbjörn believes that more and more companies will insist on exchange standards – like CFIHOS, an ISO 15926-derived exchange standard in the process industry. It is a user-driven standard, the best kind of standard.

In this context, the presentation from Kenny Swope (Boeing) and Jean Yves Delaunay (Airbus) The Business Value of Standards-based Information Interoperability for Aerospace & Defense illustrated this fact.

Although they work for competitors, the aerospace companies understand the criticality of standards for becoming more efficient and less vendor-dependent. They discuss these themes in the Aerospace & Defense PLM Action Group. The 2020 Fall sessions showed the results; you can read their publications here.

The A&D PLM action group uses the following framework when evaluating standards – as you can see on the image below:

The result – a combined exercise of many participating experts from the field – is their recommendation:

To conclude:
People often complain that standards – a frame set by the vendors of proprietary data formats – lead to a rigid environment, blocking agility.

In reality, standards allow companies to be more agile, as the (proprietary) data flow is less of an issue. Remember the containerization example.

Sustainability and System Thinking

This conference has always been known for its attention to the circular economy and green thinking. In the past, these topics might have been considered disconnected from our PLM practices; now, they have become a part of everyone’s mission.

Two presentations stood out on this topic for me. First, Ken Webster’s keynote speech In the future, you will own nothing and you will be happy was a significant overview of how we as consumers are currently disconnected from the circular economy. His plea, as shown below, for making manufacturers keep legal ownership of the materials in the products they deliver would impact consumer behavior.

Product as a Service (PaaS) and new ways to provide a service are becoming essential. For example, why not see buildings as power stations, as they are a place to collect solar or wind energy?

His thoughts are aligned with what is happening in Europe related to the European Green Deal (not in his presentation). There is a push for a PaaS model for all products as this would be an excellent stimulant for the circular economy.  PaaS combined with a Digital Product Passport – more on that next year.

Making upgrades to your products has less impact on the environment than creating new products to sell (and creating waste from the old product). Ken Webster made an interesting statement about changing the economy: do we want to own products, or do we want to benefit from the product and leave the legal ownership with the manufacturer?

A topic I discussed in the PLM Roadmap & PDT Conference Spring 2021 – look here at slide 11

Patrick Hillberg’s presentation Rising to the challenge of engineering and optimizing . . . what? was the one closest to my heart. We discussed Sustainability and Systems Thinking with Patrick in our PLM Green Global Alliance, being pretty aligned on this topic. Patrick started by explaining the difference between Systems Engineering and Systems Thinking. Looking at the product go-to-market of an organization is more than the traditional V-model. Economic pressure and culture will push people to deviate from the ideal technological plan due to other priorities.

Expanding on this observation, Patrick stated that there are limits to growth, a topic discussed by many people involved in the sustainable economy. Economic growth is impossible on a limited planet, and we have to take more dimensions into account. Patrick gave some examples, including issues related to the infamous Boeing 737 MAX.

For Patrick, the COVID-pandemic is the end of the old 2nd Industrial Revolution and a push for a new Fourth Industrial Revolution, which is not only technical, as the slide below indicates.

With Patrick, I believe we are at a decisive moment to disrupt ourselves, reconsider many things we do and are used to doing. Even for PLM practitioners, this is a new path to go.

Data

There were two presentations related to digitization and the shift from document-based to a data-driven approach.

First, there was Greg Weaver (Gulfstream) with his presentation Indexing Content – Finding Your Needle in the Haystack. Greg explained that by using indexation of existing document-based information combined with a specific dashboard, they could provide fast access to information that otherwise would have been hidden in so many document or even paper archives.

It was a pragmatic solution, making me feel nostalgic seeing the SmarTeam profile cards. It was an excellent example of moving to a digital enterprise, and Gulfstream has always been a front runner on this topic.

Warning: Don’t use this by default at home (your company). The data in a regulated industry like Aerospace is expected to be of high quality due to the configuration management processes in place. If your company does not have a strong CM practice, the retrieved data might be inaccurate.

Martijn Dullaart (ASML)’s presentation The Next disruption, please…..  was the next step into the future. With his statement “No CM = No Trust,” he made an essential point for data-driven environments.

There is a need for Configuration Management, and I touched on this topic in my last post: The road to model-based and connected PLM (part 9 – CM).

Martijn’s presentation can also be found on his blog here, and I encourage you to read it (saving me copy & paste text). It was interesting to see that Martijn improved his CM pyramid, as you can see, more discipline and activity-oriented instead of a system view. With Martijn and others, I will elaborate on this topic soon.

Conclusion

This has been an extremely long post – thanks for reading until the end. Many interesting topics were presented at the conference. I was less excited this time because many of these topics are triggers for a discussion. Innovation comes from meeting people with different backgrounds. At a live conference, you would meet during the break or the famous dinner. How can we ensure we follow up on all this interesting information?

Your thoughts? Contact me for a Corona Friday discussion.

When I started this series in July, I expected to talk mostly about new ways of working, enabled through a data-driven and model-based approach. However, when analyzing what is needed for such a future (part 3), it became apparent that many of these new ways of working are dependent on technology.

From coordinated to connected sounds like a business change;

however, it all depends on technology. And here I have to thank Marc Halpern (Gartner’s Research VP, Engineering and Design Technologies)  again, who came with this brilliant scheme below:

So now it is time to address the last point from my starting post:

Configuration Management requires a new approach. The current methodology is very much based on hardware products with labor-intensive change management. However, the world of software products has different configuration management and change procedures. Therefore, we need to merge them into a single framework. Unfortunately, this cannot be the BOM framework due to the dynamics in software changes.

Configuration management at this moment

PLM and CM are often considered overlapping. My March 2019 post PLM and Configuration Management – a happy marriage? shares some thoughts related to this point.

Does having PLM or PDM installed mean you have implemented CM? There is confusion because revision management is often considered the same as configuration management. Read my March 2020 post What the FFF is happening?, based on a vivid discussion launched by Yoann Maingon, CEO and founder of Ganister, an example of a modern, graph-database-based, flexible PLM solution.

To hear it from a CM-side,  I discussed it with Martijn Dullaart in my February 2021 post: PLM and Configuration Management. We also zoomed in on CM2 in this post as a methodology.

Martijn is the Lead Architect for Enterprise Configuration Management at ASML (Our Dutch national pride) and chairperson of the Industry 4.0 committee of the Integrated Process Excellence (IPX) Congress.

As mentioned before in a previous post (part 6), he will be speaking at the PLM Roadmap & PDT Fall conference starting this upcoming week.

In this post, I want to talk about the CM future. For understanding the current situation, you can find a broad explanation here on Wikipedia. Have a look at CM in the context of the product lifecycle, ensuring that the product As-Specified and As-Designed information matches the As-Built and As-Operated product information.

A mismatch or inconsistency between these artifacts can lead to costly errors, particularly in later lifecycle stages. CM originated from the Aerospace and Defense industry for that reason. However, companies in other industries might have implemented CM practices too. Either due to regulations or thanks to the understanding that configuration mistakes can cause significant damage to the company.

Historically, configuration management addressed the needs of “slow-moving” products. For example, the design of an airplane could take years before manufacturing started. Tracking changes and ensuring consistency of all referenced datasets was often a manual process.

On purpose, I wrote “referenced datasets,” as the information was most of the time not connected in a single environment. The identifier of a dataset (an item or a document) was the primary information carrier used for mentally connecting other artifacts to keep consistency.

The Institute of Process Excellence (IPX) has been one of the significant contributors to configuration management methodology. They have been providing (and still offer) CM2 training and certification.

As mentioned before, PLM vendors or implementers suggest that a PLM system could fully support Configuration Management. However, CM is more than change management, release management and revision management.

As the diagram from Martijn Dullaart shows, PLM is one facet of configuration management.

Of course, there are also (a few) separate CM tools focusing on the configuration management process. CMstat’s EPOCH CM tool is an example of such software. In addition, on their website, you can find excellent articles explaining the history and their future thoughts related to CM.

The future will undoubtedly be a connected, model-based, software-driven environment. Naturally, therefore, configuration management processes will have to change. (Impressive buzz word sentence, still I hope you get the message).

From coordinated to connected has a severe impact on CM. Let’s have a look at the issues.

Configuration Management – the future

The transition to a data-driven and model-based infrastructure has raised the following questions:

  • How to deal with the granularity of data? Each dataset needs to be validated. For example, in the document-based approach, a document (a collection of datasets) is validated as a whole. How to do this efficiently at the dataset level?
  • The behavior of a product (or system) will depend more and more on software. Product CM practices have been designed for the hardware domain; now, we need a mix of hardware and software CM practices.
  • Due to the increased complexity of products (or systems) and the rapid changes in software versions, how do we guarantee the As-Operated product still matches the As-Designed / As-Certified definitions?

I don’t have answers to these questions. I only share observations and trends I see in my actual world.

Granularity of data

The concept of datasets has been discussed in my post (part 6). Now it is about how to manage the right sets of connected data.

The image on the left, borrowed from Erik Herzog’s presentation at the PDM Roadmap & PDT Fall conference in 2020, is a good illustration of the challenge.

At that time, Erik suggested that OSLC could be the enabler of a digital CM backbone for an enterprise. Therefore, it was a pleasure to see Erik providing an update at the yearly OSLC Fest conference this week.

You can find the agenda and Erik’s presentation here on day 2.

OSLC as a framework seems to be a good candidate for supporting modern CM scenarios. It allows a company to build full traceability between all relevant artifacts (if digitally available). I can see the beauty of the technical infrastructure.

Still, it is about people and processes first. Therefore, I am curious to learn from my readers who believe in and experiment with such a federated infrastructure.

More software

Traditionally working companies might believe that software should be treated as part of the Bill of Materials. In this theory, you treat software code as a part, with a part number and a revision. In this way, you might believe configuration management practices do not have to change. However, there are fundamental reasons why we should decouple hardware and software.

First, for the same hardware solution, there might be a whole collection of valid software codes – just like your computer. How many valid software versions, even of the same application, can you run on this hardware? Managing a computer system and its software through a Bill of Materials is unimaginable.

A computer, of course, is designed for running all kinds of software versions. However, modern products in the field, like cars, machines, and electrical devices, will all have a similar type of software-driven flexibility.

For that reason, I believe that companies that deliver software-driven products should design a mechanism to check if the combination of hardware and software is valid. For a computer system, a software mismatch might not be costly or painful; for an industrial system, it is crucial to ensure invalid combinations cannot exist.
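As a minimal sketch of such a mechanism (in Python, with hypothetical hardware revisions and firmware versions – real configuration solutions are far richer):

```python
# Hypothetical certified combinations: which firmware versions may run
# on which hardware revision of a controller.
CERTIFIED = {
    "controller-rev-A": {"fw-1.0", "fw-1.1"},
    "controller-rev-B": {"fw-1.1", "fw-2.0"},
}

def is_valid(hardware_rev: str, firmware: str) -> bool:
    """Return True only if this hardware/software combination is certified."""
    return firmware in CERTIFIED.get(hardware_rev, set())

# An As-Operated check before allowing an (over-the-air) update:
assert is_valid("controller-rev-B", "fw-2.0")       # valid combination
assert not is_valid("controller-rev-A", "fw-2.0")   # invalid: block the update
```

The hard part is not this check itself, but keeping the certified combinations connected to the As-Designed and As-Certified definitions.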

Solutions like Configit or pure::variants might help here. In February 2021, I discussed the unique features of Configit’s solution with Henrik Hulgaard, their CTO, in PLM and Configuration Lifecycle Management.

I hope to have a similar post shortly with Pure Systems to understand their added value to configuration management.

Software change management is entirely different from hardware change management. The challenge is to have two different change management approaches under one consistent umbrella without creating needless overhead.

Increased complexity – the digital twin?

With the increased complexity of products and many potential variants of a solution, how can you validate a configuration? Perhaps we should investigate the digital twin concept, with a twin for each instance we want to validate.

Having a complete virtual representation of a product, including the possibility to validate the software behavior on the virtual product, would allow you to run (automated) validation tests to certify and later understand a product in the field.

No need for inspection on-site or test-and-fix upgrades in the physical world. Needed for space systems for sure, but why not for every system in the long term? When we are able to define and maintain a virtual twin of our physical product (on-demand), we can validate it.

I learned about this concept at the 2020 Digital Twin conference in the Netherlands. Bart Theelen from Canon Production Printing explained that they could feed their simulation models with actual customer data to simulate and analyze the physical situation. In some cases, it is even impossible to observe the physical behavior. By tuning the virtual environment, you might understand what happens in the physical world.

An eye-opener and an advocate for the model-based approach. Therefore, I am looking forward to the upcoming PLM Roadmap & PDT Fall conference. Hopefully, Martijn Dullaart will share his thoughts on combining CM and working in a model-based environment. See you there?

Conclusion

Finally, we have reached in this series the methodology part, particularly the one related to configuration management and traceability in a very granular, digital environment.  

After the PLM Roadmap & PDT fall conference, I plan to follow up with three thought leaders on this topic: Martijn Dullaart (ASML), Maxime Gravel (Moog) and Lisa Fenwick (CMstat).  What would you ask them?

So far, I have been discussing PLM experiences and best practices that have changed due to introducing electronic drawings and affordable 3D CAD systems for the mainstream. From vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.

Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.

Now, thanks to digitalization, we can connect and format information in real-time. We can give every stakeholder in the company’s business almost real-time visibility on what is happening (if allowed). We have seen the benefits of platformization, where the value comes from real-time connectivity within an ecosystem.

Apple, Amazon, Uber and Airbnb are the non-manufacturing-related examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), with suppliers and customers. All connected through information, managed in data elements instead of documents – I call it connected PLM.

Vendors have already shared their PowerPoints, movies, and demos of how the future would look in the ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.

The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.

The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.

In the upcoming posts, I will share with you what I see, what I deduct from that and what I think would be the next step to learn.

I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.

I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (avg. 1500 words) and the ideas and contributions coming from you and me.

What is platformization?

In our day-to-day life, we are more and more used to direct interaction between resellers and services providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.

We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.

The platform is responsible for direct communication between all stakeholders.

The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.

Five potential business platforms

The 5 potential platforms

The ERP platform
is mainly dedicated to the company’s execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect as many stakeholders as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).

Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform.  A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.

The CRM platform

a platform mainly focused on customer-related activities. As you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.

The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.

For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (from non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.
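To illustrate the serial-number question above, here is a minimal sketch, assuming an invented data layout, of the traceability lookup an MES platform answers. The fields are assumptions for illustration, not the schema of Parsec, Critical Manufacturing, or any real MES.

```python
# Hypothetical as-built records, keyed by serial number (illustrative only).
production_records = {
    "SN-2021-0042": {
        "work_order": "WO-1881",
        "used_parts": [("PART-77", "LOT-A3"), ("PART-91", "LOT-B7")],
        "quality_status": "passed",
        "produced_on": "2021-05-12",
    },
}

def trace_serial(serial_number):
    """Return the exact production details for one serialized product."""
    record = production_records.get(serial_number)
    if record is None:
        raise KeyError(f"No production record for {serial_number}")
    return record

print(trace_serial("SN-2021-0042")["used_parts"])
# [('PART-77', 'LOT-A3'), ('PART-91', 'LOT-B7')]
```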

The IoT platform

These platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.

However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (from non-PLM/ERP-related vendors) are Hitachi and Microsoft.
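As a sketch of this monitoring idea, assuming invented telemetry and thresholds (not Hitachi’s or Microsoft’s actual platform APIs), an IoT platform conceptually does something like this:

```python
# Illustrative only: per-serial-number vibration readings and a simple alert rule.
from statistics import mean

VIBRATION_LIMIT = 7.0  # assumed threshold in mm/s, for the example only

fleet_telemetry = {
    "SN-2021-0042": [3.1, 7.4, 7.8, 8.2],  # one installed product's readings
    "SN-2021-0043": [2.9, 3.0, 3.2, 3.1],
}

def maintenance_candidates(telemetry):
    """Flag products whose recent average reading exceeds the limit."""
    return [sn for sn, values in telemetry.items()
            if mean(values[-3:]) > VIBRATION_LIMIT]

print(maintenance_candidates(fleet_telemetry))  # ['SN-2021-0042']
```

The same fleet-wide view is what allows the manufacturer to move from reacting to individual failures towards offering predictive maintenance.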

The Product Innovation Platform (PIP)

All the above platforms would not have a reason to exist if there were not an environment where products are invented, developed, and managed. The Product Innovation Platform (PIP), as described by CIMdata, is the place where Intellectual Property (IP) is created and where companies decide on their portfolio and more.

The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all the previous platforms, the PIP cannot exist on its own and requires connectivity with the other platforms where applicable.

Look below at the CIMdata definition of a Product Innovation Platform.

You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavors): Aras, Dassault Systèmes, PTC and Siemens.

Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model, and combining them in a single platform would hurt the overall performance.

Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.

Standard interfaces or ad-hoc connections?

Suppose your role and information needs can be satisfied within a single platform. In that case, the platform will most likely provide you with the right environment to see and manipulate the information.

However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.

This would be a PIP/ERP platform collaboration scenario.

Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.

The company might decide to build a custom app or dashboard to support such a role, combining real-time data from the relevant platforms. This can be done using standard interfaces (preferred) or through APIs, web services, REST services and microservices (for specialists), and, currently in fashion, Low-Code development platforms, which allow users to combine data services from different platforms without being experts in coding.
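As a sketch of the engineer’s scenario above, assuming hypothetical endpoints and JSON fields (a real integration would use each platform’s published interfaces), such a dashboard call could look like this:

```python
# Illustrative cross-platform aggregation; URLs and field names are invented.
import requests

def change_impact(part_id):
    # PIP: is there an open change on this part? (assumed endpoint)
    change = requests.get(f"https://pip.example.com/api/changes?part={part_id}").json()
    # ERP: how much stock is on hand? (assumed endpoint)
    stock = requests.get(f"https://erp.example.com/api/stock/{part_id}").json()
    quantity = stock.get("quantity", 0)
    return {
        "part": part_id,
        "open_change": change.get("change_id"),
        "stock_on_hand": quantity,
        # a simple decision aid for the change implementation
        "advice": "use up stock before cut-in" if quantity > 0
                  else "implement change immediately",
    }

print(change_impact("PART-77"))
```

The value of standard interfaces is precisely that such a dashboard does not break each time one of the underlying platforms is upgraded.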

Without going too deep into technology, the topics in this paragraph require an enterprise architecture and vision. It is too optimistic to think that your existing environment will evolve smoothly into a digital highway for the future by “fixing” demands per user. Your infrastructure is much more likely to end up congested like spaghetti.

In that context, I read an interesting post last week: Low code: A promising trend or Pandora’s box. Have a look and decide for yourself.

I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.

In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.

It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, and we still rely primarily on drawings. It will surely take us a generation to switch from document-based engineering to model-based engineering.

Conclusion

In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.

In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.

Your opinion, Your contribution?


Another episode of “The PLM Doctor is IN”. This time a question from Ilan Madjar, partner and co-founder of XLM Solutions. Ilan is my co-moderator at the PLM Global Green Alliance for sustainability topics.

All these activities resulted in the following question(s) related to the Digital Twin. Now sit back and enjoy.

PLM and the Digital Twin

Is it a new concept? How to implement and certify the result?

Relevant topics discussed in this video

Conclusion

I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.


The main rule: a (single) open question related to PLM that is puzzling you.

Last week I wrote about the recent PLM Road Map & PDT Spring 2021 conference day 1, focusing mainly on technology. There were also interesting sessions related to exploring future methodologies for a digital enterprise. Now on Day 2, we started with two sessions related to people and methodology, indispensable when discussing PLM topics.

Designing and Keeping Great Teams

This keynote speech from Noshir Contractor, Professor of Behavioral Sciences in the McCormick School of Engineering & Applied Science, intrigued me as the subtitle states: Lessons from Preparing for Mars. What Can PLM Professionals Learn from This?

You might ask yourself, is a PLM implementation as difficult and as complex as a mission to Mars? I hoped so, and I followed Noshir’s presentation with great interest.

Noshir started by mentioning that many disruptive technologies have emerged in recent years, like Teams, Slack, Yammer and many more.

The interesting question he asked in the context of PLM is:

As the domain of PLM is all about trying to optimize effective collaboration, this is a fair question.

Structural Signatures

Noshir shared with us that the crucial point is not people’s individual skills but who they know: measuring who they work with is more important than who they are.

Based on this statement, Noshir showed some network patterns of different types of networks.


It is clear from these patterns how organizations communicate internally and/or externally. It would be an interesting exercise to perform this analysis in a company and see if the results match the perceived reality.
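As a small illustration of the idea, assuming an invented mini-network, even a simple degree count over a who-works-with-whom graph starts to reveal structural signatures such as brokers between clusters:

```python
# Invented collaboration network; real analyses use far richer measures.
collaboration = {
    "ana":  {"ben", "carl"},
    "ben":  {"ana", "carl"},
    "carl": {"ana", "ben", "dora"},  # carl bridges two groups
    "dora": {"carl", "eli"},
    "eli":  {"dora"},
}

def degree(network):
    """How many people does each person collaborate with?"""
    return {person: len(peers) for person, peers in network.items()}

print(sorted(degree(collaboration).items(), key=lambda kv: -kv[1]))
# carl scores highest: a likely broker between otherwise separate clusters
```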

Noshir’s research was used by NASA to analyze and predict the right teams for a mission to Mars.

Noshir went further by proposing what PLM can learn from teams that are going into space. And here, I was not sure about the parallel. Is a PLM project comparable to a mission to Mars? I hope not! I have always advocated that a PLM implementation is a journey. Still, I never imagined that it could be a journey into the remote unknown.

Noshir explained that they had built tools based on their scientific model to describe and predict how teams could evolve over time. He believes that society can also benefit from these learnings. Many inventions from the past were driven by innovations coming from space programs.

I believe Noshir’s approach related to team analysis is much more critical for organizations with a mission. How do you build multidisciplinary teams?

The proposed methodology probably fits best in a holacracy-based organization. Holacracy is an interesting concept for companies to get their employees committed; however, it also demands a type of involvement that not every person can deliver. For me, coming back to PLM as a strategy to enable collaboration, the effectiveness of collaboration depends very much on the organizational culture and the structure created.

DISRUPTION – EXTINCTION or still EVOLUTION?

We talk a lot about disruption because disruption is a painful process that you do not want to happen to yourself or your company. In the context of this conference’s theme, I discussed the awareness that disruptive technologies will be changing the PLM Value Equation.

However, disruptive technologies alone are not sufficient. In PLM, we have to deal with legacy data, legacy processes, legacy organization structures, and often legacy people.

A disruption like the switch from mini-computers to PCs (killed DEC) or from Symbian to iOS (killed Nokia) is therefore not likely to happen that fast. Still, there is a need to take benefit from these new disruptive technologies.

My presentation focused on describing the path of evolution and the focus areas for the PLM community. Doing nothing means extinction; experimenting and learning towards the future provides an evolutionary way.

Starting from the acknowledgment that most of the data produced today is incompatible with the data needed in the future, I explained my theme: From Coordinated to Connected. As a PLM community, we should spend more time together in focus groups and conferences, describing and verifying methodologies and best practices.

Nigel Shaw (EuroStep) and Mark Williams (Boeing) hinted in this direction during this conference (see day 1). Erik Herzog (SAAB Aeronautics) brought this topic to last year’s conference (see day 3). Outside this conference, I have comparable touchpoints with Martijn Dullaert when discussing the future of Configuration Management in relation to PLM.

In addition, this decade will probably be the most disruptive decade humanity has known, due to external forces that push companies to change. Sustainability regulations from governments (the Paris Agreement) and the implementation of circular economy concepts, combined with the positive and high Total Shareholder Return, will push companies to adapt themselves more radically than before.

What is clear is that disruptive technologies and concepts, like Industry 4.0, Digital Thread and Digital Twin, can serve a purpose when implemented efficiently, ensuring the business becomes sustainable.

Due to the lack of end-to-end experience, we need focus groups and conferences to share progress and lessons learned. And we do not need to hear the isolated vendor success stories as a reference here, as often they are siloed again and lead to proprietary environments.

You can see my full presentation on SlideShare: DISRUPTION – EXTINCTION or still EVOLUTION?


Building a profitable Digital T(win) business

Beatrice Gasser, Technical, Innovation, and Sustainable Development Director from the Egis group, gave an exciting presentation related to the vision and implementation of digital twins in the construction industry.

The Egis group serves both as a consultancy firm and as an asset management organization. You can see a wide variety of activities on their website or have a look at their perspectives.

Historically, the construction industry has been lagging behind, with low productivity due to fragmentation, risk aversion and, recently, more and more due to the lack of digital talent. In addition, some construction companies make their money from claims instead of having a smooth and profitable business model.

Without innovation in the construction industry, companies working the traditional way would lose market share and investor-focused attention, as we can see from the BCG diagram I discussed in my session.

The digital twin of construction is an ideal concept for the future. It can be built in the design phase to align all stakeholders, validate and integrate solutions, and simulate the building’s operational scenarios at almost zero material cost. Egis estimates that by using a digital twin during construction, the engineering and construction costs of a building can be reduced by 15 to 25 %.

More importantly, the digital twin can also be used to first simulate operations and optimize energy consumption. The connected digital twin of an existing building can serve as a new common data environment for future building stakeholders. This could be the asset owner, service companies, and even the regulatory authorities needing to validate the building’s safety and environmental impact.

Beatrice ended with five principles essential to establishing a digital twin.

I think the construction industry has a vast potential to disrupt itself, faster than the traditional manufacturing industries, due to its current need to work in a best-connected manner.

Next, there is almost no legacy data to deal with for these companies. Every new construction or building is a unique project on its own. The key differentiators will be experience and efficient ways of working.

It is about the belief, the guts and the skilled people that can make it work – all for a more efficient and sustainable future.


Leveraging PLM and Cloud Technology for Market Success

Stan Przybylinski, Vice President of CIMdata, reported on their global survey related to the cloud, completed in early 2021. Stan typified Industry 4.0 as a connected vision, with the cloud and digital thread as enablers for implementing this vision.

The companies interviewed showed a lot of goodwill to make progress – click on the image to see the details. CIMdata is also working with PLM vendors to better learn and describe the areas of benefit. I remain curious about who comes up with a realization and business case that is future-proof. This will define our new PLM Value Equation.


Conclusion

These were two exciting days with plenty of mentions of disruptive technologies. Our challenge in the PLM domain will be to give them a purpose, likely driven by external factors related to the need for a sustainable future. Efficiency and effectiveness must come from learning to work in connected environments (digital twin, digital thread, Industry 4.0, Model-Based (Systems) Engineering).

Note: You might have seen the image below already – a nice link between sustainability and the mission to Mars

One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.

Last year’s fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings. There were sessions related to the Multiview BOM research, Global Collaboration, and several aspects of model-based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

All topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.

On the second day, we looked in addition at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata’s President & CEO, kicked off the conference by providing CIMdata’s observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and the followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter’s final slide: we have to actively rethink and reshape PLM, not by calling it something different, but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform at the US Army. Raj walked us through some of the US Army’s challenges, and he gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and be aware of the competition.

Where we would say “data is the new oil”, Raj Iyer said: “Data is the ammunition of the future fight – as fights will more and more take place in cyberspace.”

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army’s current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw’s (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, he explained that, as we live in a world of HTTP(S) linking, we create new ways of connecting information. The link becomes an essential artifact in our information model.

Where it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.
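One way to picture the link as a managed artifact, in a minimal sketch with invented field names (not Eurostep’s or any standard’s actual model), is to give the relation itself a type and a quality, the aspect that traceability discussions often skip:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Link:
    source: str                      # e.g. a requirement ID
    target: str                      # e.g. a simulation model ID
    relation: str                    # typed relation, not just "related to"
    confidence: str                  # quality of the relation: verified / assumed / stale
    verified_on: Optional[str] = None

req_to_model = Link(
    source="REQ-101",
    target="SIM-MODEL-7",
    relation="verified_by",
    confidence="verified",
    verified_on="2021-04-30",
)
print(req_to_model)
```

Managing millions of such links, and keeping their quality up to date across tools and organizations, is exactly the challenge Nigel pointed at.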

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they are debating with one of their customers whether to replace or enhance the existing PLM tools. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark’s slide below says it all.

Mark stressed the importance of managing model information in context, which has become a real challenge.

Mark mentioned that 20 years ago, IDC (International Data Corporation) measured Boeing’s performance and estimated that each employee spent 2 ½ hours per day searching for information. In 2018, IDC estimated that this number had grown to 30 % of the employee’s time and could go up to 50 % when adding the effort of reusing and duplicating data.

The consequence would be that a full-service enterprise, having engineering, manufacturing and services connected, probably loses 70 % of its information because it cannot be found. An impressive number, asking for “clever” ways to find the correct information in context.

It is not just about a full indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending up with the recommendation to use the ISO 10303-243 MoSSEC standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context; the standard is used to manage and connect the relationships between models.

MoSSEC and its implications for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM vendors and tools will support and enable the standard for future interoperability and collaboration.
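To make the metadata point concrete, here is an invented sketch of the kind of record that makes a model findable and trustworthy in context. It is explicitly not the actual MoSSEC (ISO 10303-243) schema, which defines this far more rigorously:

```python
# Illustrative model-description record; all field names are assumptions.
model_metadata = {
    "model_id": "AERO-LOADS-V12",
    "discipline": "structural simulation",
    "describes": "wing box",
    "valid_for_configuration": "As-Designed/rev-C",
    "assumptions": ["linear material model", "cruise load case only"],
    "owner": "loads team",
    "linked_models": ["GEOM-WING-V8", "MATERIALS-DB-2021"],
}

def usable_in_context(metadata, required=("describes", "valid_for_configuration")):
    """A model can only be reused safely if its context fields are filled."""
    return all(metadata.get(field) for field in required)

print(usable_in_context(model_metadata))  # True
```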

Additive Manufacturing – not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the new technologies’ topic: Additive Manufacturing or, in common language, 3D Printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases after the first excitement of the hype is over.

Material knowledge was one of the important topics to study when applying Additive Manufacturing. Understanding material behaviors and properties in an Additive Manufacturing process is probably a new area for most companies.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company’s management has realistic expectations and the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. In my opinion, the middle part, related to the data management concepts needed for a digital enterprise, contained the most exciting topics to follow up on.

Next week I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).
