You are currently browsing the category archive for the ‘PLM’ category.
In the past two weeks, I had several discussions with peers in the PLM domain about their experiences.
Some of them I met after a long time again face-to-face at the LiveWorx 2023 event. See my review of the event here: The Weekend after LiveWorx 2023.
And there were several interactions on LinkedIn, leading to a more extended discussion thread (an example of a digital thread?) or a Zoom discussion (a so-called 2D conversation).
To complete the story, I also participated in two PLM podcasts from Share PLM, where we interviewed Johan Mikkelä (currently working at FLSmidth) and, in the second episode, Issam Darraj (presently working at ABB) about their PLM experiences. It was less a discussion and more a dialogue, trying to grasp the non-documented aspects of PLM. We are looking for your feedback on these podcasts too.
All these discussions led to a reconfirmation that if you are a PLM practitioner, you need a broad skillset to address the business needs, translate them into people and process activities relevant to the industry and ultimately implement the proper collection of tools.
As a sneaky preview for the podcast sessions, we asked both Johan and Issam about the importance of the tools. I will not disclose their answers here; you have to listen.
Let’s look at some of the discussions.
NOTE: Just before pushing the Publish button, Oleg Shilovitsky published this blog article PLM Project Failures and Unstoppable PLM Playbook. I will comment on his points at the end of this post. It is all part of the extensive discussion.
PLM, LinkedIn and complexity
The most popular discussions on LinkedIn are often related to the various types of Bills of Materials (eBOM, mBOM, sBOM), Part numbering schemes (intelligent or not), version and revision management and the famous FFF discussions.
This post: PLM and Configuration Management Best Practices: Working with Revisions, from Andreas Lindenthal, was a recent example that triggered others to react.
I had some offline discussions on this topic last week, and I noticed Frédéric Zeller wrote his post with the title PLM, LinkedIn and complexity, starting his post with (quote):
I am stunned by the average level of posts on the PLM on LinkedIn.
I’m sorry, but in 2023 :
- Part Number management (significant, non-significant) should no longer be a problem.
- Revision management should no longer be a question.
- Configuration management theory should no longer be a question.
- Notions of EBOMs, MBOMs … should no longer be a question.
So why are there still problems on these topics?
You can see from the 40+ comments that this statement created a lot of reactions, including mine. Apparently, these topics touch many people worldwide, and there is no simple, single answer to any of them. And there are so many other topics relevant to PLM.
Talking later with Frederic for one hour in a Zoom session, we discussed the importance of the right PLM data model.
I also wrote a series about the (traditional) PLM data model: The importance of a (PLM) data model.
Frederic is more of a PLM architect; we even discussed the wording related to the EBOM and the MBOM – a topic I feel comfortable discussing after many years of experience, having seen the attempts that failed and the dreams people had. And this was only one aspect of PLM.
You also find the discussion related to a PLM certification in the same thread. How would you certify a person as a PLM expert?
There are so many dimensions to PLM. Even more important, the PLM from 10-15 years ago (more of a system discussion) is no longer the PLM of today (a strategy and an infrastructure).
This is a crucial difference. Learning to use a PLM tool and implement it is not the same as building a PLM strategy for your company. It is Tools, Process, People versus Process, People, Tools and Data.
Time for Methodology workshops?
I recently discussed with several peers what we could do to assist people looking for best practice discussions and lessons learned. There is a need, but how do we organize such sessions, as we cannot expect this to be voluntary work?
In the past, I suggested MarketKey, the organizer of the PI DX events, extend its theme workshops. For example, instead of a 45-min Focus group with a short introduction to a theme (e.g., eBOM-mBOM, PLM-ERP interfaces), make these sessions last at least half a day and be independent of the PLM vendors.
Apparently, it did not fit in the PI DX programming; half a day would potentially stretch the duration of the conference, and increasingly we see two days of meetings as the maximum. Longer becomes difficult to justify, even if the content might have high value for the participants.
I observed a similar situation last year in combination with the PLM Roadmap/PDT Europe conference in Gothenburg. Here we had a half-day workshop before the conference, led by Erik Herzog (SAAB Aeronautics) and Judith Crockford (Eurostep), to discuss concepts related to federated PLM – read more in this post: The week after PLM Roadmap/PDT Europe 2022.
It reminded me of an MDM workshop before the 2015 event, led by Marc Halpern from Gartner. Unfortunately, the federated PLM discussion remained a predominantly Swedish initiative, and the follow-up did not reach a wider audience.
And then there are the Aerospace and Defense PLM Action Groups, whose discussions are moderated by CIMdata. It is great that they publish their findings (look here), although the best lessons are learned during the workshops themselves.
However, I also believe the A&D industry cannot be compared to a mid-market machinery manufacturing company. Therefore, it is helpful for a smaller audience only.
And here, I inserted a paragraph dedicated to Oleg’s recent post, PLM Project Failures and Unstoppable PLM Playbook – starting with a quote:
How to learn to implement PLM? I wrote about it in my earlier article – PLM playbook: how to learn about PLM? While I’m still happy to share my knowledge and experience, I think there is a bigger need in helping manufacturing companies and, especially PLM professionals, with the methodology of how to achieve the right goal when implementing PLM. Which made me think about the Unstoppable PLM playbook ©.
I found a similar passion for helping companies to adopt PLM while talking to Helena Gutierrez. Over many conversations during the last few months, we talked about how to help manufacturing companies with PLM adoption. The unstoppable PLM playbook is still a work in progress, but we want to start talking about it to get your feedback and start the conversation.
It is an excellent confirmation that there is a need for education and that PLM-related education on the Internet is not good enough.
As a former teacher in Physics, I do not believe in the Unstoppable PLM Playbook, even if it is a branded name. Many books are written by specific authors, giving their perspectives based on their (academic) knowledge.
Are they useful? I believe only in the context of a classroom discussion where the applicability can be discussed.
Therefore, my question to vendor-neutral global players like CIMdata, Eurostep, Prostep, Share PLM, TCS and others: are you willing to pick up this request? Or are there other entities that I missed? Please leave your thoughts in the comments. I will be happy to assist in organizing them.
There are many more future topics to discuss and document too.
- What about the potential split of a PLM infrastructure between Systems of Record & Systems of Engagement?
- What about the Digital Thread, a more and more accepted theme in discussions, but what is the standard definition?
- Is it traceability, as some vendors promote it, or is it the continuity of data, directly usable in various contexts – the DevOps approach?
Who likes to discuss methodology?
When asking myself this question, I see the analogy with standards. So let’s look at the various players in the PLM domain – sorry for the immense generalization.
- Strategic consultants: standards are essential, but spare me the details.
- Vendors: standards limit the unique capabilities of my products.
- Implementers: two types – those who understand and use standards as they see the long-term benefits, and those who avoid standards as they introduce complexity.
- Companies: they love standards if they can be implemented seamlessly.
- Universities: they love to explore standards and help to set them, even if they are not scalable.
Just replace standards with methodology, and you see the analogy.
We like to discuss the methodology.
As I mentioned in the introduction, I started to work with Share PLM on a series of podcasts where we interview PLM experts in the field who have experience with the people, the process, the tools and the data side. Through these interviews, you will realize PLM is complex and has become even more so now that we consider PLM a strategy instead of a tool.
We hope these podcasts might be a starting point for further discussion – either through direct interactions or through contributions to the podcast. If you have PLM experts in your network who can explain the complexity of PLM from various angles and have the experience, please let us know – it is time to share.
Conclusion
By switching gears, I noticed that PLM has become complex. Too complex for a single person to master. With an aging traditional PLM workforce (like me), it is time to consolidate the best practices of the past and discuss the best practices for the future. There are no simple answers, as every industry is different. Help us to energize the PLM community – your thoughts/contributions?
Last week I enjoyed visiting LiveWorx 2023 on behalf of the PLM Global Green Alliance. PTC had invited us to understand their sustainability ambitions and meet with the relevant people from PTC, partners, customers and several of my analyst friends. It felt like a reunion.
In addition, I used the opportunity to better understand their Velocity SaaS offering with OnShape and Arena. The almost four-day event, with approximately 5,000 attendees, was massive and well-organized.
So many people were excited that this was again an in-person event after four years.
With PTC’s broad product portfolio, you could easily have a full agenda for the whole event, depending on your interests.
I was personally motivated that I had a relatively full schedule focusing purely on Sustainability, leaving all these other beautiful end-to-end concepts for another time.
Here are some of my observations
Jim Heppelmann’s keynote
The primary presentation of such an event is the keynote from PTC’s CEO. This session allows you to understand the company’s key focus areas.
My takeaways:
- Need for Speed: Software-driven innovation, or as Jim said, Software is eating the BOM, reminding me of my recent blog post: The Rise and Fall of the BOM. Here Jim was referring to the integration with ALM (CodeBeamer) and IoT to have full traceability of products. However, including Software also requires agile ways of working.
- Need for Speed: Agile ways of working – the OnShape and Arena offerings are examples of agile working methods. A SaaS solution is easy to extend with suppliers or other stakeholders. PTC calls this their Velocity offering, typical Systems of Engagement, and I spoke later with people working on this topic. More in the future.
- Need for Speed: Model-based digital continuity – a theme I have discussed in my blog posts too. Here Jim explained the interaction between Windchill and ServiceMax, Systems of Record for the product definition and operations, respectively.
- Environmental Sustainability: introducing Catherine Kniker, PTC’s Chief Strategy and Sustainability Officer, announcing that PTC has committed to Science Based Targets, pledging near-term emissions reductions and long-term net-zero targets – see image below and more on Sustainability in the next section.
- A further investment in a SaaS architecture, announcing CREO+ as a SaaS solution supporting dynamic multi-user collaboration (a System of Engagement)
- A further investment in the partnership with Ansys fits the needs of a model-based future where modeling and simulation go hand in hand.
You can watch the full session Path to the Future: Products in the Age of Transformation here.
Sustainability
The PGGA spoke with Dave Duncan and James Norman last year about PTC’s sustainability initiatives. Remember: PLM and Sustainability: talking with PTC. Therefore, Klaus Brettschneider and I were happy to meet Dave and James in person just before the event and align on understanding what’s coming at PTC.
We agreed there is no “sustainability super app”; it is more about providing an open, digital infrastructure to connect data sources at any time of the product lifecycle, supporting decision-making and analysis. It is all about reliable data.
Product Sustainability 101
On Tuesday, Dave Duncan gave a great introductory session, Product Sustainability 101, addressing Business Drivers and Technical Opportunities. Dave started by explaining the business context aiming at greenhouse gas (GHG) reduction based on science-based targets, describing the content of Scope 1, Scope 2 and Scope 3 emissions.
The image above, which came back in several presentations later that week, nicely describes the mapping of lifecycle decisions and operations in the context of the GHG protocol.
Design for Sustainability (DfS)
On Wednesday, I started with a session moderated by James Norman titled Design for Sustainability: Harnessing Innovation for a Resilient Future. The panel consisted of Neil D’Souza (CEO Makersite), Tim Greiner (MD Pure Strategies), Francois Lamy (SVP Product Management PTC) and Asheen Phansey (Director ESG & Sustainability at PagerDuty). You can find the topic discussed below:
Some of the notes I took:
- No specific PLM modules are needed; LCA needs to become an additional practice for companies, relying on a connected infrastructure.
- Where to start? First, understand the current baseline based on data collection – what is your environmental impact? Next, decide where to start.
- The importance of Design for Service – many companies design products for easy delivery, not for service. Being able to service products better will extend their lifetime, therefore reducing their environmental impact (manufacturing/decommissioning)
- There is a value chain for carbon data. In addition, suppliers significantly impact reaching net zero, as many OEMs have an Assemble-to-Order process, and most of the emissions occur during part manufacturing.
DfS: an example from Cummins
Next, on Wednesday, I attended the session from David Genter from Cummins, who presented their Design for Sustainability (DfS) project.
Dave started by sharing their 2030 sustainability goals:
- On Facilities and Operations: a 50% reduction of GHG emissions, reducing water usage by 30%, reducing waste by 25% and reducing organic compound emissions by 50%
- Reducing Scope 3 emissions for new products by 25%
- In general, reducing Scope 3 emissions by 55M metric tons.
The benefits for products were documented using a standardized scorecard (example below) to ensure the benefits are real and not based on wishful thinking.
Many motivated people wanted to participate in the project, and the ultimate result demonstrated that DfS has both business value for Cummins and the environment.
The project has been very well described in this whitepaper: How Cummins Made Changes to Optimize Product Designs for the Environment – a recommended case study to read.
Tangible Strategies for Improving Product Sustainability
The session was a dialogue between Catherine Kniker and Dave Duncan, discussing the strategies to move forward with Sustainability.
They reiterated the three areas where we as a PLM community can improve: Material choice and usage, Addressing Energy Emissions and Reducing Waste. And it is worth addressing them all, as you can see below – it is not only about carbon reduction.
It was an informative dialogue going through the different aspects of where we, as an engineering/ PLM community, can contribute. You can watch their full dialog here: Tangible Strategies for Improving Product Sustainability.
Conclusion
It was encouraging to see that at such an event as LiveWorx, you could learn about Sustainability and discuss Sustainability with the audience and PTC partners. And as I mentioned before, we need to learn to measure (data-driven / reliable data), and we need to be able to work in a connected infrastructure (digital thread) to allow design, simulation, validation and feedback to go hand in hand. It requires adapting a business strategy, not just a tactical solution. With the PLM Global Green Alliance, we are looking forward to following up on these.
NOTE: PTC covered the expenses associated with my participation in this event but did not in any way influence the content of this post – I made my tour fully independent through the conference and got encouraged by all the conversations I had.
Imagine you are a supplier working for several customers, such as big OEMs or smaller companies. In December 2020, I wrote about PLM and the Supply Chain because it was an underexposed topic in many companies. Suppliers need their own PLM and IP protection while working as efficiently as possible with their customers, often the OEMs.
Most PLM implementations start by creating the ideal internal collaboration between functions in the enterprise: historically starting with R&D and Engineering, next expanding to Manufacturing, Services and Marketing, most of the time in this logical order.
In these implementations, people are not paying much attention to the total value chain, customers and suppliers. And that was one of the interesting findings at that time, supported by surveys from Gartner and McKinsey:
- Gartner: Companies reported improvements in the accuracy of product data and product development as the main benefit of their PLM implementation. They did not see so much of a reduced time to market or reduced product development costs. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here the lead times did not change, nor did the number of changes.
- McKinsey: In their article, The Case for Digital Reinvention, digital supply chains were mentioned as the area with the potential highest ROI; however, as the image shows below, it was the area with the lowest investment at that time.
In 2020 we were in the middle of broken supply chains and wishful thinking related to digital transformation, all due to COVID-19.
Meanwhile, the further digitization in PLM (systems of engagement) and the new topic, Sustainability of the supply chain, became visible.
Therefore, it is time to take stock again, also driven by discussions in the past few weeks.
The old “connected” approach (lose-lose)
A preferred way for OEMs in the past was to have the Supplier or partner work directly in their PLM environment. The OEM could keep control of the product development process and the incremental maturity of the BOM, where the Supplier could connect their part data and designs to the OEM environment.
The advantage for the OEM is clear – direct visibility of the supplier data when available. The benefit for the Supplier could also be immediate visibility of the broader context of the part they are responsible for.
However, the disadvantages for a supplier are more significant. Working in the OEM environment exposes all your IP and hinders knowledge capitalization from the Supplier. Not a big thing for perhaps a tier 3 supplier; however, the more advanced the products from the Supplier are, the higher the need to have its own PLM environment.
Therefore the old connected approach is a lose-lose relationship, in particular for the Supplier, and even for the OEM (having less knowledgeable suppliers).
The modern “connected” approach (wins t.b.d.)
In this situation, the target infrastructure is a digital one, where datasets are connected in real time, providing the various stakeholders access to a filtered set of data relevant to their roles.
In my terminology, I refer to them as Systems of Engagement, as the target is that all stakeholders work in this environment.
The counterpart of Systems of Engagement is the Systems of Record, which provides a product baseline, manufacturing baseline, and configuration baseline of information consumed by other disciplines.
These baselines are often called Bills of Information, and the traditional PLM system has been designed as a System of Record. The major Bills of Information are the eBOM and the mBOM, and sometimes people talk about the sBOM (service BOM).
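As a rough illustration (all part numbers and derivation rules below are invented for this sketch, not taken from any specific PLM system), the eBOM and mBOM can be seen as two views over the same product: the engineering view structured by function, and a derived manufacturing view that adds process items that never appear in CAD.

```python
# Illustrative sketch only: eBOM and mBOM as two views of the same product.
# Part numbers and the derivation rules are hypothetical.

ebom = [
    # (parent, child, qty) - engineering view, structured by function
    ("PUMP-100", "HOUSING-10", 1),
    ("PUMP-100", "IMPELLER-20", 1),
    ("PUMP-100", "SEAL-KIT-30", 1),
]

def derive_mbom(ebom_lines):
    """Derive a manufacturing view: same parts, plus process items
    (glue, packaging) that have no CAD geometry but are consumed in production."""
    mbom = list(ebom_lines)
    mbom.append(("PUMP-100", "GLUE-900", 1))  # shop-floor item, never modeled in CAD
    mbom.append(("PUMP-100", "BOX-950", 1))   # packaging, also BOM-only
    return mbom

mbom = derive_mbom(ebom)
print(len(ebom), len(mbom))  # eBOM has 3 lines, mBOM has 5
```

In a real environment the mBOM is of course also restructured around assembly sequence and plant, but even this minimal sketch shows why the two views cannot be one structure.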
Typical examples of Systems of Engagement I have seen in alphabetical order are:
- Arena Solutions has long-term experience in BOM collaboration between engineering teams, suppliers and contract manufacturers.
- CATENA-X might be a strange player in this list, as CATENA-X is more a German automotive consortium targeting digital collaboration between stakeholders, ensuring security and IP protection.
- CoLab is a provider of cloud-based collaboration software allowing design teams and suppliers to work together in real time.
- OnShape – a cloud-based collaborative product design environment for dispersed engineering teams and partners.
- OpenBOM – a SaaS solution focusing on BOM collaboration connected to various CAD systems, along with design teams and their connected suppliers.
These are some of the Systems of Engagement I am aware of. They focus on specific value streams that can improve the targeted time to market and product introduction efficiency. In companies with no extensive additional PLM infrastructure, they can become crucial systems of engagement.
The main challenge for these systems of engagement is how they will connect to traditional Systems of Record – the classical PLM systems that we know in the market (Aras, Dassault, PTC, Siemens).
Image on the left from a presentation by Erik Herzog from SAAB at last year’s CIMdata/PDT conference.
You can read more about this here.
When establishing a digitally connected mix of Systems of Engagement and Systems of Record in your organization, we will see overall benefits. My earlier thoughts, in general, are here: Time to split PLM?
The almost Connected approach
As I mentioned, in most companies, it is already challenging to manage their internal System of Record, which is needed for current operations and the traceability of information. In addition, most of the data stored in these systems is document-driven, not designed for real-time collaboration. So how would these companies collaborate with their suppliers?
The Model-Based Enterprise
In the bigger image below, I am referring to an image published by Jennifer Herron from her book Re-use Your CAD, where she describes the various stages of interaction between engineering, manufacturing and the extended enterprise.
Her mission is to promote and educate organizations in moving to a Model-Based Definition and, in the long term, to a Model-Based Enterprise.
The ultimate target of information exchange in this diagram is that the OEM and the Supplier remain separate entities; however, they can exchange Digital Product Definition Packages / Technical Data Packages (TDPs) over the web (electronically). In this exchange, we have a mix of systems of engagement and systems of record on both the OEM and the Supplier sides.
Depending on the type of industry, in my ecosystem of companies, many suppliers are still at level 2, dreaming of or being pushed to become level 3, illustrating there is a difficult job to do – learning new practices. And why would you move to the next level?
Every step can have significant benefits, as reported by companies that did this.
So what’s stopping your company from moving ahead? People, Processes, Skills, Work Pressure? It is one of the most common excuses: “We are too busy, no time to improve”.
A supply chain collaboration hub
On March 21, I discussed with Magnus Färneland from Eurostep their cloud-based PLM collaboration hub, ShareAspace. You can read the interview here: PLM and Supply Chain Collaboration
I believe this concept can be compelling for a connected enterprise. The OEM and the Supplier share (or connect) only the data they want to share, preferably based on the PLCS data schema (ISO 10303-239).
In a primitive approach, this can be BOM structures with related files; however, in the advanced mode it could become a real model-based connection hub.
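As a purely hypothetical sketch (this is not the ShareAspace API or the PLCS schema, just the idea of selective sharing): each party publishes only the records it flags for sharing, so IP-sensitive data never reaches the hub.

```python
# Hypothetical sketch of a collaboration-hub publish step. All part numbers,
# attributes and the "share" flag are invented for illustration.

supplier_data = {
    "MOTOR-500":  {"weight_kg": 12.4, "share": True},
    "WINDING-51": {"process_ip": "trade secret", "share": False},  # stays private
    "SHAFT-52":   {"weight_kg": 1.1, "share": True},
}

def publish_to_hub(data):
    """Push only the records flagged for sharing; strip the share flag itself
    so the hub sees just the agreed subset of attributes."""
    return {
        part: {k: v for k, v in attrs.items() if k != "share"}
        for part, attrs in data.items()
        if attrs["share"]
    }

hub_view = publish_to_hub(supplier_data)
print(sorted(hub_view))  # ['MOTOR-500', 'SHAFT-52'] - the private part never leaves
```

The same filtering would apply on the OEM side, which is the essence of the hub concept: both parties keep their own System of Record and expose only a negotiated slice.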
Now you ask yourself why this solution is not booming.
In my opinion, there are several points to consider:
- Who designs, operates and maintains the collaboration hub? It is likely not the suppliers, and when the OEM takes ownership, they might believe there is no need for the extra hub; just use the existing PLM infrastructure.
- Could a third party find a niche market for this? Eurostep has already been working on this for many years, but adoption of the concept seems higher in the BIM and Asset Management domains. Here the owner/operator sees the importance of a collaboration hub.
A final remark: we are still far from a connected enterprise; concepts like Catena-X and others need to become mature to serve as a foundation. There is a lot of technology out there – now we need the skilled people and tested practices to use the right technology and tune solution concepts.
Sustainability demands a connected enterprise.
I focused on the Supplier dilemma this time because it is one of the crucial aspects of a circular economy and sustainable product development.
Only by using virtual models of the To-Be products/systems can we seriously optimize them. Virtual models and Digital Twins do not run on documents; they require accurate data from anywhere connected.
You can read more details in my post earlier this year: MBSE and Sustainability or look at the PLM and Sustainability recording on our PLM Global Green Alliance YouTube channel.
Conclusion
Due to various discussions I recently had in the field, it became clear that supplier integration in a well-connected manner is one of the most important topics to address in the near future. We can no longer focus on our company as an isolated entity – value streams implemented in a connected manner become a must.
And now I am going to enjoy LiveWorx in Boston, learning, discussing and understanding more about what PTC is doing and planning in the context of digital transformation and sustainability. More about that in my next post: The week(end) after LiveWorx 2023 (to come).
This month it is exactly 15 years ago that I started my blog, a little bit nervous and insecure. Blogging had not reached the mainstream yet, and how would people react to my shared experiences?
The main driver behind my blog in 2008 was to share field experiences when implementing PLM in the mid-market.
As a SmarTeam contractor working closely with Dassault and IBM PLM, I learned that implementing PLM (or PDM) is more than a technology issue.
Discussing implementations made me aware of the importance of the human side. Customers had huge expectations with such a flexible toolkit, and implementers made money by providing customization to any user request.
There was no discussion of whether it was needed, as the implementer always said: “Yes, we can (if you pay)”.
The parallel tree
And that’s where my mediation started. At a certain moment, the customer would get annoyed by yet another customization. The concept of a “parallel tree,” a sync between the 3D CAD structure and the BOM, was often a point of discussion.
So many algorithms have been invented to convert a 3D CAD structure into a manufacturing BOM – even designing glue and paint in CAD just so they would appear in the BOM.
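To make the “parallel tree” problem concrete, here is a hypothetical sketch (node names and the phantom flag are my own invention, not any vendor's algorithm): a derivation that collapses CAD-only groupings and adds non-geometric process items to the BOM instead of modeling them in CAD.

```python
# Hypothetical sketch of deriving a BOM from a 3D CAD structure.
# "phantom" marks CAD-only groupings that should not appear as BOM items.

cad_tree = {
    "name": "PUMP-ASM", "phantom": False, "children": [
        {"name": "DESIGN-GROUP-A", "phantom": True, "children": [  # CAD-only grouping
            {"name": "HOUSING", "phantom": False, "children": []},
            {"name": "IMPELLER", "phantom": False, "children": []},
        ]},
    ],
}

def flatten(node):
    """Collapse phantom (CAD-only) groupings so only real parts reach the BOM."""
    result = []
    for child in node["children"]:
        if child["phantom"]:
            result.extend(flatten(child))  # skip the grouping, keep its parts
        else:
            result.append(child["name"])
            result.extend(flatten(child))
    return result

bom_parts = flatten(cad_tree)
bom_parts.append("PAINT-RAL9010")  # process item added in the BOM, not drawn in CAD
print(bom_parts)  # ['HOUSING', 'IMPELLER', 'PAINT-RAL9010']
```

Real conversions involve far more rules (quantities, effectivity, plant-specific restructuring), which is exactly why these synchronizations kept generating customization requests.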
The “exploded” data model
The customizations that ended in failure were often those with a crazy data model: too many detailed classes and too many attributes per class.
Monsters were created by some well-meaning IT departments collecting all the user needs, resulting in environments unworkable for the end users. See my 2015 post here: The Importance of a PLM data model
The BOM concepts
While concepts and best practices have become stable for traditional PLM, where we talk more about a Product Information backbone, there is still considerable debate about this type of implementation. The leading cause of the discussion is that companies often start from their existing or newly purchased systems and then try to push the people and processes into that environment.
For example, see this recent discussion we had with Oleg Shilovitsky (PLM, ERP, MES) and others on LinkedIn.
These were the days before we entered digital transformation in the PLM domain. Starting from 2015, you can see the mission in my blog posts: exploring what a digital enterprise would look like and what the role of PLM will be.
The Future
Some findings I can already share:
- No PLM system can do it all – where historically, companies bought a PLM system; now, they have to define a PLM strategy where the data can flow (controlled) in any direction. The PLM strategy needs to be based on value streams of connected data between relevant stakeholders supported by systems of engagement. From System to Strategy.
- Master Data Management and standardization of data models might still be a company’s internal activity (as the environment is stable). Still, to the outside world/domains, there is a need for flexible connections (standard flows / semantic web). From Rigid to Flexible.
- The meaning of the BOM will change from coordinated structures towards an extract of a data-driven PLM environment, where the BOM mainly represents the hardware connected to software releases. Configuration management practices must also change (see Martijn – and the Rise and Fall of the BOM). From Placeholders to Baselines.
- Digital Transformation in the PLM domain is not an evolution of the data. Legacy data has never been designed to be data-driven; migration is a mission impossible. Therefore there is a need to focus on a hybrid environment with two modes: enterprise backbone (System of Record) and product-centric infrastructure (Systems of Engagements). From Single Source of Truth to Authoritative Source of Truth.
Switching Gears
Next week I will reach the eligible age for my Dutch pension, allowing me to switch gears.
Instead of driving in high-performance mode, I will start practicing driving in a touristic mode, moving from points of interest to other points of interest while caring for the environment.
Here are some of the topics to mention at this moment.
Reviving the Share PLM podcast
Together with the Share PLM team, we decided to revive their podcast as Season 2. I referred to their podcast last year in my PLM Holiday thoughts 2022 post.
The Share PLM team has always been the next level of what I started alone in 2008. Sharing and discussing PLM topics with interest on the human side, supporting organizational change through targeted e-learning deliverables based on the purpose of a PLM implementation. People (first), Processes (needed) and the Tools (how) – in this order.
In Season 2 of the podcast, we want to discuss with experienced PLM practitioners the various aspects of PLM – not only success stories you often hear at PLM conferences.
Experience is what you get when you do not get what you expect.
And PLM is a domain where experience with people, processes and tools counts.
Follow our podcast here, subscribe to it on your favorite platform and feel free to send us questions. Besides the longer interviews, we will also discuss common questions in separate recordings or as a structured part of the podcast.
Sustainability!
I noticed that my Sustainability-related blog posts resonate less with my blogging audience. I am curious about the reason behind this.
Does it mean in our PLM community, Sustainability is still too vague and not addressed in the reader’s daily environment? Or is it because people do not see the relation to PLM and are more focused on carbon emissions, greenhouse gasses and the energy transition – a crucial part of the sustainable future that currently gets much attention?
This week I read this post: CEO priorities from 2019 until now: What has changed? As the result below shows, sustainability ranked #7 in 2019, and after some ups and downs, it is still at priority level #7. This worries me, as it illustrates that not much has changed at the board level, despite the increasing understanding of the environmental impact and the recent climate warnings. The warnings have not reached the boardrooms yet.
In addition, I will keep exploring the relationship between PLM and Sustainability, and in that context, I look forward to my learnings and discussions at the upcoming PTC LiveWorx event in Boston. Do I see you there?
Here I hope to meet with their sustainability thought leaders and discuss plans to come up with concrete activities related to PLM and Sustainability.
Somehow it is similar to the relationship between Digital Transformation and the PLM domain. Although we have been talking about the digitalization of the entire business for over 10 years, in the PLM domain, it has just started.
Awareness sessions
Companies have a considerable challenge translating a C-level vision into a successful business transformation supported by people active in the field.
Or, on the contrary, highly motivated people in the organization see the opportunity to dramatically improve their current ways of working due to digitization.
However, they struggle with translating their deep understanding into messages and actions that are understood and supported by the executive management. In the past ten years, I have been active in various transformational engagements, serving as a “translator” between all stakeholders. I will continue this work as it is a unique way to coach companies, implementers and software vendors to understand each other.
Conclusions
Fifteen years of blogging has brought me a lot – constantly forcing myself to explain what I observe around me and what it means for the PLM domain. My purpose in sharing these experiences with you in a non-academic manner has led to a great network of people and discussions. Some are very interactive, like Håkan Kårdén and Oleg Shilovitsky (the top two), and others provide their feedback in an indirect way.
Switching gears will not affect the blogging and the network – it might even lead to deeper insights as the time to observe and enjoy will be longer.
Keep your seatbelts fastened.
We are happy to start the year with the next PLM Global Green Alliances (PGGA) series round: PLM and Sustainability.
Last year we spoke mainly with the prominent PLM software vendors (Aras, Autodesk, Dassault Systèmes, PTC, SAP) and Sustaira (Sustainability platform – Siemens partner).
This time we talked with Mark Reisig, Sustainability and Green Energy Practice Director & Executive Consultant from CIMdata. The good news is that discussing a PLM strategy and Sustainability is no longer a software discussion.
With CIMdata’s sustainability offering introduced last year, it becomes clear that the topic of sustainability reached a broader level than the tools.
CIMdata
CIMdata is well known in the PLM domain, focusing on Market Analysis, Education, Research & Strategic Management Consulting, all related to PLM.
Last year, Mark joined CIMdata as Green Energy Practice Director & Executive Consultant. Listening to Mark, you will discover he has an exciting background, starting with the “Keeling Curve”, his early interest in oceanography and wind turbines, working with GE later in his career and many years active in the PLM domain.
Learn more from the 40-minute discussion with Mark below.
You can download the slides shown during the recording HERE
What we have learned
CIMdata has been discussing and promoting a circular economy already for a long time. A sustainable future and a circular economy have been a theme in many of the PLM Roadmap & PDT conferences. It is a logical relation as implementing a circular strategy depends significantly on the product design approach.
CIMdata also combines Sustainability with the need to digitize the processes and data handled. A data-driven approach will allow companies to measure (and estimate) better their environmental impact.
- CIMdata believes sustainability must be embedded in PLM for companies to reduce their product carbon footprint, and they must have greater visibility into their supply chain.
Mark mentions that focusing on a sustainable business model (product & business) is crucial for survival in the upcoming years, and this has increasingly landed at the board level of companies.
- The major change has to be driven by the business. PLM vendors will not drive the change; they will align their portfolio offerings based on the market needs.
- It was clear Mark has a lot of experience in wind energy throughout his whole lifecycle 😊
Want to learn more
Mark already pointed to several valuable resources in our discussion to learn more. Here are the most important links related to CIMdata
- Sustainability and Green Energy Consulting Practice
Recent webinar: The Green Energy Transition and Sustainability from January 23, 2023
- Upcoming webinar: Meeting Sustainability and Green Energy Transition Objectives: The Industrial Perspective, April 27, 2023, 11:00 AM EDT
Conclusions
Last year we discussed sustainability with the software vendors and their product offerings. They all mentioned the importance of a data-driven approach and education. CIMdata has broadened the available sustainability offering for companies by providing additional education and strategy support.
Education at all levels is essential to make sustainable decisions. Sustainable for the company’s business and, above all, sustainable for the planet.
I will be @LiveWorx in Boston, aiming to discuss PLM and Sustainability on behalf of the PGGA with PTC thought leaders. Will you be there too?
I was happy to present and participate at the 3DEXPERIENCE User Conference held this year in Paris on 14-15 March. The conference was an evolution of the previous ENOVIA User conferences; this time, it was a joint event by the ENOVIA and NETVIBES brands.
The conference was, for me, like a reunion. As I have worked for over 25 years in the SmarTeam, ENOVIA and 3DEXPERIENCE ecosystem, I met people I have worked with and have not seen for over fifteen years.
My presentation: Sustainability Demands Virtualization – and it should happen fast was based on explaining the transformation from a coordinated (document-driven) to a connected (data-driven) enterprise.
There were 100+ attendees at the conference, mainly from Europe. Most of the presentations came from customers, and the breakout sessions gave attendees a chance to dive deeper into the Dassault Systèmes portfolio.
Here are some of my impressions.
The power of ENOVIA and NETVIBES
I had a traditional view of the 3DEXPERIENCE platform based on my knowledge of ENOVIA, CATIA and SIMULIA, as many of my engagements were in the domain of MBSE or a model-based approach.
However, at this conference, I discovered the data intelligence side that Dassault Systèmes is bringing with its NETVIBES brand.
Where I would classify the ENOVIA part of the 3DEXPERIENCE platform as a traditional System of Record infrastructure (see Time to Split PLM?), I discovered that by adding NETVIBES on top of the 3DEXPERIENCE platform and other data sources, the potential scope changes significantly. See the image below:
As we can see, the ontologies and knowledge graph layer make it possible to make sense of all the indexed data below, including the data from the 3DEXPERIENCE Platform, which provides a modern data-driven layer for its consumers and apps.
The applications on top of this layer, standard or developed, can be considered Systems of Engagement.
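To make the idea of an ontology/knowledge-graph layer tangible, here is a toy sketch – in no way Dassault Systèmes' actual implementation – of how heterogeneous records can be indexed as subject-predicate-object triples so that apps on top can query across sources. All names and records below are made up for illustration.

```python
# Toy triple store: index heterogeneous records as
# (subject, predicate, object) triples so consumers can
# query across data sources in a uniform way.
triples = set()

def index(subject, predicate, obj):
    """Add one fact to the shared index."""
    triples.add((subject, predicate, obj))

# Records from two hypothetical "sources": the PLM platform and a field-data feed.
index("part:P1", "hasRevision", "B")
index("part:P1", "usedIn", "product:X200")
index("product:X200", "reportedFailure", "sensor:overheat")

def query(predicate):
    """Return all (subject, object) pairs linked by the given predicate."""
    return {(s, o) for s, p, o in triples if p == predicate}

print(query("usedIn"))  # {('part:P1', 'product:X200')}
```

The point is not the three lines of code but the principle: once everything is indexed in one graph-like layer, a Systems of Engagement app can traverse from a field failure back to a part revision without knowing which source system owns each record.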
My curiosity now: will Dassault Systèmes keep supporting the “old” system of record approach – often based on BOM structures (see also my post: The Rise and Fall of the BOM) combined with the new data-driven environment? In that case, you would have both approaches within one platform.
The Virtual Twin versus the Digital Twin
It is interesting to notice that Dassault Systèmes consistently differentiates between the definition of the Virtual Twin and the Digital Twin.
According to the 3DS.com website:
Digital Twins are simply a digital form of an object, a virtual version.
Unlike a digital twin prototype that focuses on one specific object, Virtual Twin Experiences let you visualize, model and simulate the entire environment of a sophisticated experience. As a result, they facilitate sustainable business innovation across the whole product lifecycle.
Understandably, Dassault Systèmes makes this differentiation. With the implementation of the Unified Product Structure, they can connect CAD geometry as datasets to other non-CAD datasets, like eBOM and mBOM data.
The Unified Product Structure was not the topic of this event but is worthwhile to notice.
REE Automotive
The presentation from Steve Atherton from REE Automotive was interesting because here we saw an example of an automotive startup that decided to go pure for the cloud.
REE Automotive is an Israeli technology company that designs, develops, and produces electric vehicle platforms. Their mission is to provide a modular and scalable electric vehicle platform that can be used by a wide range of industries, including delivery and logistics, passenger cars, and autonomous vehicles.
Steve Atherton is the PLM 3DEXPERIENCE lead for REE at the Engineering Centre in Coventry in the UK, where most of their designers are based. REE also has an R&D center in Tel Aviv, with offshore support from India and satellite offices in the US.
REE decided from the start to implement its PLM backbone in the cloud, a logical choice for such a globally spread company.
The cloud was also one of the conference's central themes, and it was interesting to see that a startup company like REE is pushing for an end-to-end solution based on the cloud. So often, you see startups choose traditional systems because the senior members of the startup bring their (legacy) PLM knowledge to their next company.
The current challenge for REE is implementing the manufacturing processes (EBOM- MBOM) and complying as much as possible with the out-of-the-box best practices to make their cloud implementation future-proof.
Groupe Renault
Olivier Mougin, Head of PLM at Groupe RENAULT, talked about their Renaulution Virtual Twin (RVT) program. Renault has always been a strategic partner of Dassault Systèmes.
I remember them as one of the first references for the ENOVIA V6 backbone.
The Renaulution Virtual Twin ambition: from engineering to enterprise platform, is enormous, as you can see below:
Each of the three pillars has transformational aspects beyond traditional ways of working. For each pillar, Olivier explained the business drivers, expected benefits, and why a new approach is needed. I will not go into the details in this post.
However, you can see the transformation from an engineering backbone to an enterprise collaboration platform – The Renaulution!
Ahmed Lguaouzi, head of marketing at NETVIBES, reinforced the extended power of data intelligence on top of an engineering landscape as the target architecture.
Renault’s ambition is enormous – the ultimate dream of digital transformation for a company with a great legacy. The mission will challenge Renault and Dassault Systèmes to implement this vision, which can become a lighthouse for others.
3DS PLM Journey at MIELE
An exciting session close to my heart was the digital transformation story from MIELE, explained by André Lietz, head of the IT Products PLM @ Miele. As an old MIELE dishwasher owner, I was curious to learn about their future.
Miele has been a family-owned business since 1899, making high-end domestic and commercial equipment. They are a typical example of the power of German mid-market companies. Moreover, family-owned gives them stability and the opportunity to develop a multi-year transformation roadmap without being distracted by investor demands every few years.
André, with his team, is responsible for developing the value chain inside the product development process (PDP), the operation of nearly 90 IT applications, and the strategic transformation of the overarching PLM Mission 2027+.
As the slide below illustrates, the team is working on four typical transformation drivers:
- Providing customers with connected, advanced products (increasing R&D complexity)
- Providing employees with a modern, digital environment (the war for digital talent)
- Providing sustainable solutions (addressing the whole product lifecycle)
- Improving internal end-to-end collaboration and information visibility (PLM digital transformation)
André talked about their DELMIA pilot plant/project and its benefits to connect the EBOM and MBOM in the 3DEXPERIENCE platform. From my experience, this is a challenging topic, particularly in German companies, where SAP dominated the BOM for over twenty years.
I am curious to learn more about the progress in the upcoming years. The vision is there; the transformation is significant, but they have the time to succeed! This can be another digital transformation example.
And more …
Besides some educational sessions by Dassault Systèmes (Laurent Bertaud – NETVIBES data science), there were also other interesting customer testimonies from Fernando Petre (IAR80 – Fly Again project), Christian Barlach (ISC Sustainable Construction) and Thelma Bonello (Methode Electronics – end-to-end BOM infrastructure). All sessions helped to get a better understanding of what is possible and what is being done in the PLM domain.
Conclusion
I learned a lot during these days, particularly about the virtual twin strategy and the related capabilities of data intelligence. As the event was also a reunion for me with many people from my network, I discovered that we all aim at a digital transformation. We have a mission and a vision. The upcoming years will be crucial for implementing the mission and realizing the vision. It will be the early adopters like Renault pushing Dassault Systèmes to deliver. I hope to stay tuned. You too?
NOTE: Dassault Systèmes covered some of the expenses associated with my participation in this event but did not in any way influence the content of this post.
I am writing this post because one of my PLM peers recently asked me this question: “Is the BOM losing its position?” He was in discussion with another colleague who told him:
“If you own the BOM, you own the Product Lifecycle”.
This statement made me think of a recent post from Jan Bosch: Product Development fallacy #8: the bill of materials has the highest priority.
Software is increasingly becoming an essential part of the final product, and combined with Jan’s expertise in software development, this led him to write this article. I recommend reading the full post (4 min read) and then browsing through the comments.
If you cannot afford these 10 minutes, here is my favorite quote from the article:
An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.
Where did the BOM focus come from? A historical overview related to the rise (and fall) of the BOM.
In the beginning, there was the drawing.
Before the era of computers, there was “THE drawing”, describing assemblies, subassemblies or parts. And on the drawing, you could find the parts list, if relevant. This parts list was the first Bill of Materials, describing the parts/materials shown on the drawing.
Next came MRP/ERP
With the introduction of the MRP system (Material Requirements Planning), computers could for the first time collect the material requirements for a product as data and process them. Entering new materials/parts described on drawings was still a manual process, as was referring to existing parts on the drawing. Reuse of parts was a manual process based on individual knowledge.
In the nineties, MRP evolved into ERP (Enterprise Resource Planning), which included the MRP part and added resource and manufacturing planning and financial reporting.
The ERP system became the most significant IT system, the execution system of the company. As it was the first enterprise system implemented, it was the first moment we learned about implementation challenges – people, change management and budget overruns. However, as the ERP system brought visibility to the company’s execution, it became a “must-have” system for management.
The introduction of mainstream 2D CAD did not affect the company’s culture so much. Drawings became electronic drawings, and the methodology of the parts list on the drawing remained.
Sometimes the interaction with the MRP/ERP system was enhanced by an interface – sending the drawing BOM to ERP. The advantage of the interface: no manual transfer of data, reducing typos and BOM errors. The disadvantages at that time: relatively expensive (connectivity between systems was a challenge) and mostly one-directional.
And then there was PDM.
In parallel with the introduction of ERP systems, mainstream 3D CAD systems became affordable, particularly SolidWorks, Solid Edge and Inventor. These 3D CAD systems allow sharing of parts and assemblies in different products, and the PDM database was the first aid to support part reuse, versioning and standardization.
By extracting the parts from the assemblies and subassemblies, it was possible to generate a BOM structure in the PDM system to be transferred or typed into the ERP system. We did not talk about EBOM or MBOM then, as there was only one BOM in the ERP system, and the PDM system was a tool to feed the ERP system.
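The extraction described above can be sketched as a simple roll-up of part quantities from the CAD assembly tree. This is a minimal illustration with a hypothetical tree of (name, children) tuples – not any specific PDM system's API.

```python
from collections import Counter

# Hypothetical assembly structure: each node is (name, children),
# where children is a list of (quantity, sub-node) pairs.
# Leaf nodes (no children) are parts; the others are (sub)assemblies.
def flatten_bom(node, multiplier=1, bom=None):
    """Roll up part quantities from a nested CAD assembly tree."""
    if bom is None:
        bom = Counter()
    name, children = node
    if not children:                 # a leaf: an actual part
        bom[name] += multiplier
        return bom
    for qty, child in children:
        flatten_bom(child, multiplier * qty, bom)
    return bom

bolt = ("M6-bolt", [])
bracket = ("bracket", [])
sub = ("hinge-assy", [(2, bolt), (1, bracket)])
top = ("door-assy", [(3, sub), (4, bolt)])

print(flatten_bom(top))   # Counter({'M6-bolt': 10, 'bracket': 3})
```

The flattened result is essentially what early PDM systems generated for transfer (or manual re-typing) into the single BOM in ERP.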
Many companies still base their processes on this approach. ERP (read: SAP nowadays) is the central execution system, and PDM is an external system. You might remember the story and image from my previous post about people, processes and tools. The bad practice example: asking the ERP system to provide a part number when starting to design a part.
And then products started to change.
In the early 2000s, I worked with SmarTeam to define the E&E (Electronics and Electrical) template. One of the new concepts was to synchronize all design data coming from different disciplines to a single BOM structure.
It was the time we started to talk about the EBOM: a type of BOM serving as the structure to consolidate all the design data, based on parts.
The EBOM most of the time reflects the design intent in logical groups, and sending the relevant parts in the correct order to the ERP system was a favorite, expensive customization for service providers. How do you transfer an engineering BOM view to an ERP system that only understands the manufacturing view?
Note: not all ERP systems have a data model that differentiates between engineering parts and manufacturing parts.
The image below illustrates the challenge and the customer’s perception.
The automated link between the design side (EBOM) and manufacturing side (MBOM) was a mission impossible – too many exceptions for the (spaghetti) code.
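One reason the automated link was a mission impossible is that manufacturing needs a differently shaped structure. A typical example: “phantom” design groupings in the EBOM must be collapsed in the manufacturing view. Below is a minimal sketch of just this one transformation, using a hypothetical tree of (name, children) tuples; real implementations face many more exceptions, which is exactly where the spaghetti code came from.

```python
# Hypothetical EBOM -> MBOM restructuring: collapse "phantom" assemblies
# (design groupings that are never physically assembled), so their
# children move up to the parent level in the manufacturing view.
# A node is (name, children); children is a list of (qty, sub-node) pairs.
def to_mbom(node, phantoms):
    name, children = node
    new_children = []
    for qty, child in children:
        child_name, _ = child
        if child_name in phantoms:
            # inline the phantom's children, multiplying quantities
            for c_qty, c_child in to_mbom(child, phantoms)[1]:
                new_children.append((qty * c_qty, c_child))
        else:
            new_children.append((qty, to_mbom(child, phantoms)))
    return (name, new_children)

screw = ("screw", [])
panel = ("panel", [])
design_group = ("fastener-set", [(4, screw)])    # phantom in the EBOM
ebom = ("cabinet", [(2, design_group), (1, panel)])

mbom = to_mbom(ebom, phantoms={"fastener-set"})
print(mbom)  # ('cabinet', [(8, ('screw', [])), (1, ('panel', []))])
```

Each additional rule (alternate parts, plant-specific routings, packaging materials that do not exist in the EBOM at all) adds another branch to such code, which explains why a fully automated one-way link rarely survived reality.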
And then came the MBOM.
The identified issues connecting PDM and ERP led to the concept of implementing the MBOM in the PLM system. The MBOM in PLM is one of the characteristics of a PLM implementation compared to a PDM implementation. In a traditional PLM system, there is an interaction and connection between the EBOM and MBOM: EBOM parts should end up as MBOM parts. This interaction can be supported by automation; however, as it is in the same system, manual changes remain possible.
The MBOM structure in PLM could then be the information structure to transfer to the ERP system; however, there is more, as Jörg W. Fischer wrote in his provoking post Die MBOM muss weg (The MBOM must go). He rightly points out (in German) that the MBOM is not a structure on its own but a combination of different views based on Assembly Drawings, Process Planning and Material Requirements.
His conclusion:
Calling these structures MBOM tries to squeeze all three structures into one. That usually doesn’t work and then leads to much more emotional discussions in the project. It also costs a lot of money. It is, therefore, better not to use the term MBOM at all.
And indeed, just having an MBOM in your PLM system might help you to prepare some of the manufacturing steps, the needed resources and parts. The MBOM result still has to be localized at the local plant where the manufacturing takes place. And here, the systems used are the ERP system and the MES system.
The main advantage of having the MBOM in the PLM system is the direct relation between specification and manufacturing intent, allowing manufacturing engineering to work collaboratively with engineering in the same environment.
- The first benefit is fewer iterations and a shorter time to production, thanks to early interaction and manufacturing involvement in the engineering process.
- The second benefit: product knowledge is centralized in a single system. Consolidating your product knowledge in ERP does not make sense due to plant-specific localization and the missing capabilities to manage iterative engineering processes on parts that do not yet exist.
And then came the SBOM, the xBOM
Traditional PLM vendors and implementations kept using xBOM structures as placeholders for related specification data (mechanical designs, electrical, software deliverables, serialized products) – most of the time, related files.
And with this approach, talking about digital thread, PLM systems also touch on the concepts of Configuration Management.
I will not go into the details here but look at the two images by clicking on them and see a similar mindset.
It is about the traceability of information in structures and systems. These structures work well in a relatively static and linear product development and delivery environment, as illustrated below:
Engineering change and release processes are based on managing the changes in different structures from the left to the right.
And then came software!
Modern connected products are no longer purely mechanical products. The product’s functionality no longer depends on the mechanical properties but mainly on the embedded electronics and software used. For example, look at the mechanical design of a telecom transmission tower – its behavior mainly comes from non-mechanical components, and they can change over time. Still, the Bill of Materials contains a lot of concrete and steel parts.
The ultimate example is comparing a Tesla (software on wheels) with a traditional car. For modern connected products, electronics and software need to be part of the solution. Software and electronics allow the product to be upgraded over time. Managing these products in the same manner as mechanical products is impossible, inefficient and therefore threatening your company’s future business.
I requote Jan Bosch:
An excessive focus on the bill of materials leads to significant challenges for companies that are undergoing a digital transformation and adopting continuous value delivery. The lack of headroom, high coupling and versioning hell may easily cause an explosion of R&D expenditure over time.
The model-based, connected enterprise
I will not solve the puzzle of the future in this post. You can read my observations in my series: The road to model-based and connected PLM. We need a new infrastructure with at least two modes. One that still serves as a System of Record, storing information in a traditional manner, like a Bill of Materials for the static parts, as not everyone and everything can be connected.
In addition, we need various Systems of Engagement that enable close-to-real-time interaction between products (systems) and the relevant stakeholders for the engagement scope (multidisciplinary / consumers).
Digital twins are examples of such environments. Currently, these Systems of Engagement often work disconnected from the System of Record due to the lack of understanding of how to connect. (standard connectors? / OSLC?)
Our mission is to explore, as I wrote in my post Time to split PLM and drop our mechanical mindset.
And while I was finalizing this post, I read a motivating post from Jan Bosch again for all of you working on understanding and pushing the digital transformation in your eco-system.
The title: Be the protagonist of your life: 15 rules. A starting point for more to come.
Conclusion
The BOM is no longer the master of the product lifecycle when it comes to managing connected products, where functionality mainly depends on software. BOM structures with related documents are just one of the extracted baselines from a data-driven, connected enterprise. This traditional PLM infrastructure requires other, non-BOM-driven structures to represent the actual status of a virtual or physical product.
The BOM is not dead, but there is more ………
Your thoughts?
Those who have read my blog posts over the years will have seen the image to the left.
The people, processes and tools slogan points to the best practice of implementing (PLM and CM) systems.
Theoretically, a PLM implementation will move smoothly if the company first agrees on the desired processes and people involved before a system implementation using the right tools.
Too often, companies start from their historical landscape (the tools – starting with a vendor selection) and then try to figure out the optimal usage of their systems. The best example of this approach is the interaction between PDM(PLM) and ERP.
PDM and ERP
Historically, ERP was the first enterprise system that most companies implemented. For product development, there was the PDM system, an engineering tool, and for execution, there was the ERP system. Since ERP focuses on the company’s execution, the system became the management’s favorite.
The ERP system and its information were needed to run and control the company. Unfortunately, this approach introduced the idea that the ERP system should also be the source of part information, as it was often the first enterprise system in a company. The PDM system was often considered an engineering tool only. And when we talk about a PLM system, who really implements PLM as an enterprise system, or is it still an engineering tool?
This is an example of Tools, Processes, and People – A BAD PRACTICE.
Imagine an engineer who wants to introduce a new part needed for a product to deliver. In many companies at the beginning of this century, even before starting the exercise, the engineer had to request a part number from the ERP system. This is implementation complexity #1.
Next, the engineer starts developing versions of the part based on the requirements. Ultimately the engineer might come to the conclusion this part will never be implemented. The reserved part number in ERP has been wasted – what to do?
It sounds weird, but this was a reality in discussions on this topic until ten years ago.
Next, as the ERP system could only deal with 7 digits, what about part number reuse? Reusing part numbers carries a considerable risk of errors. With the introduction of PLM systems, there was the opportunity to bridge the gap between engineering and manufacturing. Now it is clear for most companies that the engineer should create the initial part number.
Only when the conceptual part is approved for use in the realization of the product is an exchange with the ERP system needed. Whether the same part number is used or not does not matter, as long as we can map both identifiers between these environments and have traceability.
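The mapping can be illustrated with a minimal cross-reference sketch, assuming a hypothetical setup where the PLM system owns part creation and the ERP identifier is assigned only on release. All class names and identifiers below are made up for illustration.

```python
# Minimal sketch of a PLM/ERP part-number cross-reference,
# assuming the PLM system owns part creation and the ERP
# identifier is assigned only when the part is released.
class PartCrossReference:
    def __init__(self):
        self._plm_to_erp = {}
        self._erp_to_plm = {}

    def release(self, plm_id, erp_id):
        """Record the mapping when an approved part is sent to ERP."""
        if plm_id in self._plm_to_erp or erp_id in self._erp_to_plm:
            raise ValueError("identifier already mapped")
        self._plm_to_erp[plm_id] = erp_id
        self._erp_to_plm[erp_id] = plm_id

    def erp_id(self, plm_id):
        # None means the part is still conceptual: never released to ERP,
        # so no ERP number was ever "wasted" on it.
        return self._plm_to_erp.get(plm_id)

xref = PartCrossReference()
xref.release("PLM-000123", "7001234")
print(xref.erp_id("PLM-000123"))  # 7001234
print(xref.erp_id("PLM-000999"))  # None - conceptual part, never released
```

The design point is that the cross-reference, not either system's numbering scheme, carries the traceability, so each system can keep the identifier format it needs.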
It took almost 10 years from PDM to PLM until companies agreed on this approach, and I am curious about your company’s status.
Meanwhile, in the PLM world, we have evolved on this topic. The part and the BOM are no longer simple entities. Instead, we often differentiate between EBOM and MBOM, and the parts in those BOMs are not necessarily the same.
In this context, I like Prof. Dr. Jörg W. Fischer‘s framing:
EBOM is the specification, and MBOM is the realization.
(Unfortunately, he writes a lot in German.)
An interesting discussion initiated by Jörg last week was again about the interaction between PLM and ERP. The article is an excellent example of how potentially mainstream enterprises are thinking. PLM = Siemens, ERP = SAP – an illustration of the “tools first” mindset before the ideal process is defined.
There was nothing wrong with that in the early days, as connectivity between different systems was difficult and expensive. Therefore, people with 20 years of experience might still rely on their system infrastructure instead of the data flow.
But enough about the bad practice – let’s go to people, processes, (data) and tools.
People, Processes, Data and Tools?
I got inspired by this topic, seeing this post two weeks ago from Juha Korpela, claiming:
Okay, so maybe a hot take, maybe not, but: the old “People, Process, Technology” trinity is one of the most harmful thinking patterns you can have. It leaves out a key element: Data.
His full post was quite focused on data, and I liked the “wrapping post” from Dr. Nicolas Figay here, putting things more in perspective from his point of view. The reply made me think about how this discussion fits into the PLM digital transformation discussion. How would it work in the two major themes I use to explain the digital transformation in the PLM landscape?
For incidental readers of my blog, these are the two major themes I am using:
- From Coordinated to Connected, based on the famous diagram from Marc Halpern (image below). The coordinated approach based on documents (files) requires a particular timing (processes) and context (Bills of Information) – it is the traditional and current PLM approach for most companies. On the other hand, the Connected approach is based on connected datasets (here, we talk about data – not files). These connected datasets are available in different contexts, in real-time, to be used by all kinds of applications, particularly modeling applications. Read about it in the series: The road to model-based and connected PLM.
- The need to split PLM, thinking in System(s) of Record and Systems of Engagement (example below). The idea behind this split is driven by the observation that companies need various Systems of Record for configuration management, change management, compliance and realization. These activities sound like traditional PLM targets and could still be done in these systems. New in the discussion is the System of Engagement, which focuses on a specific value stream in a digitally connected manner. Here, data is essential. I discussed the coexistence of these two approaches in my post Time to Split PLM. A post on LinkedIn with many discussions and reshares illustrates that the topic is hot. And I am happy to discuss “split PLM architectures” with all of you.
These two concepts discuss the processes and the tools, but what about the people? Here I came to the conclusion that, to complete the story, we have to imagine three kinds of people. And this will not be new: we have the creators of data, the controllers of data and the consumers of data. Let’s zoom in on their specifics.
A new representation?
I am looking for a new simplification of the people, processes and tools trinity, combined with data. I got inspired by the work Don Farr did at Boeing, where he worked on a new visual representation for the model-based enterprise. You might have seen the image on the left before – click on it to see it in detail.
I wrote the first time about this new representation in my post: The weekend after CIMdata Roadmap / PDT Europe 2018
Related to Configuration Management, Martijn Dullaart and Martin Haket have also worked on a diagram with their peers to depict the scope of CM and Impact Analysis. The image leads to the post with my favorite quote: Communication is merely an exchange of information, but connections tell the story.
Below I share my first attempt to combine the people, process and tools trinity with the concepts of document and data, system(s) of record and system(s) of engagement, trying to build the story. See if you recognize the aspects of the discussion above, and feel free to develop enhancements.
I look forward to your suggestions, like the understanding that we have to split PLM thinking, as it impacts how we look at implementations.
Conclusion
Digital transformation in the PLM domain is forcing us to think differently. There will still be processes based on people collecting, interpreting and combining information. However, there will also be a new domain of connected data interpreted by models and algorithms, not necessarily depending on processes.
Therefore we need to work on new representations that can be used to tell this combined story. What do you think? How can we improve?
In this post, I want to explain why Model-Based Systems Engineering (MBSE) and Sustainability are closely connected. I would claim sustainability in our PLM domain will depend on MBSE.
Can we achieve Sustainability without MBSE? Yes, but it will be costly and slow. And as all businesses want to be efficient and agile, they should consider MBSE.
What is MBSE?
The abbreviation MBSE stands for Model-Based Systems Engineering, a specialized way of performing Systems Engineering. In short, the Wikipedia definition:
MBSE is a technical approach to systems engineering that focuses on creating and exploiting domain models as the primary means of information exchange rather than on document-based information exchange.
Model-Based fits in the digital transformation scope of PLM – from a document-based approach to a data-driven, model-based one. In 2018, I focused on facets of the model-based enterprise and related to MBSE in this post: Model-Based: System Engineering (MBSE).
My conclusion in that post was:
Model-Based Systems Engineering might have been considered as a discipline for the automotive and aerospace industry only. As products become more and more complex, thanks to IoT-based applications and software, companies should consider evaluating the value of model-based systems engineering for their products/systems.
I drew this conclusion before I focused on sustainability and systems thinking. Implementing sustainability concepts, like the Circular Economy, requires more complex engineering efforts, justifying a Model-Based Systems Engineering approach. Let's have a look.
If you want to learn more about why we need MBSE, look at this excellent keynote lecture by Zhang Xin Guo at the INCOSE 2018 conference below:
The Mission / the stakeholders
A company might deliver products to the market with the best price/quality ratio and regulatory compliance, perceived and checked by the market. This approach is purely focusing on economic parameters.
There is no need for a system engineering approach as the complexity is manageable. The mission is more linear, a “job to do,” and a limited number of stakeholders are involved in this process.
… with sustainability
Once we start to include sustainability in our product's mission, we need a systems engineering approach, as several factors will push for different considerations. The most obvious considerations are the choice of materials and the optimization of the production process (reducing carbon emissions).
However, the repairability/serviceability of the product should be considered with a more extended lifetime vision.
What about upgradeability and reusing components? Will the customer pay for these extra sustainable benefits?
Probably Yes, when your customer has a long-term vision, as the overall lifecycle costs of the product will be lower.
Probably No if your competitors deliver non-sustainable products much cheaper.
As long as regulations do not hurt traditional business models, there might be no significant change.
However, the change has already started. Higher energy prices will impact the production of specific resources and raise costs. In addition, energy-intensive manufacturing processes will lead to more expensive materials. Combined with raising carbon taxes, this will be a significant driver for companies to reconsider their product offering and manufacturing processes.
The more expensive it becomes to create new products, the more attractive repairable and upgradable products will become. And this brings us to the concept of the circular economy, which is one of the pillars of sustainability.
In short, looking at the diagram: the vertical flow from renewables and finite materials through part and product to product-in-service ultimately leads to wasted resources if there are no feedback loops. This is the traditional product delivery process that most companies are using.
You can click on the image to the left to zoom in on the details.
The renewable loop on the left side of the diagram is the usage of renewables during production and the use of the product. The more we use renewables instead of fossil fuels, the more sustainable this loop will be. This is the area where engineers should use simulations to find the optimal manufacturing processes and product behavior. Again click on the image to zoom in on the details.
The right side of the diagram, related to the materials, is where we see the options for repairable, serviceable and upgradeable products, and even further refurbishment and recycling to avoid leakage of precious materials. This is where mechanical engineers should dominate the activities, focusing on each of the loops and how to enable them in the product. Click on the image to see the relevant loops.
Looking at the circular economy diagram, it is clear that we are no longer talking about a linear process – it has become the implementation of a system. Systems Engineering or MBSE?
The benefits of MBSE
Developing products with the circular economy in mind is no longer a “job to do,” a simple linear exercise. Instead, if we walk down the systems engineering V-shape, there are a lot of modeling exercises to perform before we reach the final solution.
To illustrate the benefits of MBSE, let’s walk through the following scenario.
A well-known company sells lighting projects for stadiums and public infrastructure. Their current business model is based on reliable lighting equipment with a competitive price and range of products.
Most of the time, their contracts have clauses about performance/cost and maintenance. The company sells the products when they win the deal and deliver spare parts when needed.
Their current product design is quite linear – without systems engineering.
Now this company has decided to change its business model towards Product as a Service, or in their terminology, LaaS (Lighting as a Service). For a certain amount per month, they will provide lighting to their customers: a stadium, a city, a road infrastructure.
To implement this business model, this is how they used a Model-Based Systems Engineering approach.
Modeling the Mission
Before even delivering any products, the process starts with describing and analyzing the business model needed for Lighting as a Service.
Then, with model-based estimates of the material costs, there are exercises about the resources required to maintain the service, the potential market and the possible price range.
It is the first step of using a model to define the mission of the service. After that, the model can be updated, adjusted, and used for a better go-to-market approach when the solution becomes more mature.
Part of the business modeling is also the intention to deliver serviceable and upgradeable products. As the company now owns the entire lifecycle, this is the cheapest way to guarantee a continuous or improved service over time.
Modeling the Functions
Providing Lighting as a Service also means you must be in touch with your installations in real time. Power consumption needs to be measured and analyzed in real-time for (predictive) maintenance, and the light-providing service should be as cheap as possible during operation.
Therefore, LED technology (the most reliable) and connectivity functions need to be implemented in the solution. The functional design ensures installation, maintenance and service can be done in a connected manner (cheapest in operation – beneficial for the business).
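The connected-maintenance idea above can be made concrete with a small sketch: flag luminaires whose measured power draw drifts from the expected baseline, a simple stand-in for the (predictive) maintenance logic such a LaaS solution would need. All unit names, wattages and thresholds here are hypothetical illustrations, not from any real product.

```python
# Minimal sketch: flag luminaires whose power draw deviates from the
# expected baseline. All identifiers and values are hypothetical.

EXPECTED_WATTS = 150.0   # nominal draw of one LED luminaire
TOLERANCE = 0.10         # flag anything more than 10% off baseline

readings = {             # latest telemetry per unit (unit id -> watts)
    "stadium-north-01": 149.2,
    "stadium-north-02": 171.8,  # drawing too much: possible driver failure
    "road-a12-07": 0.0,         # no draw at all: lamp likely dead
}

def needs_service(watts: float) -> bool:
    """True when draw deviates more than TOLERANCE from the baseline."""
    return abs(watts - EXPECTED_WATTS) / EXPECTED_WATTS > TOLERANCE

flagged = sorted(uid for uid, w in readings.items() if needs_service(w))
print(flagged)  # ['road-a12-07', 'stadium-north-02']
```

In a real solution, the baseline itself would come from the functional and logical models rather than a constant, which is exactly where the model-based approach pays off.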
Modeling the Logical components
As an owner of the solution, the design of the logical components of the lighting solution is also crucial. How to address various lighting demands efficiently? Modularity is one of the first topics to address. With modular components, it is possible to build customer-specific solutions with a reduced engineering effort. However, the work needs to be done by generically designing the solutions and focusing on the interfaces.
Such a design starts with a logical process and flow diagrams combined with behavior modeling. Without already having a physical definition, we can analyze the components’ behavior within an electrical scheme. Decisions on whether specific scenarios will be covered by hardware or software can be analyzed here. The company can define the lower-level requirements for the physical component by using virtual trade-offs on the logical models.
At this stage, we have used business modeling, functional modeling and logical modeling to understand our solution’s behavior.
Modeling the Physical product
The final stage of the solution design is to implement the logical components into a physical solution. The placement of components and interfaces between the components becomes essential. For the physical design, there are still a lot of sustainability requirements to verify:
- Repairability and serviceability – are the components reachable and replaceable? This reduces the lifecycle costs of the solution.
- Upgradeability – are there components that can behave differently due to software choices, or components that can be replaced with improved functionality? This reduces the cost of creating entirely new solutions.
- Reuse & recyclability – are the materials used in the solution recyclable or reusable? This reduces the cost of new materials or the cost of dumping waste.
- RoHS/ REACH compliance
The image below from Zhang Xin Guo's presentation nicely demonstrates the iterative steps before reaching a physical product.
Before committing to a hardware implementation, the virtual product can be analyzed, its behavior simulated, and its carbon impact calculated for the various potential variants.
The manufacturing process and energy usage during operation are also part of the carbon impact calculation. The best-performing virtual solution, including its simulation models, can be chosen for realization to ensure the most environmentally friendly solution.
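Such a virtual trade-off can be sketched in a few lines: score each variant on total carbon impact (manufacturing plus operational energy) and pick the best-performing one. The variant names, emission factor and figures below are hypothetical, purely to illustrate the comparison.

```python
# Minimal sketch of a virtual carbon trade-off between design variants.
# All figures are hypothetical illustrations.

GRID_KG_CO2_PER_KWH = 0.4   # assumed grid emission factor

variants = {
    # name: (manufacturing kg CO2, power in kW, operating hours over lifetime)
    "aluminium-housing": (900.0, 0.150, 50_000),
    "recycled-housing":  (450.0, 0.155, 50_000),
}

def lifetime_co2(mfg_kg: float, kw: float, hours: float) -> float:
    """Manufacturing footprint plus operational energy footprint, in kg CO2."""
    return mfg_kg + kw * hours * GRID_KG_CO2_PER_KWH

scores = {name: lifetime_co2(*v) for name, v in variants.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best]))  # recycled-housing 3550
```

Note how the recycled housing wins despite slightly higher power draw: exactly the kind of non-obvious outcome that justifies analyzing variants virtually before committing to hardware.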
The digital twin for follow-up
Once the solution has been realized, the company still has a virtual model of the solution. By connecting the physical product’s observed and measured behavior, the virtual side’s modeling can be improved or used to identify improvement candidates – maintenance or upgrades. At this stage, the virtual twin is the actual twin of the physical solution. Without going deeper into the digital twin at this stage, I hope you also realize MBSE is a starting point for implementing digital twins serving sustainability outcomes.
The image below, published by Boeing, illustrates the power of the connected virtual and physical world and the various types of modeling that help to assess the optimal solution.
Conclusion
For sustainability, it all starts with the design. The design decisions for the product contribute about 80% of the solution's carbon footprint; afterward, optimization is only possible within smaller margins. MBSE is the recommended approach to get a trustworthy understanding and follow-up of the product's environmental impact.
What do you think: can we create sustainable products without MBSE?
This year started for me with a discussion related to federated PLM. A topic that I highlighted as one of the imminent trends of 2022. A topic relevant for PLM consultants and implementers. If you are working in a company struggling with PLM, this topic might be hard to introduce in your company.
Before going into the discussion’s topics and arguments, let’s first describe the historical context.
The traditional PLM frame.
Historically, PLM was first framed as a system for engineering to manage its product data, so you could call it PDM first. After that, PLM systems were introduced and used to provide access to product data, upstream and downstream. The most common usage was the relation with manufacturing, leading to EBOM and MBOM discussions.
IT landscape simplification often led to an infrastructure of siloed solutions – PLM, ERP, CRM and later, MES. IT was driving the standardization of systems and defining interfaces between systems. System capabilities were leading, not the flow of information.
As many companies are still in this stage, I would call it PLM 1.0.
PLM 1.0 systems serve mainly as a System of Record for the organization, where disciplines consolidate their data in a given context, the Bills of Information. The Bill of Information is then the place to connect specification documents, i.e., CAD models, drawings and other documents, providing a digital thread.
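The Bill of Information described above is, in essence, a structure of part items referencing their specification documents. A minimal sketch (with hypothetical part numbers, document identifiers and field names) of how such a System of Record item carries the digital thread:

```python
# Minimal sketch of a Bill of Information item in a System of Record:
# parts reference their specification documents, giving a traceable
# digital thread. All identifiers are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Document:
    doc_id: str
    revision: str
    kind: str          # e.g. "CAD model", "drawing", "spec"

@dataclass
class PartItem:
    part_number: str
    revision: str
    documents: list = field(default_factory=list)
    children: list = field(default_factory=list)

bracket = PartItem("PN-1001", "B", documents=[
    Document("DOC-55", "B", "CAD model"),
    Document("DOC-56", "B", "drawing"),
])
assembly = PartItem("PN-2000", "A", children=[bracket])

# Walk the thread: from the assembly down to every referenced document.
thread = [(p.part_number, d.doc_id)
          for p in [assembly, *assembly.children] for d in p.documents]
print(thread)  # [('PN-1001', 'DOC-55'), ('PN-1001', 'DOC-56')]
```

The point of the coordinated approach is visible here: the record stores references in context, while the documents themselves are authored in the discipline-specific tools.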
The actual engineering work is done with specialized tools, MCAD/ECAD, CAE, Simulation, Planning tools and more. Therefore, each person could work in their discipline-specific environment and synchronize their data to the PLM system in a coordinated manner.
However, this interaction is not easy for some of the end-users. For example, the usability of CAD integrations with the PLM system is constantly debated.
Many of my implementation discussions with customers were in this context. For example, suppose your products are relatively simple, or your company is relatively small. In that case, the opinion is that the System of Record approach is overkill.
That’s why many small and medium enterprises do not see the value of a PLM backbone.
This may have been true until recently. However, the threats to this approach are digitization and regulations.
Customers, partners, and regulators all expect more accurate and fast responses on specific issues, preferably instantly. In addition, sustainability regulations might push your company to implement a System of Record.
PLM as a business strategy
For the past fifteen years, we have discussed PLM more as a business strategy, implemented with business systems and an infrastructure designed for sharing. I chose these words carefully to avoid overloading the expression: PLM as a business strategy.
The reason for this prudence is that, in reality, I have seen many PLM implementations fail due to the ambiguity of PLM as a system or strategy. Many enterprises have previously selected a preferred PLM Vendor solution as a starting point for their “PLM strategy”.

One of the most neglected best practices.
In reality, this means there was no strategy but a hope that with this impressive set of product demos, the company would find a way to support its business needs. Instead of people, process and then tools to implement the strategy, most of the time it started with the tools, trying to implement the processes and transform the people. That is not really the definition of business transformation.
In my opinion, this is happening because, at the management level, decisions are made based on financials.
Developing a PLM-related business strategy requires management understanding and involvement at all levels of the organization.
This is often not the case; the middle management has to solve the connection between the strategy and the execution. By design, however, the middle management will not restructure the organization. By design, they will collect the inputs from the end users.
And it is clear what end users want – no disruption in their comfortable way of working.
Halfway conclusion:
Rebranding PLM as a business strategy has not really changed the way companies work. PLM systems remain a System of Record mainly for governance and traceability.
To understand the situation in your company, look at who is responsible for PLM.
- If IT is responsible, then most likely, PLM is not considered a business strategy but more an infrastructure.
- If engineering is responsible for PLM, then you are still in the early days of PLM, the engineering tools to be consulted by others upstream or downstream.
Only when PLM accountability is at the upper management level might it be a business strategy (assuming the upper management understands the details).
Connected is the game changer
Connecting all stakeholders in an engagement has been a game changer in the world. With the introduction of platforms and the smartphone as a connected device, consumers could suddenly benefit from direct responses to desired service requests (Spotify, iTunes, Uber, Amazon, Airbnb, Booking, Netflix, …).
The business change: connecting all stakeholders in real time to deliver rapid results.
The question was: what would be the game changer in PLM? The image below shows Accenture's 2014 description of digital PLM and its potential benefits.
Is connected PLM a utopia?
Marc Halpern from Gartner shared in 2015 the slide below that you might have seen many times before. Digital Transformation is really moving from a coordinated to a connected technology, it seems.
The image below gives an impression of an evolution.
I had been following this concept until I was triggered by a 2017 McKinsey publication: "our insights/toward an integrated technology operating model".
This was my first indication that the future should be hybrid: a combination of traditional PLM (System of Record) complemented with teams that work digitally connected. McKinsey called them pods that become product-centric (multidisciplinary teams focusing on a product) instead of discipline-centric (marketing/engineering/manufacturing/service).
In 2019 I wrote the post: The PLM migration dilemma supporting the “shocking” conclusion “Don’t think about migration when moving to data-driven, connected ways of working. You need both environments.”
One of the main arguments behind this conclusion was that legacy product data and processes were not designed to ensure data accuracy and quality on such a level that it could become connected data. As a result, converting documents into reliable datasets would be a costly, impossible exercise with no real ROI.
The second argument was that the outside world, customers, regulatory bodies and other non-connected stakeholders still need documents as standardized deliverables.
The conclusion led to the image below.

Systems of Record (left) and Systems of Engagement (right)
Splitting PLM?
In 2021 these thoughts became more mature through various publications and players in the PLM domain.
We saw the upcoming of Systems of Engagement – I discussed OpenBOM, Colab and potentially Configit in the post: A new PLM paradigm. These systems can be characterized as connected solutions across the enterprise and value chain, focusing on a platform experience for the stakeholders.
These are all environments addressing the needs of a specific group of users as efficiently and as friendly as possible.
A System of Engagement will not fit naturally in a traditional PLM backbone; the System of Record.
Erik Herzog of SAAB Aerospace and Yousef Houshmand, at that time with Daimler, published papers that year related to "Federated PLM" or "The end of monolithic PLM". They acknowledged a company needs to focus on more than a single PLM solution. The presentation from Erik Herzog at the PLM Roadmap/PDT conference was interesting because Erik talked about the Systems of Engagement and the Systems of Record. He proposed using OSLC as the standard to connect these two types of PLM.
It was a clear example of an attempt to combine the two kinds of PLM.
And here comes my question: Do we need to split PLM?
When I look at PLM implementations in the field, almost all are implemented as a System of Record, an information backbone proved by a single vendor PLM. The various disciplines deliver their content through interfaces to the backbone (Coordinated approach).
However, there is low usability or support for multidisciplinary collaboration; the PLM backbone is not designed for that.
Thanks to the concepts of Model-Based Systems Engineering (MBSE) and Model-Based Definition (MBD), there are now solutions on the market that allow different disciplines to work jointly on connected datasets that can be manipulated using modeling software (1D, 2D, 3D, 4D, …).
These environments, often a mix of software and hardware tools, are the Systems of Engagement and provide speedy, high-quality results in the virtual world. Digital Twins run on Systems of Engagement, not on Systems of Record.
Systems of Engagement do not need to come from the same vendor, as they serve different purposes. But how to explain this to your management, who want simplicity? I can imagine the IT organization has a better understanding of this concept as, at the end of 2015, Gartner introduced the concept of the bimodal approach.
Their definition:
Mode 1 is optimized for areas that are more well-understood. It focuses on exploiting what is known. This includes renovating the legacy environment so it is fit for a digital world. Mode 2 is exploratory, potentially experimenting to solve new problems. Mode 2 is optimized for areas of uncertainty. Mode 2 often works on initiatives that begin with a hypothesis that is tested and adapted during a process involving short iterations.
No Conclusion – but a question this time:
At the management level, unfortunately, there is most of the time still a "Single PLM" mindset due to a lack of understanding of the business. Clearly, splitting your PLM seems the way forward. IT could be ready for this, but will the business realize this opportunity?
What are your thoughts?