Recently, we initiated the Design for Sustainability workgroup, an initiative from two of our PGGA members, Erik Rieger and Matthew Sullivan. You can find a recording of the kick-off here on our YouTube channel.
Thanks to the launch of the Design for Sustainability workgroup, we were introduced to Dr. Elvira Rakova, founder and CEO of the startup company Direktin.
Her mission is to build the Digital Ecosystem of engineering tools and simulation for Compressed Air Systems. As typical PLM professionals with a focus on product design, we were curious to learn about developments in the manufacturing space. And it was an interesting discussion, almost a lecture.
Compressed air and Direktin
Dr. Elvira Rakova has been working with compressed air in manufacturing plants for several years, during which she has observed the inefficiency of how compressed air is utilized in these facilities. It is an available resource for all kinds of machines in the plant, often overdimensioned and a significant source of wasted energy.
To address this waste of energy, and the CO2 emissions linked to it, she started her company to help companies scale, dimension, and analyse their compressed air usage. It is a mix of software and consultancy that makes manufacturing processes using compressed air responsible for fewer carbon emissions, while saving plant owners significant money on energy.
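To give a feeling for the order of magnitude, here is a back-of-the-envelope sketch of what a single compressed-air leak can cost per year; all numbers are illustrative assumptions, not figures from Direktin.

```python
# Back-of-the-envelope estimate of the yearly cost of a single compressed-air leak.
# All numbers below are illustrative assumptions, not figures from Direktin.

def leak_cost_per_year(leak_flow_m3_per_min: float,
                       specific_power_kwh_per_m3: float = 0.12,   # assumed compressor efficiency at ~7 bar
                       operating_hours: float = 6000,             # assumed yearly production hours
                       electricity_price_eur_per_kwh: float = 0.20) -> float:
    """Return the estimated yearly electricity cost of a continuous leak."""
    yearly_volume_m3 = leak_flow_m3_per_min * 60 * operating_hours
    yearly_energy_kwh = yearly_volume_m3 * specific_power_kwh_per_m3
    return yearly_energy_kwh * electricity_price_eur_per_kwh

# A single modest leak of ~0.5 m3/min would already cost roughly:
print(f"~EUR {leak_cost_per_year(0.5):,.0f} per year")
```

With these assumed numbers, one leak already costs several thousand euros per year, which explains why a plant full of overdimensioned and leaking installations adds up quickly.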
For us, it was an educational discussion, and we recommend that you watch or listen to the next 36 minutes.
What I learned
- The use of compressed air and its energy/environmental impact were like dark matter to me. I never noticed it when visiting customers as a significant source of sustainability improvements.
- Although the topic of compressed air seems easy to understand, its usage and impact are tough to address quickly and easily, due to legacy in plants, a lack of visibility on compressed air (energy) usage and needs, and a lack of standardization among the providers of machinery.
- The need for data analysis is crucial in addressing the reporting challenges of Scope 3 emissions, and it is also increasingly important as part of the Digital Product Passport data to be provided. Companies must invest in the digitalization of their plants to better analyze and improve energy usage, such as in the case of compressed air.
- In the end, we concluded that for sustainability, it is all about digital partnerships connecting the design world and the manufacturing world. For that reason, Elvira is personally motivated to join and support the Design for Sustainability workgroup.
Want to learn more?
- Another educational webinar: Design Review Culture and Sustainability
- Explore the Direktin website to learn more
Conclusions
The PLM Green Global Alliance is not only about designing products; we have also seen lifecycle assessments for manufacturing, as discussed with Makersite and aPriori. These companies focused more on traditional operations in a manufacturing plant. Through our lecture/discussion on the use of compressed air in manufacturing plants, we identified a new domain that requires attention.
The title of this post was influenced by one of Jan Bosch's daily reflections, #156: Hype-as-a-Service. You can read his full reflection here.
His post reminded me of a topic that I frequently mention when discussing modern PLM concepts with companies and peers in my network: Data Quality and Data Governance, sometimes in the context of the connected digital thread, and more recently in the context of applying AI in the PLM domain.
I’ve noticed that when I emphasize the importance of data quality and data governance, there is always a lot of agreement from the audience. However, when discussing these topics with companies, the details become vague.
Yes, there is a desire to improve data quality, and yes, we push our people to improve the quality processes of the information they produce. Still, I was curious if there is an overall strategy for companies.
And who better to talk to than Rob Ferrone, well known as "The original Product Data PLuMber"? Together, we will discuss the topic of data quality and governance in two posts. Here is part one – defining the playground.
The need for Product Data People
During the Share PLM Summit, I was inspired by Rob’s theatre play, “The Engineering Murder Mystery.” Thanks to the presence of Michael Finocchiaro, you might have seen the play already on LinkedIn – if you have 20 minutes, watch it now.
Rob's ultimate plea was to add product data people to your company to make the data reliable and flowing. So, for me, he is the right person to explain what data quality and data governance mean in reality – or is it still hype?
What is data?
Hi Rob, thank you for having this conversation. Before discussing quality and governance, could you share with us what you consider ‘data’ within our PLM scope? Is it all the data we can imagine?
I propose that relevant PLM data encompasses all product-related information across the lifecycle, from conception to retirement. Core data includes part or item details, usage, function, revision/version, effectivity, suppliers, attributes (e.g., cost, weight, material), specifications, lifecycle state, configuration, and serial number.
Secondary data supports lifecycle stages and includes requirements, structure, simulation results, release dates, orders, delivery tracking, validation reports, documentation, change history, inventory, and repair data.
Tertiary data, such as customer information, can provide valuable support for marketing or design insights. HR data is generally outside the scope, although it may be referenced when evaluating the impact of PLM on engineering resources.
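To make this scope tangible, here is a minimal sketch of how core and secondary data could be separated in code; the class and attribute names are my own illustration, not Rob's definition.

```python
from dataclasses import dataclass, field

# Minimal illustration of the data scope above; attribute names are my own choice.

@dataclass
class CorePartData:
    part_number: str
    revision: str
    lifecycle_state: str                                 # e.g. "In Work", "Released", "Obsolete"
    attributes: dict = field(default_factory=dict)       # cost, weight, material, ...
    suppliers: list = field(default_factory=list)

@dataclass
class SecondaryPartData:
    part_number: str
    requirements: list = field(default_factory=list)
    simulation_results: list = field(default_factory=list)
    change_history: list = field(default_factory=list)

pump = CorePartData("P-1001", "B", "Released",
                    attributes={"weight_kg": 2.4, "material": "AlSi10Mg"})
```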
What is data quality?
Now that we have a data scope in mind, I can imagine that there is also some nuance in the term 'data quality'. Do we strive for 100% correct data, and is the term "100% correct" perhaps too ambitious? How would you define and address data quality?
You shouldn’t just want data quality for data quality’s sake. You should want it because your business processes depend on it. As for 100%, not all data needs to be accurate and available simultaneously. It’s about having the proper maturity of data at the right time.
For example, when you begin designing a component, you may not need to have a nominated supplier, and estimated costs may be sufficient. However, missing supplier nomination or estimated costs would count against data quality when it is time to order parts.
And these deliverable timings will vary across components, so 100% quality might only be achieved when the last standard part has been identified and ordered.
It is more important to know when you have reached the required data quality objective for the top-priority content. The image below explains the data quality dimensions:

- Completeness (Are all required fields filled in?)
KPI Example: % of product records that include all mandatory fields (e.g., part number, description, lifecycle status, unit of measure)
- Validity (Do values conform to expected formats, rules, or domains?)
KPI Example: % of customer addresses that conform to ISO 3166 country codes and contain no invalid characters
- Integrity (Do relationships between data records hold?)
KPI Example: % of BOM records where all child parts exist in the Parts Master and are not marked obsolete
- Consistency (Is data consistent across systems or domains?)
KPI Example: % of product IDs with matching descriptions and units across PLM and ERP systems
- Timeliness (Is data available and updated when needed?)
KPI Example: % of change records updated within 24 hours of approval or effective date
- Accuracy (Does the data reflect real-world truth?)
KPI Example: % of asset location records that match actual GPS coordinates from service technician visits
Define data quality KPIs based on business process needs, ensuring they drive meaningful actions aligned with project goals.
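As an illustration, here is a minimal sketch of how two of the KPIs above (completeness and BOM integrity) could be measured on exported records; the record structure and field names are assumptions made for the example, not from any specific PLM system.

```python
# Minimal sketch: measuring two of the KPIs above on exported records.
# The record structure (dicts with these field names) is an assumption for illustration.

MANDATORY_FIELDS = ["part_number", "description", "lifecycle_status", "unit_of_measure"]

def completeness_kpi(parts: list[dict]) -> float:
    """% of part records where all mandatory fields are filled in."""
    complete = sum(1 for p in parts if all(p.get(f) for f in MANDATORY_FIELDS))
    return 100 * complete / len(parts)

def bom_integrity_kpi(bom_lines: list[dict], parts: list[dict]) -> float:
    """% of BOM lines whose child part exists in the parts master and is not obsolete."""
    valid_parts = {p["part_number"] for p in parts if p.get("lifecycle_status") != "Obsolete"}
    ok = sum(1 for line in bom_lines if line["child"] in valid_parts)
    return 100 * ok / len(bom_lines)

parts = [
    {"part_number": "P-1", "description": "Bracket", "lifecycle_status": "Released", "unit_of_measure": "pcs"},
    {"part_number": "P-2", "description": "", "lifecycle_status": "Obsolete", "unit_of_measure": "pcs"},
]
bom = [{"parent": "A-1", "child": "P-1"}, {"parent": "A-1", "child": "P-2"}]
print(completeness_kpi(parts), bom_integrity_kpi(bom, parts))   # 50.0 50.0
```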
While defining quality is one challenge, detecting issues is another. Data quality problems vary in severity and detection difficulty, and their importance can shift depending on the development stage. It’s vital not to prioritize one measure over others, e.g., having timely data doesn’t guarantee that it has been validated.
Like the VUCA framework, effective data quality management begins by understanding the nature of the issue: is it volatile, uncertain, complex, or ambiguous?
Not all "bad" data is flawed; some may be valid estimates, changes, or system-driven anomalies. Each scenario requires a tailored response; treating all issues the same can lead to wasted effort or overlooked insights.
Furthermore, data quality goes beyond the data itself—it also depends on clear definitions, ownership, monitoring, maintenance, and governance. A holistic approach ensures more accurate insights and better decision-making throughout the product lifecycle.
KPIs?
In many (smaller) companies, KPIs do not exist; they adjust their business based on experience and financial results. Are companies ready for these KPIs, or do they need to establish a data governance baseline first?
Many companies already use data to run parts of their business, often with little or no data governance. They may track program progress, but rarely systematically monitor data quality. Attention tends to focus on specific data types during certain project phases, often employing audits or spot checks without establishing baselines or implementing continuous monitoring.
This reactive approach means issues are only addressed once they cause visible problems.
When data problems emerge, trust in the system declines. Teams revert to offline analysis, build parallel reports, and generate conflicting data versions. A lack of trust worsens data quality and wastes time resolving discrepancies, making it difficult to restore confidence. Leaders begin to question whether the data can be trusted at all.
Data governance typically evolves; it’s challenging to implement from the start. Organizations must understand their operations before they can govern data effectively.
In start-ups, governance is challenging. While they benefit from a clean slate, their fast-paced, prototype-driven environment prioritizes innovation over stable governance. Unlike established OEMs with mature processes, start-ups focus on agility and innovation, making it challenging to implement structured governance in the early stages.
Data governance is a business strategy, similar to Product Lifecycle Management.
Before they go on the journey of creating data management capabilities, companies must first understand:
- The cost of not doing it.
- The value of doing it.
- The cost of doing it.
What is the cost associated with not doing data quality and governance?
Similar to configuration management, companies might find it a bureaucratic overhead that is hard to justify. As long as things are going well (enough) and the company’s revenue or reputation is not at risk, why add this extra work?
Product data quality is either a tax or a dividend. In Part 2, I will discuss the benefits; in this Part 1, I will focus on the cost of not doing it.
Every business has stories of costly failures caused by incorrect part orders, uncommunicated changes, or outdated service catalogs. It’s a systemic disease in modern, complex organisations. It’s part of our day-to-day working lives: multiple files with slightly different file names, important data hidden in lengthy email chains, and various sources for the same information (where the value differs across sources), among other challenges.

The image above is from Susan Lauda’s presentation at the PLMx 2018 conference in Hamburg, where she shared the hidden costs of poor data. Please read about it in my blog post: The weekend after PLMx Hamburg.
Poor product data can impact more than most teams realize. It wastes time—people chase missing info, duplicate work, and rerun reports. It delays builds, decisions, and delivery, hurting timelines and eroding trust. Quality drops due to incorrect specifications, resulting in rework and field issues. Financial costs manifest as scrap, excess inventory, freight, warranty claims, and lost revenue.
Worse, poor data leads to poor decisions, wrong platforms, bad supplier calls, and unrealistic timelines. It also creates compliance risks and traceability gaps that can trigger legal trouble. When supply chain visibility is lost, the consequences aren’t just internal; they become public.
For example, in the case of Tony’s Chocolonely, despite their ethical positioning, they were removed from the Slave Free Chocolate list after 1,700 child labour cases were discovered in their supplier network.
The good news is that most of the unwanted costs are preventable. There are often very early indicators that something is going to be a problem. They are just not being looked at.
Better data governance equals better decision-making power.
Visibility prevents the inevitable.
Conclusion of part 1
Thanks to Rob’s answers, I am confident that you now have a better understanding of what Data Quality and Data Governance mean in the context of your business. In addition, we discussed the cost of doing nothing. In Part 2, we will explore how to implement it in your company, and Rob will share some examples of the benefits.
Feel free to post your questions for the original Product Data PLuMber in the comments.
Four years ago, during the COVID-19 pandemic, we discussed the critical role of a data plumber.
In the last two weeks, I have had mixed discussions related to PLM, where I realized there are two different ways people can look at PLM. Is implementing PLM capabilities driven by a cost-benefit analysis and a business case? Or is it driven by a strategy providing business value for the company?
Most companies I am working with focus on the first option – there needs to be a business case.
This observation is a pleasant segue into a broader discussion started recently by Rob Ferrone with his article Money for nothing and PLM for free. He explains the PDM cost of doing business, which goes beyond the cost of the software. Often, companies consider the other expenses inescapable.
At the same time, Benedict Smith wrote some visionary posts about the potential power of an AI-driven PLM strategy, the most recent article being PLM augmentation – Panning for Gold.
It is a visionary article about what is possible in the PLM space (if there were no legacy ☹), based on Robust Reasoning, and how you could even start with LLM Augmentation for PLM "Micro-Tasks".
Interestingly, the articles from both Rob and Benedict were supported by AI-generated images – I believe this is the future: Creating an AI image of the message you have in mind.
When you have digested their articles, it is time to dive deeper into the different perspectives of value and costs for PLM.
From a system to a strategy
The biggest obstacle I have discovered is that people relate PLM to a system or, even worse, to an engineering tool. This 20-year-old misunderstanding probably comes from the fact that in the past, implementing PLM was more an IT activity – providing the best support for engineers and their data – than a business-driven set of capabilities needed to support the product lifecycle.
The System approach
Traditional organizations are siloed, and from the start, PLM had the challenge of supporting product information shared throughout the whole lifecycle, while no single discipline had an incentive to invest in sharing – every discipline has its own P&L – and sharing comes with a cost.
At the management level, the financial data coming from the ERP system drives the business. ERP systems are transactional and can provide real-time data about the company’s performance. C-level management wants to be sure they can see what is happening, so there is a massive focus on implementing the best ERP system.
In some cases, I noticed that the investment in ERP was twenty times more than the PLM investment.
Why would you invest in PLM? Although the ERP engine will slow down without proper PLM, the complexity of PLM compared to ERP is a reason for management to look at the costs, as the PLM benefits are hard to grasp and depend on so much more than just execution.
See also my old 2015 article: How do you measure collaboration?
As I mentioned, the Cost of Non-Quality – too many iterations, time lost searching, material scrap, manufacturing delays or customer complaints – is often considered an inescapable part of doing business (like everyone else); it happens all the time.
The strategy approach
It is clear that when we accept the modern definition of PLM, we should be considering product lifecycle management as the management of the product lifecycle (as Patrick Hillberg says eloquently in our Share PLM podcast – see the image at the bottom of this post, too).
When you implement a strategy, it is evident that there should be a long(er) term vision behind it, which can be challenging for companies. Also, please read my previous article: The importance of a (PLM) vision.
I can imagine that, although perhaps not fully understood, the importance of a data-driven approach will be discussed at many strategic board meetings. A data-driven approach is needed to implement a digital thread as the foundation for enhanced business models based on digital twins and to ensure data quality and governance supporting AI initiatives.
It is a process I have been preaching: From Coordinated to Coordinated and Connected.
We can be sure that at the board level, strategy discussions should be about value creation, not about reducing costs or avoiding risks as the future strategy.

Understanding the (PLM) value
The biggest challenge for companies is to understand how to modernize their PLM infrastructure to bring value.
* Step 1 is obvious. Stop considering PLM as a system with capabilities; instead, investigate how to transform your infrastructure from a collection of systems and (document) interfaces towards a federated infrastructure of connected tools.
Note: the paradigm shift from a Single Source of Truth (in my system) towards a Nearest Source of Truth and a Single Source of Change.
* Step 2 is education. A data-driven approach creates new opportunities and impacts how companies should run their business. Different skills are needed, and other organizational structures are required, from disciplines working in siloes to hybrid organizations where people can work in domain-driven environments (the Systems of Record) and product-centric teams (the System of Engagement). AI tools and capabilities will likely create an effortless flow of information within the enterprise.
* Step 3 is building a compelling story to implement the vision. Implementing new ways of working based on new technical capabilities also requires organizational change. If your organization keeps working similarly, you might gain some percentage of efficiency improvements.
The real benefits come from doing things differently, and technology allows you to do it differently. However, this requires people to work differently, too, and this is the most common mistake in transformational projects.
Companies understand the WHY and WHAT but leave the HOW to the middle management.
People are squeezed towards an ideal performance without being taken on the journey. For that reason, it is essential to build a compelling story that motivates individuals to join the transformation. Assisting companies in building compelling storylines is one of the areas where I specialize.
Feel free to contact me to explore the opportunity for your business.
It is not the technology!
With the upcoming availability of AI tools, implementing a PLM strategy will no longer depend on how IT understands the technology, the systems and the interfaces needed.
As Yousef Hooshmand‘s above image describes, a federated infrastructure of connected (SaaS) solutions will enable companies to focus on accurate data (priority #1) and people creating and using accurate data (priority #1). As you can see, people and data in modern PLM are the highest priority.
Therefore, I look forward to participating in the upcoming Share PLM Summit on 27-28 May in Jerez.
It will be a breakthrough: where traditional PLM conferences focus on technology and best practices, this conference will focus on how we can involve and motivate people. Regardless of which industry you are active in, it is a universal topic for any company that wants to transform.
Conclusion
Returning to this article’s introduction, modern PLM is an opportunity to transform the business and make it future-proof. It needs to be done, for sure, now or in the near future. Therefore, PLM initiatives should be considered from the value perspective first, instead of focusing on the costs. How well are you connected to your management’s vision to make PLM a value discussion?
Enjoy the podcast – several topics discussed in it relate to this post.
Four years ago, I wrote a series of posts with the common theme: The road to model-based and connected PLM. I discussed the various aspects of model-based and the transition from considering PLM as a system towards considering PLM as a strategy to implement a connected infrastructure.
Since then, a lot has happened. The terminology of Digital Twin and Digital Thread has become better understood. The difference between Coordinated and Connected ways of working has become more apparent. Spoiler: You need both ways. And at this moment, Artificial Intelligence (AI) has become a new hype.
Many current discussions in the PLM domain are about structures and data connectivity, Bills of Materials (BOM), or Bills of Information (BOI), combined with the new term Digital Thread as a Service (DTaaS) introduced by Oleg Shilovitsky and Rob Ferrone. Here, we envision a digitally connected enterprise based on connected services.
A lot can be explored in this direction; also relevant is Lionel Grealou’s article in Engineering.com: RIP SaaS, long live AI-as-a-service, and the follow-up discussions related to this topic. I chimed in with Data, Processes and AI.

However, we also need to focus on the term model-based or model-driven. When we talk about models currently, Large Language Models (LLM) are the hype, and when you are working in the design space, 3D CAD models might be your first association.
There is still confusion in the PLM domain: what do we mean by model-based, and where are we progressing with working model-based?
A topic I want to explore in this post.
It is not only Model-Based Definition (MBD)

Before I started The Road to Model-Based series, there was already the misunderstanding that model-based means 3D CAD model-based. See my post from that time: Model-Based – the confusion.
Model-Based Definition (MBD) is an excellent first step in understanding information continuity, in this case primarily between engineering and manufacturing, where the annotated model is used as the source for manufacturing.
In this way, there is no need for separate 2D drawings with manufacturing details, reducing the extra need to keep the engineering and manufacturing information in sync and, in addition, reducing the chance of misinterpretations.
MBD is a common practice in aerospace and particularly in the automotive industry. Other industries are struggling to introduce MBD, either because the OEM is not ready or willing to share information in a different format than 3D + 2D drawings, or because their suppliers consider MBD too complex compared to their current document-driven approach.
In its current practice, we must remember that MBD is part of a coordinated approach.
Companies exchange technical data packages based on potential MBD standards (ASME Y14.47 / ISO 16792, but also JT and 3D PDF). It is not yet part of the connected enterprise, but it connects engineering and manufacturing using the 3D Model as the core information carrier.
As I wrote, learning to work with MBD is a stepping stone in understanding a modern model-based and data-driven enterprise. See my 2022 post: Why Model-based Definition is important for us all.
To conclude on MBD: Model-Based Definition is a crucial practice to improve collaboration between engineering, manufacturing, and suppliers, and it can run in parallel with collaborative BOM structures.
And it can be transformational, as the following benefits were reported through ChatGPT:
- Up to 30% faster product development cycles due to the reduced need for 2D drawings and fewer design iterations. Boeing reported a 50% reduction in engineering change requests by using MBD.

- Companies using MBD see a 20–50% reduction in manufacturing errors caused by misinterpretations of 2D drawings. Caterpillar reported a 30% improvement in first-pass yield due to better communication between design and manufacturing teams.
- MBD can reduce product launch time by 20–50% by eliminating bottlenecks related to traditional drawings and manual data entry.
- 20–30% reduction in documentation costs by eliminating or reducing 2D drawings. Up to 60% savings on rework and scrap costs by reducing errors and inconsistencies.
Over five years, Lockheed Martin achieved a $300 million cost savings by implementing MBD across parts of its supply chain.
MBSE is not a silo.
For many people, Model-Based Systems Engineering (MBSE) seems irrelevant to their business, or a discipline for a small group of specialists conducting systems engineering practices – not in the traditional document-driven V-shape approach, but in an iterative process following the V-shape, using models to predict and verify assumptions.
And what is the value when connected in a PLM environment?
A quick heads-up – what is a model?
A model is a simplified representation of a system, process, or concept used to understand, predict, or optimize real-world phenomena. Models can be mathematical, computational, or conceptual.
We need models to:
- Simplify Complexity – Break down intricate systems into manageable components and focus on the main components.
- Make Predictions – Forecast outcomes in science, engineering, and economics by simulating behavior – Large Language Models, Machine Learning.
- Optimize Decisions – Improve efficiency in various fields like AI, finance, and logistics by running simulations and finding the best virtual solution to apply.
- Test Hypotheses – Evaluate scenarios without real-world risks or costs, for example a virtual crash test.
It is important to realize that models are only as accurate as the data they run on – every modeling practice needs a certain amount of base data, be it measurements, formulas, or statistics.
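A minimal illustration of that point: the same simple model produces very different "virtual test" outcomes depending on the base data you feed it. The formula below is a textbook braking-distance estimate, used purely as an example.

```python
# Tiny illustrative model: predicted braking distance as a function of speed and road
# friction (the classic v^2 / (2*mu*g) estimate). The point is not the physics, but
# that the same model gives very different answers depending on the assumed base data.

def braking_distance_m(speed_kmh: float, friction: float, g: float = 9.81) -> float:
    v = speed_kmh / 3.6                     # convert km/h to m/s
    return v ** 2 / (2 * friction * g)

print(round(braking_distance_m(100, friction=0.8), 1))   # dry-road assumption  (~49 m)
print(round(braking_distance_m(100, friction=0.3), 1))   # wet-road assumption  (~131 m)
```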
I watched and listened to the interesting podcast below, where Jonathan Scott and Pat Coulehan discuss this topic: Bridging MBSE and PLM: Overcoming Challenges in Digital Engineering. If you have time – watch it to grasp the challenges.
The challenge in an MBSE environment is that it is not a single tool with a single version of the truth; it is rather a federated environment of shared datasets that are interpreted by modeling applications to understand and define the behavior of a product.
In addition, an interesting article from Nicolas Figay might help you understand the value for a broader audience. Read his article: MBSE: Beyond Diagrams – Unlocking Model Intelligence for Computer-Aided Engineering.
Ultimately – and this is the agreement I found at many PLM conferences – MBSE practices are the foundation for downstream processes and operations.
We need a data-driven modeling environment to implement Digital Twins, which can span multiple systems and diagrams.
In this context, I like the Boeing diamond presented by Don Farr at the 2018 PLM Roadmap EMEA conference. It is a model view of a system, where between the virtual and the physical flow, we will have data flowing through a digital thread.
Where this image describes a model-based, data-driven infrastructure to deliver a solution, we can, in addition, apply the DevOps approach to the bigger picture for solutions in operation, as depicted by the PTC image below.

Model-based: the foundation of digital twins
To conclude on MBSE, I hope it is clear why I promote considering MBSE not only as the environment to conceptualize a solution, but also as the foundation for a digital enterprise where information is connected through digital threads and AI models.
The data borders between traditional system domains will disappear. The single-source-of-change and nearest-source-of-truth paradigm, and the post The Big Blocks of Future Lifecycle Management from Prof. Dr. Jörg Fischer, are all about data domains.
However, having accessible data from all kinds of modern data sources and tools is necessary to build digital twins – either to simulate and predict a physical solution, or to analyze a physical solution and, based on the analysis, adjust the solution or improve your virtual simulations.
Digital Twins at any stage of the product life cycle are crucial to developing and maintaining sustainable solutions, as I discussed in previous lectures. See the image below:

Conclusion
Data quality and architecture are the building blocks of a modern digital enterprise. There is a lot of discussion related to Artificial Intelligence, but this will only work when we master the methodology and practices related to a data-driven and sustainable approach using models. MBD is not new, and MBSE is perhaps still new; both are building blocks for a model-based approach. Where are you in your lifecycle?

In my business ecosystem, I have seen a lot of discussions about technical and architectural topics since last year that are closely connected to the topic of artificial intelligence. We are discussing architectures and solutions that will make our business extremely effective. The discussion is mostly software vendor-driven, as vendors usually do not have to deal with the legacy and can focus on the ultimate result.
Legacy (people, skills, processes and data) is the main inhibitor for moving fast forward in such situations, as I wrote in my previous post: Data, Processes and AI.
However, there are also less visible discussions about business efficiency – methodology and business models – and future sustainability.
These discussions are more challenging to follow as you need a broader and long-term vision, as implementing solutions/changes takes much longer than buying tools.
This time, I want to revisit the discussion on modularity and the need for business efficiency and sustainability.
Modularity – what is it?
Modularity is a design principle that breaks a system into smaller, independent, and interchangeable components, or modules, that function together as a whole. Each module performs a specific task and can be developed, tested, and maintained separately, improving flexibility and scalability.
Modularity is a best practice in software development. Although modular thinking takes a higher initial effort, the advantages are enormous for reuse, flexibility, optimization, or adding new functionality. And as software code has no material cost or scrap, modular software solutions excel in delivery and maintenance.
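A minimal sketch of that software practice: modules behind a common interface can be developed, tested and replaced independently. The payment example below is purely illustrative.

```python
from typing import Protocol

# Illustrative only: modules behind a common interface can be developed, tested
# and swapped independently - the core idea of modularity in software.

class PaymentModule(Protocol):
    def charge(self, amount: float) -> bool: ...

class CardPayment:
    def charge(self, amount: float) -> bool:
        return amount > 0          # placeholder logic

class InvoicePayment:
    def charge(self, amount: float) -> bool:
        return True                # placeholder logic

def checkout(payment: PaymentModule, amount: float) -> bool:
    # The caller only depends on the interface, not on a specific module.
    return payment.charge(amount)

print(checkout(CardPayment(), 99.0), checkout(InvoicePayment(), 99.0))
```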
In the hardware world, this is different. Often, companies have a history of delivering a specific (hardware) solution, and the product has been improved by adding features and options where the top products remain the company’s flagships.
Modularity enables easy upgrades and replacements in hardware and engineering, reducing costs and complexity. As I work mainly with manufacturing companies in my network, I will focus on modularity in the hardware world.
Modularity – the business goal
How often have you heard that a business aims to transition from Engineer to Order (ETO) to Configure/Build to Order (BTO) or Assemble to Order (ATO)? Companies often believe that implementing a PLM system is a sufficient starting point, as it will help identify commonalities in product variations and therefore lead to more modular products.
The primary targeted business benefits often include reduced R&D time and cost but also reduced risk due to component reuse and reuse of experience. However, the ultimate goal for CTO/ATO companies is to minimize R&D involvement in their sales and delivery process.
More options can be offered to potential customers without spending more time on engineering.
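To illustrate that goal, here is a hypothetical sketch of an Assemble-to-Order configurator built on predefined modules; the module names, prices and compatibility rules are invented for the example. Because every option is a pre-engineered module, a valid order needs no R&D involvement.

```python
# Hypothetical sketch of an Assemble-to-Order configurator built on predefined modules.
# Module names, prices and compatibility rules are invented for illustration.

MODULES = {
    "motor":   {"M-1.5kW": 250, "M-3kW": 420},
    "gearbox": {"G-STD": 180, "G-HEAVY": 310},
    "control": {"C-BASIC": 90, "C-IOT": 260},
}

INCOMPATIBLE = {("M-1.5kW", "G-HEAVY")}   # rules captured once, reused for every quote

def configure(selection: dict) -> float:
    """Validate a customer selection against the modular architecture and return its price."""
    options = set(selection.values())
    for a, b in INCOMPATIBLE:
        if a in options and b in options:
            raise ValueError(f"{a} cannot be combined with {b}")
    return sum(MODULES[slot][option] for slot, option in selection.items())

print(configure({"motor": "M-3kW", "gearbox": "G-HEAVY", "control": "C-IOT"}))   # 990
```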
Four years ago, I discussed modularity with Björn Eriksson and Daniel Strandhammar, who wrote “The Modular Way” during the COVID-19 pandemic. I liked the book because it is excellent for understanding the broader scope of modularity along with marketing, sales, and long-term strategy. Each business type has its modularity benefits.
I had a follow-up discussion with panelists active in modularization and later with Daniel Strandhammar about the book’s content in this blog post: PLM and Modularity.
Next, I got involved with the North European Modularity Network (NEM) group, a group of Scandinavian companies that share modularization experiences and build common knowledge.
Historically, modularization has been a popular topic in North Europe, and meanwhile, the group is expanding beyond Scandinavia. Participants in the group focus on education-sharing strategies rather than tools.
The 2023 biannual meeting I attended, hosted by Vestas in Ringkobing, was an eye-opener for me.
We should work in a more integrated way, not only on the topics of Modularity and PLM, but also on a third important topic: Sustainability in the context of the Circular Economy.
You can review my impression of the event and presentation in my post: “The week after North European Modularity (NEM)“
That post concludes that Modularity, like PLM, is a strategy rather than an R&D mission. Integrating modularity topics into PLM conferences or Circular Economy events would facilitate mutual learning and collaboration.
Modularity and Sustainability
The PLM Green Global Alliance, started in 2020, initially had few members. However, after significant natural disasters and the announcement of regulations related to the European Green Deal, sustainability became a management priority. Greenwashing was no longer sufficient.
One key topic discussed in the PLM Green Global Alliance is the circular economy, a theme moderated by CIMPA PLM services. The circular economy is crucial as our current consumption of Earth’s resources is unsustainable.
The well-known butterfly diagram from the Ellen MacArthur Foundation below illustrates the higher complexity of a circular economy, both for renewables (left) and hardware (right).
In a circular economy, modularity is essential. The SHARE loop focuses on a Product Service Model, where companies provide services based on products used by different users. This approach requires a new business model, customer experience, and durable hardware. After Black Friday last year, I wrote about this transition: The Product Service System and a Circular Economy.
Modularity is vital in the MAINTAIN/PROLONG loop. Modular products can be upgraded without replacing the entire product, and modules are easier to repair. An example is Fairphone from the Netherlands, where users can repair and upgrade their smartphones, contributing to sustainability.
In the REUSE/REMANUFACTURE loop, modularity allows for reusing hardware parts when electronics or software components are upgraded. This approach reduces waste and supports sustainability.
The REFURBISH/REMANUFACTURE loop also benefits from modularity, though to a lesser extent. This loop helps preserve scarce materials, such as batteries, reducing the need for resource extraction from places like the moon, Mars, or Greenland.
A call for action
If you have reached this point of the article, my question now is to reflect on your own business or company. Modularity is, for many companies, a dream (or vision), and it will become, for most companies, a must to provide a sustainable business.
Modularity does not depend on PLM technology, as famous companies like Scania, Electrolux and Vestas have shown (in my reference network).
Where is your company and its business offerings?
IMPORTANT:
If you aim to implement modularity to support the concepts of the Circular Economy, make sure you do it in a data-driven, model-based environment – here, technology counts.
Conclusion
Don’t miss the focus on the potential relevance of modularity for your company. Modularity improves business and sustainability, AND it touches all enterprise stakeholders. Technology alone will not save the business. Your thoughts?
Do you want to learn more about implementing PLM at an ETO space company?
Listen to our latest podcast: OHB’s Digital Evolution: Transforming Aerospace PLM with Lucía Núñez Núñez
Last week, my memory was triggered by this LinkedIn post and discussion started by Oleg Shilovitsky: Rethinking the Data vs. Process Debate in the Age of Digital Transformation and AI.

me, 1989
In the past twenty years, the debate in the PLM community has changed a lot. PLM started as a central file repository, combined with processes to ensure the correct status and quality of the information.
Then, digital transformation in the PLM domain became achievable and there was a focus shift towards (meta)data. Now, we are entering the era of artificial intelligence, reshaping how we look at data.
In this technology evolution, there are lessons learned that are still valid for 2025, and I want to share some of my experiences in this post.
In addition, it was great to read Martin Eigner’s great reflection on the past 40 years of PDM/PLM. Martin shared his experiences and insights, not directly focusing on the data and processes debate, but very complementary and helping to understand the future.
It started with processes (for me 2003-2014)
In the early days when I worked with SmarTeam, one of my main missions was to develop templates on top of the flexible toolkit SmarTeam.
For those who do not know SmarTeam, it was one of the first Windows PDM/PLM systems, and thanks to its open API (COM-based), companies could easily customize and adapt it. It came with standard data elements and behaviors like Projects, Documents (CAD-specific and Generic), Items and later Products.
On top of this foundation, almost every customer implemented their business logic (current practices).
And there the problems came …..
The implementations became highly customized environments, not necessarily thought through, as every customer worked differently based on their (paper) history. Thanks to the lessons learned from discussions in the field while supporting stalled implementations, I was also assigned to develop templates (e.g. SmarTeam Design Express) and a standard methodology (the FDA toolkit), as mid-market customers requested. The focus was on standard processes.
You can read my 2009 observations here: Can chaos become order through PLM?
The need for standardization?
When developing templates (the right data model and processes), it was also essential to provide template processes for releasing a product and controlling the status and product changes – from Engineering Change Request to Engineering Change Order. Many companies had their processes described in their ISO 900x manual, but were they followed correctly?
In 2010, I wrote ECR/ECO for Dummies, and it has been my second most-read post over the years. Only the 2019 post The importance of EBOM and MBOM in PLM (reprise) had more readers. These statistics show that many people are, and were, seeking education on general PLM processes and data model principles.
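For readers looking for the basics of such a release and change flow, here is a minimal sketch of an ECR workflow expressed as an explicit state machine; the states and transitions follow common ECR/ECO practice, and any real PLM system will have its own richer variant.

```python
# Minimal sketch of a change workflow as an explicit state machine.
# State and transition names follow common ECR/ECO practice; any real PLM system
# will have its own variants.

ECR_TRANSITIONS = {
    ("Draft", "submit"): "Under Review",
    ("Under Review", "approve"): "Approved",
    ("Under Review", "reject"): "Rejected",
    ("Approved", "create_eco"): "ECO Created",
}

def advance(state: str, action: str) -> str:
    try:
        return ECR_TRANSITIONS[(state, action)]
    except KeyError:
        raise ValueError(f"Action '{action}' is not allowed in state '{state}'")

state = "Draft"
for action in ("submit", "approve", "create_eco"):
    state = advance(state, action)
print(state)   # ECO Created
```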
It was also the time when the PLM communities discussed out-of-the-box versus flexible processes, as Oleg referred to in his post.
You would expect companies to follow these best practices, and many small and medium enterprises that started with PLM did so. However, I discovered there was and still is the challenge with legacy (people and process), particularly in larger enterprises.
The challenge with legacy
The technology was there; the usability was not. Many implementations of a PLM system go through a critical stage. Are companies willing to change their methodology and habits to align with common best practices, or do they still want to implement their unique ways of working (from the past)?
“The embedded process is limiting our freedom, we need to be flexible”
is an often-heard statement. When every step is micro-managed in the PLM system, you create a bureaucracy detested by the users. In general, when the processes are implemented by focusing first on the crucial steps, with the option to improve later, you will get the best results and acceptance. Nowadays, we could call it an MVP approach.
I have seen companies that created a task or issue for every single activity a person should do. Managers loved the (demo) dashboard. It never led to success, as the approach created frustration at the end-user level as their To-Do lists grew and grew.
Another example of the micro-management mindset: I worked with a company that had the opposite definitions of Version and Revision in their existing terminology. Initially, they insisted that the new PLM system should support this, meaning that everywhere the interface mentioned Revision it should say Version, and vice versa.
Can you imagine the cost of implementing and maintaining this legacy per upgrade?
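To make the point concrete, here is a hypothetical sketch of what such a customization boils down to: a label-override layer that has to be re-verified against every new screen and report the vendor ships with each upgrade.

```python
# Hypothetical illustration: a label-override layer swapping Version and Revision.
# After every upgrade, each new screen and report must be checked against this mapping - forever.

LABEL_OVERRIDES = {"Revision": "Version", "Version": "Revision"}

def display_label(standard_label: str) -> str:
    """Return the customer-specific label for a standard UI label."""
    return LABEL_OVERRIDES.get(standard_label, standard_label)

print([display_label(l) for l in ("Version", "Revision", "Lifecycle State")])
# ['Revision', 'Version', 'Lifecycle State']
```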
And then came data (for me 2014 – now)
The pivotal 2015 PLM Roadmap/PDT conference, related to Product Innovation Platforms, brought the idea of framing digital transformation in the PLM domain in a single sentence: From Coordinated to Connected. See the original image from Marc Halpern below; those who have read my posts over the years have seen this terminology evolve. Now I would say (until 2024): From Coordinated to Coordinated and Connected.
A data-driven approach was not new at that time. Roughly speaking, around 2006 – close to the introduction of the smartphone – there was already a trend spurred by better global data connectivity at lower cost. Easy connectivity allowed PLM to expand into industries that were not closely connected to 3D CAD systems (CATIA, CREO or NX). Agile PLM, Aras, and SAP PLM became visible – PLM was no longer only for design management but also for go-to-market governance in the CPG and apparel industries.
However, a data-driven approach was still rare in mainstream manufacturing companies, where drawings, office documents, email and Excel were the main information carriers next to the dominant ERP system.
A data-driven approach was a consultant’s dream, and when looking at the impact of digital transformation in other parts of the business, why not for PLM, too? My favorite and still valid 2014 image is the one below from Accenture describing Digital PLM. Here business and PLM come together – the WHY!
Again, the challenge with legacy
At that time, I saw a few companies linking their digital transformation to implementing a new PLM system. Those were the days when the PLM vendors were battling for the big enterprise deals, sometimes motivated by an IT mindset that unifying the existing PDM/PLM systems would fulfill the digital dream. Emotion was winning, not science. Read the PLM blame game – still relevant.
One of my key observations is that companies struggle when they approach PLM transformation with a migration mindset. Moving from Coordinated to Connected isn’t just about technology—it’s about fundamentally changing how we work. Instead of a document-driven approach, organizations must embrace a data-driven, connected way of working.
The PLM community increasingly agrees that PLM isn’t a single system; it’s a strategy that requires a federated approach—whether through SaaS or even beyond it.
Before AI became a hype, we discussed the digital thread, digital twins, graph databases, ontologies, and data meshes. Legacy – people (skills), processes (rigid) and data (not reliable) – is the elephant in the room. Yet, the biggest challenge remains: many companies see PLM transformation as just buying new tools.
A fundamental transformation requires a hybrid approach—maintaining traditional operations while enabling multidisciplinary, data-driven teams. However, this shift demands new skills and creates the need to learn and adapt, and many organizations hesitate to take that risk.
In his Product Data Plumber Perspective on 2025, Rob Ferrone also addressed the challenge of moving forward, and I liked one of his responses in the underlying discussion that says it all – it is hard to get out of your day-to-day comfort (and data):
Rob Ferrone’s quote:
Transformations are announced, followed by training, then communication fades. Plans shift, initiatives are replaced, and improvements are delayed for the next “fix-all” solution. Meanwhile, employees feel stuck, their future dictated by a distant, ever-changing strategy team.
And then there is Artificial Intelligence (2024 ……)
In the past two years, I have been reading and digesting much news related to AI, particularly generative AI.
Initially, I was a little skeptical because of all the hallucinations and hype; however, the progress in this domain is enormous.
I believe that AI has the potential to change our digital thread and digital twin concepts dramatically, where the focus used to be on digital continuity of data.
Now this digital continuity might not even be required, reading articles like The End of SaaS (an ever louder voice), the usage of the Fusion Strategy (the importance of AI) and an (academic) example on a smaller scale that I learned about last year, the Swedish Arrowhead™ fPVN project.
I hope that five years from now, there will not be a paragraph with the title Pity there was again legacy.
We should have learned from the past that there is always a first wave of tools – they come with big hype and promises – think about the Stargate Project, but also DeepSeek.
Still, remember: the change comes from doing things differently, not from efficiency gains. To do things differently, you need educated, visionary management with the power and skills to take a company in a new direction. If not, legacy will win (again).
Conclusion
In my 25 years of working in the data management domain, now known as PLM, I have seen several impressive new developments – from 2D to 3D, from documents to data, from physical prototypes to models, and more. All these developments took decades to become mainstream. Whilst the technology was there, the legacy held us back. Will this ever change? Your thoughts?

The pivotal 2015 PLM Roadmap / PDT conference
First, I wish you all a prosperous 2025 and hope you will take the time to digest information beyond headlines.
Taking time to digest information is my number one principle now, which means you will see fewer blog posts from my side and potentially more podcast recordings.
My theme for 2025: “It is all about people, data, a sustainable business and a smooth digital transformation”.
Fewer blog posts
Fewer blog posts because, although AI might be a blessing for content writers, AI-generated content becomes as exciting as Wikipedia pages. Here, I think differently than Oleg Shilovitsky, whose posts brought innovative thoughts to our PLM community – “Just my thoughts”.
Now Oleg endorses AI, as you can read in his post: PLM in 2025: A new chapter of blogging transformation. I asked ChatGPT to summarize my post in 50 words, and this is the answer I got – it saves you reading the rest:
The author’s 2025 focus emphasizes digesting information deeply, reducing blog posts, and increasing podcast recordings exploring real-life PLM applications. They stress balancing people and data-centric strategies, sustainable digital transformation, AI’s transformative role, and forward-looking concepts like Fusion Strategy. Success requires prioritizing business needs, people, and accurate data to harness AI’s potential.
Summarizing blog posts with AI saves you time. Thinking about AI-generated content, I understand that when you work in marketing, you want to create visibility for your brand or offer.
Do we need a blogging transformation? I am used to browsing through marketing content and then looking for the reality beyond it – facts and figures. Now it will be harder to discover innovative thoughts in this AI-generated domain.
Am I old fashioned? Time will tell.
More podcast recordings
As I wrote in a recent post, “PLM in real life and Gen AI“, I believe we can learn much from exploring real-life examples. You can always find the theory somewhere and many of the articles make sense and address common points. Some random examples:
- Top 4 Reasons Why PLM Implementations Fail
- 13 Common PLM Implementation Problems And How to Avoid Them
- 10 steps to a Successful PLM implementation
- 11 Essential Product Lifecycle Management Best Practices for Success
Similar recommendations exist for topics like ERP, MES, CRM or Digital Transformation (one of the most hyped terms).
They all describe WHAT to do or not to do. The challenge however is: HOW to apply this knowledge in your unique environment, considering people, skills, politics and culture.
With the focus on the HOW, I worked with Helena Gutierrez last year on the Share PLM podcast series 2. In this series, we interviewed successful individuals from various organizations to explore HOW they approached PLM within their companies. Our goal was to gain insights from their experiences, particularly those moments when things didn’t go as planned, as these are often the most valuable learning opportunities.
I am excited to announce that the podcast will continue this year with Series 3! Joining me this season will be Beatriz Gonzales, Share PLM’s co-founder and new CEO. For Series 3, we’ve decided to broaden the scope of our interviews. In addition to featuring professionals working within companies, we’ll also speak with external experts, such as coaches and implementation partners, who support organizations in their PLM journey.
Our goal is to uncover not only best practices from these experts but also insights into emerging “next practices.”
Stay tuned for series 3!
#datacentric or #peoplecentric ?
The title of this paragraph covers topics from the previous paragraphs, and it was also the theme of a recent post shared on LinkedIn by Lionel Grealou: Driving Transformation: Data or People First?
We all agree here that it is not either one or the other, and as the discussion related to the post further clarifies, it is about a business strategy that leads to both of these aspects.
This is the challenge with strategies. A strategy can be excellent – on paper – but the success comes from the execution.
This discussion reminds me of the lecture Yousef Hooshmand gave at the PLM platform in the Netherlands last year – two of his images that could cover the whole debate:
Whatever you implement starts from the user experience, giving the data-centric approach the highest priority and designing the solution for change, meaning avoiding embedded hard-coded ways of working.
While companies strive to standardize processes to provide efficiency and traceability, the processes should be reconfigurable or adaptable when needed, built on reliable data sources.
Jan Bosch shared this last thought too in his Digital Reflection #5: Cog in the Machine. My favorite quote from this reflection:
“However, in a world where change is accelerating, we need to organize ourselves in ways that make it easy to incorporate change and not ulcer-inducing hard. How do we get there?”
Of course, before we reach tools and technology, the other image Yousef Hooshmand shared below gives a guiding principle that I believe everyone should follow in their context.
It starts with having a C-level long-term commitment when you want to perform a business transformation, and then, in an MVP approach, you start from the business, which will ultimately lead you to the tools and technologies.
The challenge seen in this discussion is that:
most manufacturing companies are still too focused on investing in what they are good at now and do not explore the future enough.
This behavior is why Industry 4.0 is still far from being implemented, and the current German manufacturing industry is in a crisis.
It requires an organization that understands the big picture and has a (fusion) strategy.
Fusion Strategy ?
Is the Fusion Strategy the next step, as Steef Klein often mentions in our PLM discussions? The Fusion Strategy, introduced by world-renowned innovation guru Vijay Govindarajan (The Three Box Solution) and digital strategy expert Venkat Venkatraman (Fusion Strategy), offers a roadmap that will help industrial companies combine what they do best – creating physical products – with what digital technology companies do best – capturing and analyzing data through algorithms and AI.
It is a topic I want to explore this year and see how to connect it to companies in my ecosystem. It is still an unknown phenomenon for most of them, as they struggle with a data-driven foundation, the required skills, and the focus on the right AI applications.
The End of SaaS?
A potentially interesting trend, also related to AI, that I want to clarify further is the modern enterprise architecture. Over the past two years, we have seen a growing understanding that we should not think in systems connected through interfaces, but move towards a digitally connected infrastructure where APIs, low-code platforms or standardized interfaces will be responsible for real-time collaboration.
I wrote about these concepts in my PLM Roadmap / PDT Europe review. Look at the section: R-evolutionizing PLM and ERP and Heliple. At that time, I shared the picture below, which illustrates the digital enterprise.
The five platforms depicted in the image (IIoT, CRM, PLM, ERP, MES) are not necessarily single systems. They can be an ecosystem of applications and services providing capabilities in that domain. In modern ways of thinking, each platform could be built upon a SaaS portfolio, ensuring optimal and scalable collaboration based on the company’s needs.
Implementing such an enterprise based on a combination of SaaS offerings might be a strategy for companies to eliminate IT overhead.
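As an illustration of what API-based collaboration between such platforms could look like, here is a hedged sketch of pushing a released item from a PLM service to an ERP service over REST; the endpoints, payloads and authentication are entirely hypothetical, not the API of any specific vendor.

```python
import requests   # third-party, but the de-facto standard HTTP client in Python

# Entirely hypothetical endpoints and payloads - a sketch of API-based, near real-time
# collaboration between two SaaS platforms instead of nightly file exchanges.

PLM_API = "https://plm.example.com/api/v1"
ERP_API = "https://erp.example.com/api/v1"

def sync_released_item(item_id: str, token: str) -> None:
    headers = {"Authorization": f"Bearer {token}"}

    # 1. Read the released item from the PLM platform (nearest source of truth for design data).
    item = requests.get(f"{PLM_API}/items/{item_id}", headers=headers, timeout=10).json()

    # 2. Push only the data ERP needs; ERP remains the source of truth for logistics data.
    payload = {
        "number": item["part_number"],
        "revision": item["revision"],
        "description": item["description"],
    }
    requests.post(f"{ERP_API}/materials", json=payload, headers=headers, timeout=10).raise_for_status()
```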
However, forward-thinking experts like Vijay Govindarajan and Venkat Venkatraman, with their Fusion Strategy, think beyond this. Also, Satya Nadella, CEO of Microsoft, imagines, instead of connected platforms, a future with an AI layer taking care of the context of the information – the Microsoft Copilot message. Some of his statements:
This transformation is poised to disrupt traditional tools and workflows, paving the way for a new generation of applications.
The business logic is all going to these AI agents. They’re not going to discriminate between what the backend is — they’ll update multiple databases, and all the logic will be in the AI tier.
Software as a Business Weapon?
Interesting thoughts to follow and to combine with this Forbes article, The End Of The SaaS Era: Rethinking Software’s Role In Business by Josipa Majic Predin. She introduces the New Paradigm: Software as a Business Weapon.
Quote:
Instead of focusing solely on selling software subscriptions, innovative companies are using software to enhance and transform existing businesses. The goal is to leverage technology to make certain businesses significantly more valuable, efficient, and competitive.
This approach involves developing software that can improve the operations of “real world” businesses by 20-30% or more. By creating such powerful tools, technology companies can position themselves to acquire or partner with the businesses they’ve enhanced, thereby capturing a larger share of the value they’ve created.
It is interesting to see these thoughts popping up, usually 10 to 20 years before companies adopt them. However, I believe that with AI’s unleashed power, this is where we should be active and learn. It is an exciting area where terms like eBOM or mBOM sound hackneyed.
Sustainability?
As a PLM Green Global Alliance member, I will continue to explore topics related to PLM and how they can serve Sustainability. They are connected as the image from the 2022 PLM Roadmap/PDT Europe indicates:

I will keep on focusing on separate areas within my PGGA network.
Conclusion
I believe 2025 will be the year to focus on understanding the practical applications of AI. Amid the hype and noise, there lies significant potential to re-imagine our PLM landscape and vision. However, success begins with prioritizing the business, empowering people, and ensuring accurate data.

Most of the time in this PLM and Sustainability series, Klaus Brettschneider and Jos Voskuil from the PLM Green Global Alliance core team speak with PLM-related vendors or service partners.
This year we have been speaking with Transition Technologies PSC, Configit, aPriori, Makersite and the PLM vendors PTC, Siemens and SAP.
Where the first group of companies provided complementary software offerings to support sustainability – “the fourth dimension” – the PLM vendors focused more on the solutions within their portfolio.
This time we spoke with CIMPA PLM services, a company supporting their customers with PLM and Sustainability challenges, offering end-to-end support.
What makes them special is that they are also a core partner of the PLM Green Global Alliance, where they moderate the Circular Economy theme – read their introduction here: PLM and Circular Economy.
CIMPA PLM services
We spoke with Pierre DAVID and Mahdi BESBES from CIMPA PLM services. Pierre is an environmental engineer and Mahdi is a consulting manager focusing on parts/components traceability in the context of sustainability and a circular economy. Many of the activities described by Pierre and Mahdi were related to the aerospace industry.
We had an enjoyable and in-depth discussion of sustainability, as the aerospace industry is well-advanced in traceability during the upstream design processes. Good digital traceability is an excellent foundation to extend for sustainability purposes.
CSRD, LCA, DPP, AI and more
A bunch of abbreviations you will have to learn. We went through the need for a data-driven PLM infrastructure to support sustainability initiatives, like Life Cycle Assessments and more. We zoomed in on the current Corporate Sustainability Reporting Directive (CSRD), highlighting the challenges with the CSRD guidelines and how to connect the strategy (why we do the CSRD) to its execution (providing reports and KPIs that make sense to individuals).
In addition, we discussed the importance of using the proper methodology and databases for lifecycle assessments. Looking forward, we discussed the potential of AI and the value of the Digital Product Passport for products in service.
Enjoy the 37-minute discussion, and you are always welcome to comment or start a discussion with us.
What we learned
- Sustainability initiatives are quite mature in the aerospace industry; thanks to its tradition of traceability, this industry is leading in methodology and best practices.
- The various challenges with the CSRD directive – standardization, strategy and execution.
- The importance of the right databases when performing lifecycle analysis.
- CIMPA is working on how AI can be used for assessing environmental impacts and on the value of the Digital Product Passport for products in service to extend their traceability.
Want to learn more?
Here are some links related to the topics discussed in our meeting:
- CIMPA’s theme page on the PLM Green website: PLM and Circular Economy
- CIMPA’s commitments towards A sustainable, human and guiding approach
- Sopra Steria, CIMPA’s parent company: INSIDE #8 magazine
Conclusion
The discussion was insightful, given the advanced environment in which CIMPA consultants operate compared to other manufacturing industries. Our dialogue offered valuable lessons from the aerospace industry that others can draw on to advance and better understand their sustainability initiatives.
Due to other activities, I could not immediately share the second part of the review related to the PLM Roadmap / PDT Europe conference, held on 23-24 October in Gothenburg. You can read my first post, mainly about Day 1, here: The weekend after PLM Roadmap/PDT Europe 2024.
There were several interesting sessions which I will not mention here as I want to focus on forward-looking topics with a mix of (federated) data-driven PLM environments and the applicability of AI, staying around 1500 words.
R-evolutionizing PLM and ERP and Heliple
Cristina Paniagua from the Luleå University of Technology closed the first day of the conference, giving us food for thought to discuss over dinner. Her session, describing the Arrowhead fPVN project, fitted nicely with the concepts of the Federated PLM Heliple project presented by Erik Herzog on Day 2.
They are both research projects related to the future state of a digital enterprise. Therefore, it makes sense to treat them together.
Cristina’s session started with sharing the challenges of traditional PLM and ERP systems:
These statements align with the drivers of the Heliple project. The PLM and ERP systems—Systems of Record—provide baselines and traceability. However, Systems of Record have not historically been designed to support real-time collaboration or to create an attractive user experience.
The Heliple project focuses on connecting various modules—the horizontal bars—for systems engineering, hardware engineering, etc., as real-time collaboration environments that can be highly customized and replaceable if needed. The Heliple project explored the usage of OSLC to connect these modules, the Systems of Engagement, with the Systems of Record.
By using Lynxwork as a low-code wrapper to develop the OSLC connections and map them to the needed business scenarios, the team concluded that this approach is affordable for businesses.
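For readers less familiar with OSLC: the basic pattern is that a System of Engagement retrieves linked resources from a System of Record over plain HTTP with content negotiation. The sketch below shows only this generic pattern; the server URL is hypothetical, and the Lynxwork-specific low-code wrapping is not represented.

```python
# A minimal sketch of the OSLC linking pattern: a System of Engagement fetches
# one resource from a System of Record and can then follow its links.
# The server URL is a hypothetical placeholder.

import requests

OSLC_SERVER = "https://plm.example.com/oslc"  # hypothetical System of Record

def get_oslc_resource(resource_uri: str) -> dict:
    """Fetch one OSLC resource as JSON-LD so the caller can traverse its links."""
    response = requests.get(
        resource_uri,
        headers={"Accept": "application/ld+json", "OSLC-Core-Version": "2.0"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Example usage (requires a real OSLC provider):
# requirement = get_oslc_resource(f"{OSLC_SERVER}/requirements/REQ-42")
# print(requirement.get("dcterms:title"))
```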
Now, the Heliple team is aiming to expand their research with industry-scale validation through the Demoiple project (validating that the Heliple-2 technology can be implemented and accredited in Saab Aeronautics’ operational IT), combined with the Nextiple project, where they will investigate the role of heterogeneous information models/ontologies for heterogeneous analysis.
If you are interested in participating in Nextiple, don’t hesitate to contact Erik Herzog.
Cristina’s Arrowhead flexible Production Value Network (fPVN) project aims to provide autonomous and evolvable information interoperability through machine-interpretable content for fPVN stakeholders. In less academic words: building a digital, data-driven infrastructure.
The resulting technology is projected to impact manufacturing productivity and flexibility substantially.

The exciting starting point of the Arrowhead project is that it wants to use existing standards and systems as a foundation and, on top of that, create a business and user-oriented layer, using modern technologies such as micro-services to support real-time processing and semantic technologies, ontologies, system modeling, and AI for data translations and learning—a much broader and ambitious scope than the Heliple project.
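A simplified way to picture the "semantic technologies and ontologies for data translation" ambition: each system keeps its own attribute names, and a shared vocabulary translates between them. The mapping below is a toy stand-in for a real ontology; the PLM attribute names are invented, the ERP-style field names are only illustrative.

```python
# Toy illustration of ontology-based data translation: both systems keep their
# native attribute names; a shared mapping translates records into one
# common vocabulary so they can be compared or combined.

PLM_TO_SHARED = {"PartNumber": "part_id", "Desc": "description", "Mass_kg": "mass_kg"}
ERP_TO_SHARED = {"MATNR": "part_id", "MAKTX": "description", "NTGEW": "mass_kg"}

def to_shared(record: dict, mapping: dict) -> dict:
    """Translate a system-specific record into the shared vocabulary."""
    return {mapping[key]: value for key, value in record.items() if key in mapping}

plm_item = {"PartNumber": "PN-100", "Desc": "Bracket", "Mass_kg": 0.4}
erp_item = {"MATNR": "PN-100", "MAKTX": "Bracket", "NTGEW": 0.4}

# Both records describe the same part once translated to the shared vocabulary.
assert to_shared(plm_item, PLM_TO_SHARED) == to_shared(erp_item, ERP_TO_SHARED)
```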
I believe that in our PLM domain, this resonates with current discussions you will find on LinkedIn, too. @Oleg Shilovitsky, @Dr. Yousef Hooshmand, @Prof. Dr. Jörg W. Fischer and Martin Eigner are a few of the people steering these discussions. I consider it a perfect match for one of the images I shared about the future of the digital enterprise.
Potentially, there are five platforms, each with their own internal ways of working – a mix of systems of record and systems of engagement – supported by an overlay of several Systems of Engagement environments.
I previously described these dedicated environments, e.g., OpenBOM, Colab, Partful, and Authentise. These solutions could also be dedicated apps supporting a specific ecosystem role.
See below my artist’s impression of how a Service Engineer would work in their app connected to the CRM, PLM and ERP platform datasets:
The exciting part of the Arrowhead fPVN project is that it wants to explore the interactions between systems and user roles based on existing mature standards instead of leaving the connections to software developers.
Cristina mentioned some of these standards below:
I greatly support this approach as, historically, much knowledge and effort has been put into developing standards to support interoperability. Maybe not in real-time, but the embedded knowledge in these standards will speed up the broader usage. Therefore, I concur with the concluding slide:
A final comment: Industrial users must push for these standards if they do not want a future vendor lock-in. Vendors will do what the majority of their customers ask for but will also keep their customers’ data in proprietary formats to prevent them from switching to another system.
Accelerated Product Development Enabled by Digitalization
The keynote session on Day 2, delivered by Uyiosa Abusomwan, Ph.D., Senior Global Technology Manager – Digital Engineering at Eaton, was a visionary story about the future of engineering.
With its broad range of products, Eaton is exploring new, innovative ways to accelerate product design by modeling the design process and applying AI to narrow design decisions and customer-specific engineering work. The picture below shows the areas of attention needed to model the design processes. Uyiosa mentioned the significant beneficial results that have already been achieved.
Together with generative design, Eaton works towards modern digital engineering processes built on models and knowledge. His session was complementary to the Heliple and Arrowhead story. To reach such a contemporary design engineering environment, it must be data-driven and built upon open PLM and software components to fully use AI and automation.
“Next Gen” Life Cycle Management in Next-Gen Nuclear Power and LTO Legacy Plants
Kent Freeland‘s presentation was a trip down memory lane when he discussed the issues with Long Term Operations of legacy nuclear plants.
I spent several years at Ringhals (Sweden) discussing and piloting the setup of a PLM front-end next to the MRO (Maintenance, Repair, Overhaul) system. Nuclear plants developed in the sixties now require a longer-than-anticipated lifecycle, which demands access to the right design and operational data; maintenance and upgrade changes in the plant need to be planned and controlled. The design data is often lacking; it resides at the EPC or has been stored in a document management system with limited retrieval capabilities.
See also my 2019 post: How PLM, ALM, and BIM converge thanks to the digital twin.
Kent described the same challenges he has experienced – we must have worked in parallel universes – and argued that, for the future, we need a digitally connected infrastructure for both plant design and maintenance artifacts, as envisioned below:
The solution reminded me of a lecture I saw at the PI PLMx 2019 conference, where the Swedish ESS facility demonstrated its Asset Lifecycle Data Management solution based on the 3DEXPERIENCE platform.
You can still find the presentation here: Henrik Lindblad Ola Nanzell ESS – Enabling Predictive Maintenance Through PLM & IIOT.
Also, Kent focused on the relevant standards to support a “Single Source of Truth” concept, where, after all the federated PLM discussions, I would rather go for:
“The nearest source of truth and a single source of Change”
assuming this makes more sense in a digitally connected enterprise.
Why do you need to be SMART when contracting for information?
Rob Bodington‘s presentation was complementary to Kent Freeland’s presentation. Rob, a technical fellow at Eurostep, described the challenge of information acquisition when working with large assets that require access to the correct data once the asset is in operation. The large asset could be a nuclear plant or an aircraft carrier.
In the ideal world, the asset owner wants to have a digital twin of the asset fed by different data sources through a digital thread. Of course, this environment will only be reliable when accurate data is used and presented.
Getting accurate data starts with the information acquisition process, and Rob explained that this needed to be done SMARTly – see the image below:
Rob zoomed in on the SMART keywords and the challenge of using the various standards to make the information SMARTly accessible, like the ISO 10303 / PLCS standard, the CFIHOS exchange standard and more. And then there is the ISO 8000 standard about data quality.
Click on the image to get smart.
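To give a feeling for what "contracting for information" could mean in practice, here is a minimal sketch – not Eurostep's approach, just an illustration – where delivered asset records are checked against the attributes agreed in the contract before acceptance; all attribute names are examples.

```python
# A minimal sketch: the contract lists the attributes every delivered asset
# record must contain, and a delivery is checked against it on receipt.
# Attribute names are illustrative only.

CONTRACTED_ATTRIBUTES = {"tag_number", "manufacturer", "model", "weight_kg"}

def check_delivery(records: list) -> list:
    """Return a list of findings; an empty list means the delivery is acceptable."""
    findings = []
    for record in records:
        missing = CONTRACTED_ATTRIBUTES - set(record)
        if missing:
            tag = record.get("tag_number", "<no tag>")
            findings.append(f"{tag}: missing {sorted(missing)}")
    return findings

delivery = [
    {"tag_number": "P-1001", "manufacturer": "ACME", "model": "X1", "weight_kg": 12.5},
    {"tag_number": "P-1002", "manufacturer": "ACME"},
]
print(check_delivery(delivery) or "Delivery accepted")
```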
Rob believes that AI might be the silver bullet, as it could help understand the data quality, ontology and context of the data, and even improve contracting by generating data clauses….
And there was a lot of AI ….
There was a dazzling presentation from Gary Langridge, engineering manager at Ocado, explaining their Ocado Smart Platform (OSP), which leverages AI, robotics, and automation to tackle the challenges of online grocery and allow their clients to excel in performance and customer responsiveness.
There was a significant AI component in his presentation, and if you are tired of reading, watch this video
But there was more AI – of the 25 sessions at this conference, 19 mentioned the potential or usage of AI somewhere in their talk – that is more than 75 %!
There was a dedicated closing panel discussion on the real business value of Artificial Intelligence in the PLM domain, moderated by Peter Bilello, with selected speakers from the conference: Sandeep Natu (CIMdata), Lars Fossum (SAP), Diana Goenage (Dassault Systemes) and Uyiosa Abusomwan (Eaton).
The discussion was realistic and helpful for the audience. It is clear that to reap the benefits, companies must explore the technology and use it to create valuable business scenarios. One could argue that many AI tools are already available, but the challenge remains that they have to run on reliable data. The data foundation is crucial for a successful outcome.
An interesting point in the discussion was the statement from Diana Goenage, who repeatedly warned that using LLM-based solutions has an environmental impact due to the amount of energy they consume.
We have a similar debate in the Netherlands – do we want the wind energy consumed by data centers (the big tech companies with a minimum workforce in the Netherlands), or should the Dutch citizens benefit from renewable energy resources?
Conclusion
There were even more interesting presentations during these two days, and you might have noticed that I did not advertise my content. This is because I have already reached 1600 words, but I also want to spend more time on the content separately.
My presentation was about PLM and Sustainability, a topic often covered at this conference. Unfortunately, only 25 % of the presentations touched on sustainability, and the AI hype overshadowed the topic.
Hopefully, it is not a sign of the times?

I am sharing another follow-up interview about PLM and Sustainability with a software vendor or implementer. Last year, in November 2023, Klaus Brettschneider and Jos Voskuil from the PLM Green Global Alliance core team spoke with Transition Technologies PSC about their GreenPLM offering and their first experiences in the field.
As we noticed with most first interviews, sustainability was a topic of discussion in the PLM domain, but it was still in the early discovery phases for all of us.
Last week, we spoke again with Erik Rieger and Rafał Witkowski, both working for Transition Technologies PSC, a global IT solution integrator in the PLM world known for their PTC implementation services. The exciting part of this discussion is that system integrators are usually more directly connected to their customers in the field and, therefore, can be the source of understanding of what is happening.
ecoPLM and more
Erik is a long-term PLM expert, and Rafał is the PLM Practice Lead for Industrial Sustainability. In the interview below, they shared their experiences with a first implementation pilot in the field and the value of their ecoPLM offering in the context of the broader PTC portfolio. And of course, we discussed topics closely related to these points and put them into a broader context of sustainability.
Enjoy the 34-minute discussion, and you are always welcome to comment or start a discussion with us.
The slides shown in this presentation and some more can be downloaded HERE.
What I learned
- The GreenPLM offering has been renamed ecoPLM, as TT PSC customers are focusing on developing sustainable products; it currently supports designers in understanding the carbon footprint of their products (see the sketch after this list).
- They are currently running an MVP approach with a Tier 1 automotive supplier to validate and improve their solution, and more customers are adding Design for Sustainability to their objectives, besides Time to Market, Quality and Cost.
- Erik will give a keynote speech at the Green PLM conference on November 14th in Berlin. The conference targets a German-speaking audience, although the papers are in English. You can still register and find more info here.
- TT PSC is one of the partners completing the PTC sustainability offering, working closely with their product management.
- A customer quote: “Sustainability makes PLM sexy again”
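As mentioned in the first bullet, here is a small sketch of the underlying principle – not the ecoPLM implementation, just an illustration with invented figures – of giving designers a carbon footprint per assembly by rolling up emission factors over an eBOM:

```python
# Toy roll-up: multiply each component's footprint factor by its quantity and
# sum over the eBOM to get the assembly footprint. All figures are invented.

EBOM = {
    "assembly": "A-100",
    "components": [
        {"part": "PN-1", "qty": 4, "kg_co2e_per_unit": 1.2},  # steel bracket
        {"part": "PN-2", "qty": 1, "kg_co2e_per_unit": 8.5},  # aluminium housing
        {"part": "PN-3", "qty": 2, "kg_co2e_per_unit": 0.3},  # fasteners
    ],
}

def assembly_footprint(ebom: dict) -> float:
    """Roll up the cradle-to-gate footprint of one assembly (kg CO2e)."""
    return sum(c["qty"] * c["kg_co2e_per_unit"] for c in ebom["components"])

print(f"{EBOM['assembly']}: {assembly_footprint(EBOM):.1f} kg CO2e")
```

In a real designer-facing solution, the emission factors would of course come from validated LCA databases rather than hard-coded values.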
Want to learn more?
Here are some links related to the topics discussed in our meeting:
- YouTube: ecoPLM: your roadmap for eco-friendly product development
- ecoPLM – a sustainable product development website
- YouTube: Win the Net-Zero Race with PLM (and PTC)
Conclusions
We are making great progress in supporting the design and delivery of more sustainable products – sustainability goes beyond marketing, as Rafał Witkowski mentioned – the journey has started. What do you see in your company?