It was a great pleasure to attend my favorite vendor-neutral PLM conference this year in Gothenburg—approximately 150 attendees, most of whom have expertise in the PLM domain.
We had the opportunity to learn new trends, discuss reality, and meet our peers.
The theme of the conference was: Value Drivers for Digitalization of the Product Lifecycle, a topic I have been discussing in my recent blog posts, as we need to help and educate companies to understand the importance of digitalization for their business.
The two-day conference covered various lectures – view the agenda here – and of course the topic of AI was part of half of the lectures, giving the attendees a touch of reality.
In this first post, I will cover the main highlight of Day 1.
Value Drivers for Digitalization of the Product Lifecycle
As usual, the conference started with Peter Bilello, president & CEO of CIMdata, stressing again that when implementing a PLM strategy, the maximum result comes from a holistic approach: look at the big picture instead of focusing on just one topic.
It was interesting to see again the classic graph (below) explaining the benefits of the end-to-end approach – I believe it is still valid for most companies; however, as I shared in my session the next day, implementing concepts of a Product Service System will require more of a DevOps-type graph (more next week).

Next, Peter went through CIMdata's critical dozen with some updates. You can look at the updated 2024 image here.
Some of the changes: Digital Thread and Digital Twin are merged – as Digital Twins do not run on documents. And instead of focusing on Artificial Intelligence only, CIMdata introduced Augmented Intelligence, as we should also consider solutions that augment human activities, not just replace them.
Peter also shared the results of a recent PLM survey where companies were asked about their main motivation for PLM investments. I found the result a little discouraging for several reasons:
The number one topic is still faster, cheaper and better – almost 65 % of the respondents see this as their priority. It illustrates that Sustainability has not yet reached a level of urgency; perhaps the topic is hidden within standards compliance.
Many of the companies with Sustainability in their mission should understand that a digital PLM infrastructure is the foundation for most initiatives, like Lifecycle Analysis (LCA). Sustainability is more than a matter of standards compliance, if it was mentioned there at all.
The second disappointing observation for the understanding of PLM is that customer support is mentioned by only 15 % of the companies. Again, connecting your products to your customers is the first step to a DevOps approach, and you need to be able to optimize your product offering to what the customer really wants.
Digital Transformation of the Value Chain in Pharma
The second keynote was from Anders Romare, Chief Digital and Information Officer at Novo Nordisk. Anders has been participating in the PDT conference in the past. See my 2016 PLM Roadmap/PDT Europe post, where Anders presented on behalf of Airbus: Digital Transformation through an e2e PLM backbone.
Anders started by sharing some of the main characteristics of the companies he has been working for: Volvo, Airbus and now Novo Nordisk. It is interesting to compare these characteristics as they say a lot about each industry's focus. See below:
Anders is now responsible for digital transformation in Novo Nordisk, which is a challenge in a heavily regulated industry.
One of the focus areas for Novo Nordisk in 2024 is also Artificial Intelligence, as you can see from the image to the left (click on it for the details).
Like many others at this conference, Anders mentioned that AI is only applicable when it runs on top of accurate data.
Understanding the potential of AI, they identified 59 areas where AI can create value for the business, and it is interesting to compare the traditional PLM curve Peter shared in his session with the potential AI-enabled drug-development curve as presented by Anders below:
Next, Anders shared some of the example cases of this exploration, and if you are interested in the details, visit their tech.life site.
When talking about the engineering framing of PLM, it was interesting that Anders, who had a long history in PLM before Novo Nordisk, replied to a question from the audience that he would never talk about PLM at the management level. It's very much aligned with my Don't mention the P** word post.
A Strategy for the Management of Large Enterprise PLM Platforms
One of the highlights for me on Day 1 was Jorgen Dahl‘s presentation. Jorgen, a senior PLM director at GE Aerospace, shared their journey towards a single PLM approach, needed due to changes in the business. Addressing the need for a digital thread also comes with an increased need for uptime.
I liked his strategy-to-execution approach, as shown in the image below, as it contains the most important topics: the business vision and understanding, imagining the end state, and asking What must be True?
In my experience, the three blocks are iteratively connected. When describing the strategy, you might not be able to identify the required capabilities and management systems yet.
But then, when you start to imagine the ideal end state, you will have to consider them. And for companies, it is essential to be ambitious – or, as Jorgen stated, uncomfortably ambitious. Aim for 75 % to almost 100 % of it to become true. Also, asking What must be True is an excellent way to involve people and creatively explore the next steps.
Note: This approach does not provide all the details, as it will be a multiyear journey of learning and adjusting towards the future. Therefore, the strategy must be aligned with the culture to avoid continuous top-down governance of the details. In that context, Jorgen stated:
“Culture is what happens when you leave the room.”
It is a more positive statement than Peter Drucker's famous quote: “Culture eats strategy for breakfast.”
Jorgen’s concluding slide may look like common knowledge; however, I believe the easy-to-digest points Jorgen used will help any organization step back, look at its initiatives, and compare where it can improve.
How a Business Capability Model and Application Portfolio Management Support Through Changing Times
Peter Vind‘s presentation was nicely connected to the presentation from Jorgen Dahl. Peter, who is an enterprise architect at Siemens Energy, started by explaining where the enterprise architect fits in an organization and comparing it to a city.
In his entertaining session, he mentioned he has to deal with the unicorns at the C-level, who, like politicians in a city, sometimes have the most “innovative” ideas – can they be realized?
Peter explained how they used Business Capability Modeling when Siemens Energy went through various business stages: first the carve-out from Siemens AG and later the merger with Siemens Gamesa. Their challenge was to understand which capabilities remain and which are new or overlapping, both during the carve-out and the merger.
The business capability modeling leads to a classification of the applications used at different levels of the organization, such as customer-facing, operational, or supporting business capabilities.
Next, for the lifecycle of the applications, the TIME approach (Tolerate, Invest, Migrate, Eliminate) was used, meaning that each application was mapped to business fitness and technical fitness. Click on the diagram to see the details.
The result could look like the mapping shown below – a comprehensive overview of where the action is.
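For readers who like to see the idea in a small example, below is a minimal sketch in Python of how such a TIME classification could be expressed. The application names, fitness scores and threshold are purely illustrative assumptions, not data from Peter's presentation.

```python
# Minimal sketch: mapping applications to TIME quadrants based on
# hypothetical business-fitness and technical-fitness scores (1-10).
# Thresholds and application names are illustrative assumptions only.

applications = {
    "Legacy PDM": {"business_fitness": 7, "technical_fitness": 3},
    "Homegrown BOM tool": {"business_fitness": 3, "technical_fitness": 2},
    "PLM backbone": {"business_fitness": 8, "technical_fitness": 8},
    "Old reporting app": {"business_fitness": 2, "technical_fitness": 7},
}

def time_quadrant(business: int, technical: int, threshold: int = 5) -> str:
    """Classify an application into Tolerate / Invest / Migrate / Eliminate."""
    if business >= threshold and technical >= threshold:
        return "Invest"      # fit for the business and technically healthy
    if business >= threshold:
        return "Migrate"     # valuable, but needs a better technical platform
    if technical >= threshold:
        return "Tolerate"    # technically fine, limited business value
    return "Eliminate"       # neither business nor technical fitness

for name, fit in applications.items():
    quadrant = time_quadrant(fit["business_fitness"], fit["technical_fitness"])
    print(f"{name:20s} -> {quadrant}")
```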
It is a rational approach; however, Peter mentioned that we should also be aware of the HiPPOs in an organization. If there is a HiPPO (Highest Paid Person’s Opinion) in play, you might face a political battle too.
It was a great educational session illustrating the need for an Enterprise Architect, the value of business capabilities modeling and the TIME concept.
And some more …
There were several other exciting presentations during Day 1; however, as not all presentations are publicly available, I cannot discuss them in detail – here are just a few notes.
Driving Trade Compliance and Efficiency
Peter Sandeck, Director of Project Management at TE Connectivity, shared what they did to motivate engineers to endorse their Jurisdiction and Classification Assessment (JCA) process. Peter showed how, through a Minimum Viable Product (MVP) approach and by listening to the end users, they reached a higher Customer Satisfaction (CSAT) score after several iterations of the solution developed for the JCA process.
This approach is an excellent example of an agile method in which engineers are involved. My remaining question: will the same engineers, in the short term, also be pushed to make lifecycle assessments? It means more work; however, I believe that if you make it personal, the same MVP approach could work again.
Value of Model-Based Product Architecture
Jussi Sippola, Chief Expert, Product Architecture Management & Modularity at Wärtsilä, presented an excellent story related to the advantages of a more modular product architecture. Where historically products were delivered based on customer requirements through the order-fulfillment process, there is now, in parallel, a portfolio-management process defining the platform of modules, features and options.
Jussi mentioned that they were able to reduce the number of parts by 50 % while still maintaining the same level of customer capabilities. In addition, thanks to modularity, they were able to reduce the production lead time by 40 % – essential numbers if you want to remain competitive.
Conclusion
Day 1 was a day where we learned a lot as an audience, and in addition, the networking time and dinner in the evening were precious for me and, I assume, also for many of the participants. In my next post, we will see more about new ways of working, the AI dream and Sustainability.
In recent years, I have assisted several companies in defining their PLM strategy. The good news is that these companies are talking first about a PLM strategy and not immediately about a PLM system selection.
In addition, a PLM strategy should not be defined in isolation but rather as an integral part of a broader business strategy. One of my favorite one-liners is:
“Are we implementing the past, or are we implementing the future?”
When companies implement the past, it feels like they modernize their current ways of working with new technology and capabilities. The new environment is more straightforward to explain to everybody in the company, and even the topic of migration can be addressed as migration might be manageable.
Note: Migration should always be considered – the elephant in the room.
I wrote about Migration Migraine in two posts earlier this year, one describing the basics and the second describing the lessons learned and the path to a digital future.
Implementing PLM now should be part of your business strategy.
Threats coming from different types of competitors, necessary sustainability-related regulations (e.g., CSRD reporting), and, on the positive side, new opportunities are coming (e.g., Product as a Service), all requiring your company to be adaptable to changes in products, services and even business models.
If your company wants to benefit from concepts like the Digital Twin and AI, it needs a data-driven infrastructure—
Digital Twins do not run on documents, and algorithms need reliable data.
Digital Transformation in the PLM domain means combining Coordinated and Connected working methods. In other words, you need to build an infrastructure based on Systems of Record and Systems of Engagement. Followers of my blog should be familiar with these terms.
PLM is not an R&D and Engineering solution
(any more)
One of the biggest misconceptions still made is that PLM is implemented as a single system mainly used by R&D and Engineering. These disciplines are considered the traditional creators of product data—a logical assumption at the time when PLM was more of a silo, managing projects with CAD and BOM data.
However, this misconception steers many discussions towards what is the best system for my discipline, more or less strengthening the silos in an organization. Being able to break the silos is one of the capabilities digitization brings.
Business and IT architecture are closely related. Perhaps you have heard about Conway’s law (from 1967):
“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization’s communication structure.”
This means that if you plan to implement or improve a PLM infrastructure without considering an organizational change, you will be locked again into your traditional ways of working – the coordinated approach, which is reflected on the left side of the image (click on it to enlarge it).
An organizational change impacts middle management, a significant category we often neglect. There is the C-level vision and the voice of the end user. Middle management has to connect them and still feel their jobs are not at risk. I wrote about it some years ago: The Middle Management Dilemma.
How do we adapt the business?
The biggest challenge of a business transformation is that it starts with the WHY and should be understood and supported at all organizational levels.
If there is no clear vision for change but a continuous push to be more efficient, your company is at risk!
For over 60 years, companies have been used to working in a coordinated approach, from paper-based to electronic deliverables.
- How do you motivate your organization to move in a relatively unknown direction?
- Who in your organization are the people who can build a digital vision and strategy?
These two questions are fundamental, and you cannot outsource ownership of them.
People in the transformation teams need to be digitally skilled (not geeks), communicators (storytellers), and, very importantly, connected to the business.
Often, the candidates come from the existing business units where they have proven skills. The challenging part is educating them and making them available for this mission.
Digital transformation is not a side job.
Education can come from the outside world. Making people available to work on the new digital infrastructure is a management decision and a matter of priorities.
How to get external support?
If you are connected to the PLM world like me, a lot of information is available: in academic papers, in projects and, in particular, on LinkedIn, where there is currently an overflow of architectural debates.
Recently, I participated in the discussions below:
- How to Solve PLM & ERP (Oleg Shilovitsky)
- Last week, we finally solved PLM & ERP (Prof. Dr. Jörg W. Fischer / Martin Eigner)
- PLM and MBOM: Supply Chain Debates and Future Solution Architecture (Oleg Shilovitsky)
- Could be a Knowledge Graph resp. the Linked Data technologies the key to …. (Matthias Ahrens)
The challenge with these articles is that they are for insiders and far from shareable with business people. There is always a discussion, as we are all learning to match theory with reality. For example, Prof. Dr. Jörg W. Fischer introduced the Information Architecture as a missing link. You can read his recent post here and the quote below to get interested:
All of these methods focus either on Data Architecture or Business Architecture. And the blind spot? I am convinced that an essential layer between the two is missing. We at STZ-RIM Reshape Information Management call this Information Architecture.
Still, we remain in the expert domain, which a limited group of people understands. We need to connect to the business. Where can we find more education from the business side?
The reaction below in one of the discussions says it all, in my opinion:
Starting from the business
What I have learned from my discussions with the management is:
- Don’t mention PLM – you will be cornered in the R&D / Engineering frame.
- Don’t explain their problems to them and then claim you have the solution (on PowerPoint)
- Create curiosity about topics that are relevant to the business – What if …?
- Use storytelling to imagine a future state – Spare the details.
- Build trust and confidence that you are not selling a product. Let the company discover their needs as it is their transformation.
The diagram below, presented by Yousef Hooshmand during the PLM Roadmap/PDT Europe 2023 conference in Paris, describes it all:
It will be a continuous, iterative process where, starting from business values and objectives, each implementation step is analyzed: how it fits in the PLM landscape and, ultimately, how measures and actions guide the implementation of the tools and technology.
It is important to stress that this is not the guidance for a system implementation; it is the guidance for a digital transformation journey. Therefore, the message in the middle of the image is: Long-term Executive Commitment!
In addition, I want to point to articles and blogs written by Jan Bosch. Jan is an executive, professor and consultant with more than 20 years of experience in large-scale software R&D management and business.
Although our worlds do not fully intersect yet – the management of mechanical products and of software is different – his principles fit better and better with a modern data-driven organization. Often, I feel we are fighting the same battle to coach companies in their business transformation.
In the context of this article, I recommend reviewing the BAPO model coming from the software world.
BAPO stands for Business, Architecture, Process and Organization. As the diagram below indicates, you should start from the business, defining the needs for the architecture and then the preferred ways of working. Finally, the organization has to be established in accordance with the processes.
Often, companies use the OPAB approach instead, which makes them feel more comfortable (Conway’s Law). For further reading in this context, I recommend Jan Bosch's related posts.
Business and technology
I want to conclude by discussing ways to connect business and technology as you need both.
First, I want to point to an example that we presented in the Federated PLM interest group on LinkedIn. Although the discussion initially focused on technical capabilities, we concluded by connecting them to business transformational needs. The diagram below is our characteristic image used to explain the interaction between Systems of Record (the vertical pillars) and the Systems of Engagement (the horizontal bars – modularity).

Have a look at the business discussion below:
Next, the diagram below comes from a 2017 McKinsey whitepaper: Toward an integrated technology operating model. Here, the authors describe how a company can move toward an integrated technology operating model using both coordinated and connected technologies.
They do not mention PLM; they have a business focus, and it is important to mention that a company can work in different modes. This is an organizational choice, but don’t let people work in two modes at the same time.
Conclusion
With this post, I hope I moved the focus from technology and tools to an understandable business focus. Even within my 1500-word limit, there is much more to say, and this makes our (PLM) mission so complex and interesting. Let me know where you can connect.

We, the PLM Green Global Alliance, started our first interviews with PLM-related software vendors two years ago in 2022 with SAP, and recently, we revisited them for a much broader interview.
The initial interview in 2022 focused on companies getting pushed by legislation related to plastic packaging and how they could collect and analyze their product data.
Now, two years later, we discussed a much broader scope, including the Circular Economy and even Circular Manufacturing in the automotive industry. You can read and listen to this interview following this link: The PGGA talking again with SAP on Sustainability.
However, as it is claimed that almost eighty percent of the environmental impact of a product is defined and decided during its design phase, we were eager to learn from the primary PLM vendors what they have observed.
PTC
We were fortunate to talk again with Dave Duncan, VP Sustainability at PTC, who had just returned from a three-month tour in Europe, talking with 200 manufacturers in 21 different locations and having deep discussions to understand the market and their customers’ needs.
You could follow his movements through Europe on LinkedIn, and his posting from the Munich workshop was fascinating. Besides meeting customers, there were also PTC partners like MakerSite, aPriori, and Transition Technologies PSC. All three companies have recently contributed to our PGGA series related to Sustainability.
Together with Dave, we spoke again with James Norman, who is responsible for driving PTC’s solutions and strategy for the digital and Sustainability transformation. He helped us make the connection between what’s happening in the field and what PTC is considering.
When listening to the interview, you will observe that in the PLM domain, so much has changed in the past two years.
Enjoy the 36 minutes of the interview and listen to what Dave has learned from the field, as reflected by James, on how PTC is addressing Sustainability.
Slides shown during the interview combined with additional company information can be found HERE.
What we have learned
- The Corporate Sustainability Reporting Directive (CSRD) has forced companies to address Sustainability and the need for the digitalization of their processes (the digital thread)
- For Sustainability impact, do not focus just on the component properties; identify hot spots by analyzing the impact at the product level.
- As the OEM often only assembles the final product, the environmental impact is defined upstream in the supply chain.
- Modularity and Systems Thinking are crucial methodologies for implementing a Circular Economy.
- If you only consider the cradle-to-gate part of a product’s lifecycle, you might miss the big picture entirely. Even worse, you might implement design changes in the name of sustainability that result in outcomes far less sustainable than the original design. It’s crucial to look at the entire Product Service System/lifecycle to truly understand a product’s environmental impact.
- We did not talk about Digital Twins and AI this time. Implementing a connected Digital Thread is, at this moment, the highest priority.
Want to learn more?
- There is the PTC Impact Report
- Dave Duncan’s article: Join PTC on Our Sustainability Journey
- A customer story: How Cummins Prioritizes Sustainability for New Product Designs
Conclusion
I enjoyed the dialogue with Dave and James and the progress we all have made towards understanding what is needed to ensure a sustainable future for our planet. So much has changed in two years.
PLM plays a crucial role in the discussion of a circular economy, the need for modularity, and sustainability reporting. All of these elements require a digital infrastructure related to the products we manufacture or use.
In addition, I was impressed by Dave’s pragmatic approach: he went to the hot spots of European manufacturing companies to understand their needs instead of telling them about should-be dreams.
Two weeks ago, I shared my first post about PDM/PLM migration challenges on LinkedIn: How to avoid Migration Migraine – part 1. Most of the content discussed data migrations.
It started with moving data stored in relational databases to modern object-oriented environments – the technology upgrade – but also covered the challenges a company can face when merging different data silos (CAD & BOM related) into a single PLM backbone to extend the support of product data beyond engineering.
Luckily, the post generated a lot of reactions and feedback through LinkedIn and personal interactions last week.
The amount of interaction illustrated the relevance of the topic for people; they recognized the elephant in the room, too.
Working with a partner
Data migrations and consolidation are typically not part of a company’s core business, so it is crucial to find the right partner for a migration project. The challenge with migrations is that there is potentially a lot to do technically, but only your staff can assess the quality and value of migrations.
Therefore, when planning a migration, make sure you work on it iteratively with an experienced partner who can provide a set of tools and best practices. Often, vendors or service partners have migration tools that still need to be tuned to your As-Is and To-Be environment.
To get an impression of what a PLM service partner can do and which topics or tools are relevant in the context of mid-market PLM, you can watch this xLM webinar on YouTube. So make sure you select a partner who is familiar with your PDM/PLM infrastructure and who has the experience to assess complexity.
Migration lessons learned
In my PLM coaching career, I have seen many migrations. In the early days, they were more related to technology upgrades, consolidation of data and system replacements. Nowadays, the challenges are more related to becoming data-driven. Here are five lessons that I learned in the past twenty years:
- A fixed price for the migration can be a significant risk as the quality of the data and the result are hard to comprehend upfront. In case of a fixed price, either you would pay for the moon (taking all the risk), or your service partner would lose a lot of money. In a sustainable business model, there should be no losers.
- Start (even now) with checking and fixing your data quality. For example, when you are aware of a mismatch between CAD assemblies and BOM data, analyze and fix discrepancies even before the migration (see the sketch after this list).
- One immediate action to take when moving from CAD assemblies to BOM structures is to check or fill the properties in the CAD system to support a smooth transition. Filling properties might be a temporary action, as later, when becoming more data-driven, some of these properties, e.g., material properties or manufacturer part numbers, should not be maintained in the CAD system anymore. However, they might help migration tools to extract a richer dataset.
- Focus on implementing an environment ready for the future. Don’t let your legacy data quality add complexity to it; instead, learn to live with legacy issues that will be fixed only when needed. A 100 % matching migration is not likely to happen, because the source data might also be incorrect, even after further analysis.
- The product should probably not be configured in the CAD environment, even if the CAD tool allows it. I had this experience with SolidWorks in the past. PDM became the enemy because the users managed all configuration options in the assembly files, making them hard to use at the BOM or product level (the connected digital thread).
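To make the second lesson more concrete, here is a minimal sketch in Python of a pre-migration consistency check between a CAD assembly export and a BOM export. The file names, column names and quantities are hypothetical assumptions; real exports differ per PDM/ERP system.

```python
# Minimal sketch: flag mismatches between a CAD assembly export and a BOM
# export before migration. File names and columns are hypothetical.
import csv
from collections import Counter

def load_usage(path: str, part_col: str, qty_col: str) -> Counter:
    """Read 'part number -> total quantity' from a CSV export."""
    usage = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            usage[row[part_col].strip()] += int(float(row[qty_col]))
    return usage

cad = load_usage("cad_assembly_export.csv", "PartNumber", "Instances")
bom = load_usage("erp_bom_export.csv", "Item", "Quantity")

only_in_cad = sorted(set(cad) - set(bom))
only_in_bom = sorted(set(bom) - set(cad))
qty_mismatch = sorted(p for p in set(cad) & set(bom) if cad[p] != bom[p])

print("Parts only in CAD :", only_in_cad)
print("Parts only in BOM :", only_in_bom)
print("Quantity mismatch :", qty_mismatch)
```

Even such a simple report already tells you where the cleanup effort will be before any migration tool is selected.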
The future is data-driven
In addition, these migration discussions made me aware again that many companies are still in the early phases of creating a unified PLM infrastructure and implementing the coordinated approach – an observation I shared in my report on the PDSFORUM 2024 conference.
Due to sustainability-related regulations and the need to understand product behavior in the field (Digital Twin / Product As A Service), becoming data-driven is an unavoidable target in the near future. Implementing a connected digital thread is crucial to remaining competitive and sustainable in business.
However, the first step is to gain insights about the available data (formats and systems) and its quality. Therefore, implementing a coordinated PLM backbone should immediately contain activities to improve data quality and implement a data governance policy to avoid upcoming migration issues.
Data-driven environments, the Systems of Engagement, bring the most value when connected through a digital thread with the Systems of Record (PLM, ERP and others). Therefore, design your processes – even the current ones – to be user-centric and data-centric, and build for change (see Yousef Hooshmand‘s story in this post – also the image below).
The data-driven Future is not a migration.
The last part of this article will focus on what I believe is a future PLM architecture for companies. To be more precise, it is not only a PLM architecture anymore. It should become a business architecture based on connected platforms (the systems of record) and inter-platform connected value streams (the systems of engagement).
The discussion is ongoing, and from the technical and business side, I recommend reading Prof. Dr. Jörg Fischer’s recent articles, for example, The Crisis of Digitalization – Why We All Must Change Our Mindset! or The MBOM is the Steering Wheel of the Digital Supply Chain! A lot of academic work has been done in the context of Teamcenter and SAP.
Also, Martin Eigner recently described in The Constant Conflict Between PLM and ERP a potential digital future of enterprise within the constraints of existing legacy systems.
In my terminology, they are describing a hybrid enterprise dominated by major Systems of Record complemented by Systems of Engagement to support optimized digital value streams.
Whereas Oleg Shilovitsky, coming from the System of Engagement side with OpenBOM, describes the potential technologies to build a digital enterprise as you can read from one of his recent posts: How to Unlock the Future of Manufacturing by Opening PLM/ERP to Connect Processes and Optimize Decision Support.
All three thought leaders talk about the potential of connected aspects in a future enterprise. For those interested in the details there is a lot to learn and understand.
For the sake of the migration story, I stay out of the details. However, it is interesting to mention that they also do not discuss data migration—is it the elephant in the room?
I believe moving from a coordinated enterprise to an integrated (coordinated and connected) enterprise is not a migration, as we are no longer talking about a single system that serves the whole enterprise.
The future of a digital enterprise is a federated environment where existing systems need to become more data-driven, and additional collaboration environments will have their internally connected capabilities to support value streams.
With this in mind, you can understand the 2017 McKinsey article – Toward an integrated technology operating model – and its leading image below:
And when it comes to the realization of such a concept, I have described the Heliple-2 project a few times before as an example of such an environment, where the target is to have a connection between the two layers through standardized interfaces, starting from OSLC. Or visit the Heliple Federated PLM LinkedIn group.
Data architecture and governance are crucial.
The image above generalizes the federated PLM concept and illustrates the two different systems connected through data bridges. As data must flow between the two sides without human intervention, the chosen architecture must be well-defined.
Here, I want to use a famous quote from Yousef Hooshmand’s paper From a Monolithic PLM Landscape to a Federated Domain and Data Mesh. Click on the image to listen to the Share PLM podcast with Yousef.
From a Single Source of Truth towards a principle of the Nearest Source of Truth based on a Single Source of Change
- If you agree with this quote, you have a future mindset of federated PLM.
- If you still advocate the Single Source of Truth, you are still in the Monolithic PLM phase.
It’s not a problem if you are aware that the next step should be federated and you are not ready yet.

In particular, environmental regulations and sustainability initiatives can only be addressed efficiently in data-driven, federated environments. Think about the European Green Deal with its upcoming Ecodesign for Sustainable Products Regulation (ESPR), which demands digital traceability of products, their environmental impact, and reuse/recycle options, expressed in the Digital Product Passport.
Sustainability reporting, greenhouse gas reporting and ESG reporting are becoming more and more mandatory for companies, driven either by regulations or by their customers. Only a data-driven, connected infrastructure can deal with this efficiently. Sustaira, a company we interviewed with the PLM Green Global Alliance last year, delivers such a connected infrastructure.
Read the challenges they meet in their blog post: Is inaccurate sustainability data holding you back?
Finally, to perform Life Cycle Assessments for design options or Life Cycle Analyses for operational products, you need real-time connections to data sources. The virtual design twin or the digital twin in operation does not run on documents.
Conclusion
Data migration and consolidation to modern systems is probably a painful and challenging process. However, the good news is that with the right mindset and a focus on data quality and governance, the next step to an integrated (coordinated and connected) enterprise will not be that painful. It can be an evolutionary process, as the McKinsey article describes.
In the past months, I have had several discussions related to migrating PLM data, either from one system to another or from consolidating a collection of applications into a single environment. Does this sound familiar?
Let me share some experiences and lessons learned to avoid the Migration Migraine.
It is not a technical guide but a collection of experiences and thoughts that you might have missed when focusing only on the technical dream.
Halfway through, I realized I was too ambitious; therefore, another post will follow this introduction. Here, I will focus on the business side and the digital transformation journey.
Garbage Out – Garbage In
The Garbage Out-In statement is somehow the paradigm we are used to in our day-to-day lives. When you buy a new computer, you use backup and restore. Even easier, nowadays, the majority of the data is already in the cloud.
This simple scenario assumes that all professional systems should be easily upgradeable. We have become unaware of the amount of data we store and its relevance.
This phenomenon already has a name: “Dark Data.” Dark Data consumes storage energy in the cloud and is no longer visible. Please read all about it here: Dark Data.
TIP 1: Every migration is a moment to clean up your data. By dragging everything with you, the burden of migrating becomes bigger. Even in easy migrations, do a clean-up—it prevents more extensive issues in the future.
Never follow the Garbage Out – Garbage In principle, even if it is easy!
Migrations in the PLM domain are different – setting the scene.
Before discussing the various scenarios, let’s examine what companies are doing. For early PLM adopters in the Automotive, Aerospace, and Defense Industries, migrations from mainframes to modern infrastructures have become impossible. The real problem is not only the changing hardware but also the changing data and data models.
For these companies, the solution is often to build an entirely new PLM infrastructure on top of the existing infrastructure, where manageable data pieces are migrated to new environments using data lakes, dashboards, and custom apps to support modern users.
Migration in this case is a journey as long as the data lives – and we can learn from them!
Follow the money
From a business perspective, migrations are considered a negative distractor. Talking about them raises awareness of their complexity, which might jeopardize enthusiasm.
For the initiator, the PLM software vendor or implementer, it might endanger the sales deal.
Traditional IT organizations strive for simplification—one CAD, one PLM or one ERP system to manage. Although this argument makes sense, an analysis should always be done comparing the benefits and the (migration) costs and risks to reach the ideal situation.
In those discussions, migrations are often downplayed.

Without naming companies, I have observed the downplaying several times, even at some prominent enterprises. So, if you recognize your company in this process, you are not alone.
TIP 2: Migrations are never simple. Make migration a serious topic in your PLM project – as important as the software. This means analyzing the potential migration risks and their mitigation.
Please read about the Xylem story in my recent post: The week after the PDSFORUM 2024
The Big Bang has the highest risk and might again lead to garbage out—garbage in.
You are responsible for your garbage.
It may sound disparaging, but it is not. Most companies are aware that people, tools and policies have changed over the years. Due to the coordinated approach to working, disciplines did not need to care about downstream or upstream usage of their initially created data – Excel and PDFs are the bridges between disciplines.

All the actual knowledge and context are stored in the heads of experienced employees who have gotten used to dealing with inconsistencies. And they will retire, so there is an urgent need for actual data quality and governance. Read more about the journey from Coordinated to Connected in these articles.
Even if you are not yet thinking about migrations, the digital transformation in the PLM domain is coming, and we should learn to work in a connected mode.
TIP 3: Create a team in your organization that assesses the current data quality and defines the potential future enterprise (data) architecture. Then, start improving the quality of the currently generated data. Similar to the ISO 900x standards for quality, the ISO 8000 standard already exists for data quality.
The future is data-driven; prepare yourself for the future.
Migration scenarios and their best practices
Here are some migration scenarios – two in this post and more in an upcoming post.
From Relational to Object-oriented
One of my earlier projects, starting in 2010 with SmarTeam, was migrating a mainframe-based application for airplane certification to a modern Microsoft infrastructure.
The goal was to create a new environment that could be used both as a replacement for the mainframe application and as the design and validation environment to implement changes to the current airplanes during a maintenance or upgrade activity.
The need was high because detailed documentation about the logic of the current application did not exist, and only one person who understood the logic was partly available.
So, internally, the relational database was a black box. The tables in the database contained a mix of item data, document data, change status and versions. The documents were stored in directories with meaningful file names but disconnected from the application.
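To give an idea of the plumbing such a project involves, here is a minimal sketch, assuming a purely hypothetical file-naming convention, of how documents stored in folders could be matched back to item records extracted from the legacy tables. The regex, folder and item identifiers are illustrative assumptions, not details from this project.

```python
# Minimal sketch: re-attach files stored in folders to item records using a
# hypothetical naming convention like "ITEM-10001_revB_report.pdf".
import re
from pathlib import Path

FILENAME_PATTERN = re.compile(r"^(?P<item>ITEM-\d+)_rev(?P<rev>[A-Z])_", re.IGNORECASE)

def match_files_to_items(folder: str, known_items: set) -> dict:
    """Group files per item id; unmatched files end up under 'UNMATCHED'."""
    matches = {"UNMATCHED": []}
    for path in Path(folder).rglob("*"):
        if not path.is_file():
            continue
        m = FILENAME_PATTERN.match(path.name)
        item = m.group("item").upper() if m else None
        if item in known_items:
            matches.setdefault(item, []).append(path)
        else:
            matches["UNMATCHED"].append(path)
    return matches

# Example usage with item ids extracted from the legacy database tables
legacy_items = {"ITEM-10001", "ITEM-10002"}
for item, files in match_files_to_items("legacy_documents", legacy_items).items():
    print(item, [f.name for f in files])
```

The size of the "UNMATCHED" bucket is usually a good early indicator of how reliable the "meaningful file names" really are.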
The initial estimate was that the project would take two to three months, so a fixed price for two months was agreed upon. However, it became almost a two-year project, and in the end, the result seemed to be reliable (there was never mathematical proof).
The disadvantage was that SmarTeam ended up being so highly customized that automatic upgrades would not work for this version anymore—a new legacy was created with modern technology.
The same story, combined with the example of Ericsson’s migration attempt, is described in the 2016 post, The PLM Migration Dilemma. For me, the lesson learned from these examples leads to the following recommendation.
TIP 4: When there is a paradigm change in the data model, don’t migrate but establish a new (data-driven) infrastructure and connect to your legacy as much as possible in read-only mode.
The automotive and aerospace industries’ story is one of paradigm change.
Listen to the SharePLM podcast Revolutionizing PLM: Insights from Yousef Hooshmand, where Yousef also discusses how to address this transition process.
CAD/PDM to PLM
Another migration step happens when companies decide to implement a traditional PLM infrastructure as a System of Record, merging PDM data (mainly CAD) and ERP data (the BOM).
Some of these companies have been working file-based and have stored their final CAD files in folders; others might have a local PDM system native to the 3D CAD. The EBOM usually existed digitally in ERP, and most of the time, it was not a “pure” EBOM but more of a hybrid EBOM/MBOM.

The image above shows that this type of migration can be very challenging as, in the source systems, there is not necessarily a consistent 3D CAD definition matching the BOM items. As the systems have been disconnected in the past, people have potentially added missing information or fixed information on the BOM side. As in most companies the manufacturing definition is based on drawings, consistency with the 3D CAD definition is not guaranteed.
To address this challenge, companies need to assess the usability of the CAD and BOM data. Is it possible to populate the CAD files with properties that are necessary for an import? For example, does the file path contain helpful information?
I have experienced a situation where a company has poorly defined 3D parts and no properties, as all the focus was on using the 3D to generate the 2D drawing.
The relevant details for manufacturing were then added to the drawing and no longer to the parts or models – traceability was almost impossible.
In this situation, importing the 3D CAD structures into the new PLM system has limited value. An alternative is to describe and test procedures for handling legacy data when it is needed, either to implement a design change or a new order. Leave the legacy accessible, but do not migrate.
The BOM side is, in theory, stable for manufactured products, as the data should have gone through a release process. However, the company needs to revisit its part definition process for new designs and products.
Some points to consider:
- Meaningful identifiers are not desired in a PLM system as they create a legacy. Therefore, the import of parts with smart identifiers should map them to relevant part properties besides the ID. Splitting the ID into properties will create broader usage in the future (a minimal parsing sketch follows after this list). Read more in Smart Part Numbers – do we need them?

- In addition, companies should try to avoid having logistic information, such as supplier-specific part numbers, come from the CAD system. Supplier parts in your CAD environment create inefficiencies when a supplier part becomes obsolete. Concepts such as the EBOM and MBOM, and potentially the SBOM, should be well understood during this migration.

- Concepts of EBOM and MBOM should also be introduced when moving from an ETO to a CTO approach or when modularity is a future business strategy.

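To illustrate the first point in the list above, here is a minimal sketch, assuming a purely hypothetical “smart” numbering scheme and code tables, of how a legacy identifier could be split into separate part properties during import. None of the codes below come from a real company.

```python
# Minimal sketch: split a hypothetical "smart" part number such as
# "PMP-SS-025-C" (family - material - size - revision) into separate
# properties during import. The scheme and code tables are assumptions only.
import re

FAMILY = {"PMP": "Pump", "VLV": "Valve"}
MATERIAL = {"SS": "Stainless steel", "CS": "Carbon steel"}

SMART_ID = re.compile(r"^(?P<family>[A-Z]{3})-(?P<material>[A-Z]{2})-(?P<size>\d{3})-(?P<rev>[A-Z])$")

def split_smart_id(part_number: str) -> dict:
    """Return the part number plus the properties encoded in it."""
    m = SMART_ID.match(part_number)
    if not m:
        return {"part_number": part_number, "note": "no smart pattern recognized"}
    return {
        "part_number": part_number,          # keep the original id for traceability
        "family": FAMILY.get(m["family"], m["family"]),
        "material": MATERIAL.get(m["material"], m["material"]),
        "size": int(m["size"]),
        "revision": m["rev"],
    }

print(split_smart_id("PMP-SS-025-C"))
print(split_smart_id("LEGACY-0042"))
```

Once the encoded meaning lives in searchable properties, the identifier itself can become a neutral number in the new PLM environment.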
Conclusion
As every company is on its PLM journey and technology is evolving, there will always be a migration discussion. Understanding and working towards the future should be the most critical driver for migration. Migrations in the PLM domain are often more than a data migration – new ways of working should be introduced in parallel. And for that reason the “big bang” is often too costly and demotivating for the future.
Our first PGGA interview with PLM-related software vendors was two years ago with SAP. At that time, Sustainability became more visible in corporate strategies, and regulations were imminent.
This time, Klaus Brettschneider and I want to learn what has happened related to Sustainability. Is there visible progress in their organizations and customer base? And what is hot now?
And we were positively surprised by a conversation going in many directions.
SAP
The interview was again with Darren West. Darren is the product expert for SAP’s Circular Economy solutions, and this time, Stephan Fester supported him. Stephan is co-leading the SAP Global Circular Manufacturing Practice and, therefore, is well connected to the field, having worked last year in particular with discrete manufacturing and discussions about circular manufacturing.
Thanks to the expertise of our guests, the discussion went in various directions, with circularity as the central theme.
We discussed the progress of the Responsible Design & Production module that was launched two years ago. We also discussed the Green Ledger and Carbon Accounting, of course in the context of circular manufacturing.
And we discussed the Digital Product Passport and Catena-X – what is it, and what is it targeting?
We also discussed how to deal with the scarcity of materials and materials harvesting. The interview could not be complete without mentioning AI.
Enjoy the 35-minute interview with Darren and Stephan on our YouTube channel.
The slides shown in this recording can be found here: PGGA talking again with SAP.
What we have learned
- Regulations heavily push SAP customers and require adequate reporting tools, not only for finance and material use but also for sustainability KPIs
- The Responsible Design & Production module launched two years ago is already in use with 60+ customers, showing the importance of having data-driven decision support for plastic packaging – to be extended to the product. Of course, as a PLM community, we are interested in understanding the next steps toward the product.
- The insights from Stephan Fester on circular manufacturing can be a logical evolution of the linear product process, as Stephan’s image shows.

- Great insights on Catena-X as an independent network for global data sharing
Want to learn more?
Events and Shows:
- SAP at Hannover Messe – April 22-26, 2024 – event information
- SAP Sapphire, Orlando, USA – June 3-5, 2024 – event information
- SAP Sapphire, Barcelona, Spain – June 11-13, 2024 – event information
Websites:
- Circular Manufacturing at Scale – Microsite
- SAP Responsible Design and Production – product page
- SAP Circular Economy page
- SAP Sustainability home page
Conclusion
It was a great discussion with a company that is quite active in supporting its customers on a sustainable journey. The journey is complex and has many aspects, as Darren and Stephan shared in this dialogue. The good news is that SAP’s customers are actively implementing measures and processes – going circular is happening!
Join the PDSFORUM next month and join me to get inspired and participate in a Think Tank session on Day 2 related to designing more sustainable products. Will we meet there?
Last week, I participated in the annual 3DEXPERIENCE User Conference, organized by the ENOVIA and NETVIBES brands. With approximately 250 attendees, the 2-day conference on the High-Tech Campus in Eindhoven was fully booked.
My PDM/PLM career started in 1990 in Eindhoven.
First, I spent a significant part of my school life there, and later, I became a physics teacher in Eindhoven. Then, I got infected by CAD and data management, discovering SmarTeam, and the rest is history.
As I wrote in my last year’s post, the 3DEXPERIENCE conference always feels like a reunion, as I have worked most of my time in the SmarTeam, ENOVIA, and 3DEXPERIENCE Eco-system.
Innovation Drivers in the Generative Economy
Stephane Declee and Morgan Zimmerman kicked off the conference with their keynote, talking about the business theme for 2024: the Generative Economy. Where the initial focus was on the Experience Economy and emotion, the Generative Economy includes Sustainability. It is a clever move as the word Sustainability, like Digital Transformation, has become such a generic term. The Generative Economy clearly explains that the aim is to be sustainable for the planet.
Stephane and Morgan talked about the importance of the virtual twin, which is different from digital twins. A virtual twin typically refers to a broader concept that encompasses not only the physical characteristics and behavior of an object or system but also its environment, interactions, and context within a virtual or simulated world. Virtual Twins are crucial to developing sustainable solutions.
Morgan concluded the session by describing the characteristics of the data-driven 3DEXPERIENCE platform and its AI fundamentals, illustrating all the facets of the mix of a System of Record (traditional PLM) and Systems of Engagement (MODSIM).
3DEXPERIENCE for All at automation.eXpress
Daniel Schöpf, CEO and founder of automation.eXpress GmbH, gave a passionate story about why, for his business, the 3DEXPERIENCE platform is the only environment for product development, collaboration and sales.
Automation.eXpress is a young but typical Engineering To Order company building special machinery and services in dedicated projects, which means that every project, from sales to delivery, requires a lot of communication.
For that reason, Daniel insisted that all employees communicate using the 3DEXPERIENCE platform on the cloud. So, there are no separate emails, chats, or other siloed systems.
Everyone should work connected to the project and the product as they need to deliver projects as efficiently and fast as possible.
Daniel made this decision based on his 20 years of experience with traditional ways of working—the coordinated approach. Now, starting from scratch in a new company without a legacy, Daniel chose the connected approach, an ideal fit for his organization, using the cloud solution for scalability, an essential criterion for a startup company.
My conclusion: this example shows the unique situation of an inspired leader with 20 years of experience in this business who does not choose the ways of working from the past, but starts a new company in the same industry based on a modern platform approach instead of individual traditional tools.
Augment Me Through Innovative Technology
Dr. Cara Antoine gave an inspiring keynote based on her own life experience and lessons learned from working in various industries, a major oil & gas company and major high-tech hardware and software brands. Currently, she is an EVP and the Chief Technology, Innovation & Portfolio Officer at Capgemini.
She explained how a life-threatening infection that caused blindness in one of her eyes inspired her to find ways to augment herself to keep on functioning.
With that, she drew a parallel with humanity, which has continuously been augmenting itself from prehistoric days to now at an ever-increasing speed of change.
The current augmentation is the digital revolution. Digital technology is coming, and you need to be prepared to survive – it is Innovate or Abdicate.
Dr. Cara continued expressing the need to invest in innovation (me: it was not better in the past 😉 ) – and, of course, with an economic purpose; however, it should go hand in hand with social progress (gender diversity) and creating a sustainable planet (innovation is needed here).
Besides the focus on innovation drivers, Dr. Cara always connected her message to personal interaction. Her recently published book Make it Personal describes the importance of personal interaction, even if the topics can be very technical or complex.
I read the book with great pleasure, and it was one of the cornerstones of the panel discussion next.
It is all about people…
It might be strange to have a session like this in an ENOVIA/NETVIBES User Conference; however, it is another illustration that we are not just talking about technology and tools.
I was happy to introduce and moderate this panel discussion, also using the iconic Share PLM image, which is close to my heart.
The panelists, Dr. Cara Antoine, Daniel Schöpf, and Florens Wolters, each actively led transformational initiatives with their companies.
We discussed questions related to culture, personal leadership and involvement, and concluded with many insights, including: create chemistry, identify a passion, empower diversity, and make a connection, as it can make or break your relationship.
And it is about processes.
Another trend I discovered is that cloud-based business platforms, like the 3DEXPERIENCE platform, switch the focus from discussing functions and features in tools to establishing platform-based environments, where the focus is more on data-driven and connected processes.
Some examples:
Data Driven Quality at Suzlon Energy Ltd.
Florens Wolters, who also participated in the panel discussion “It is all about people …”, explained how he took the lead to reimagine the Suzlon Energy Quality Management System using the 3DEXPERIENCE platform and ENOVIA – from a disconnected, fragmented, document-driven Quality Management System with many findings in 2020 to a fully integrated, data-driven management system with zero findings in 2023.
It is an illustration that a modern data-driven approach in a connected environment brings higher value to the organization, as all stakeholders in the addressed solution work within an integrated, real-time environment. No time is wasted searching for related information.
Of course, organizational change management is needed to convince people not to work in their favorite siloed systems, which might be dedicated to the job but are not designed for a connected future.
The image to the left was also a part of the “It is all about people”- session.
Enterprise Virtual Twin at Renault Group
The presentation of Renault was also an exciting surprise. Last year, they shared the scope of the Renaulution project at the conference (see also my post: The week after the 3DEXPERIENCE conference 2023).
Here, Renault mentioned that they would start using the 3DEXPERIENCE platform as an enterprise business platform instead of a traditional engineering tool.
Their presentation today, related to their Engineering Virtual Twin, was an example of that. Instead of using their document-based SCR (Système de Conception Renault – the Renault Design System), with over 1000 documents describing processes connected to over a hundred KPIs, they have been modeling their whole business architecture and processes in UAF using a System of Systems approach.
The image above shows Franck Gana, Renault’s engineering transformation chief officer, explaining the approach. We could write an entire article about the details of how, again, the 3DEXPERIENCE platform can be used to provide a real-time virtual twin of the actual business processes, ensuring everyone is working on the same referential.
Bringing Business Collaboration to the Next Level with Business Experiences
To conclude this section about the shifting focus toward people and processes instead of system features, Alizée Meissonnier Aubin and Antoine Gravot introduced a new offering from 3DS, the marketplace for Business Experiences.

According to the HBR article, workers switch between applications an average of 1200 times per day, spending about 9 % of their time reorienting themselves after toggling.
1200 is a high number and a plea for working in a collaboration platform instead of siloed systems (the Systems of Engagement, in my terminology – data-driven, real-time connected). The story has been told before by Daniel Schöpf, Florens Wolters and Franck Gana, who shared the benefits of working in a connected collaboration environment.
The announced marketplace will be a place where customers can download Business Experiences.
There was more ….
There were several engaging presentations and workshops during the conference. But, as we reach 1500 words, I will mention just two of them, which I hope to come back to in a later post with more detail.
- Delivering Sustainable & Eco Design with the 3DS LCA Solution
Valentin Tofana from Comau, an Italian multinational company in automation committed to more sustainable products, shared his experiences and lessons learned from starting to use the 3DS LifeCycle Assessment tools on the 3DEXPERIENCE platform.
This session gave such a clear overview that we will come back with the PLM Green Global Alliance in a separate interview.
- Beyond PLM. Productivity is the Key to Sustainable Business
Neerav Mehta from L&T Energy Hydrocarbon demonstrated how they have implemented a virtual twin of the plant, allowing everyone to navigate, collaborate and explore all activities related to the plant. I was already promoting this concept for Oil & Gas EPC companies in 2013 (PLM for all industries) – at that time, an immense performance and integration challenge. Now, ten years later, thanks to the capabilities of the 3DEXPERIENCE platform, it has become a workable reality. Impressive.
Conclusion
Again, I learned a lot during these days, seeing the architecture of the 3DEXPERIENCE platform growing (image below). In addition, more and more companies are shifting their focus to real-time collaboration processes in the cloud on a connected platform. Their testimonies illustrate that to be sustainable in business, you have to augment yourself with digital.
Note: Dassault Systèmes did not cover any of the costs for me to attend this conference. I picked the topics close to my heart and was encouraged by all the conversations I had.
This post shares our second interview this year in the PLM Green Global Alliance series, where we talk with PLM-related software vendors about their activities related to Sustainability. Last year, we spoke mainly with the more traditional PLM vendors, but this year, we started with Makersite, a company specializing in Product Lifecycle Intelligence supporting sustainability analysis.
And now we are happy to talk with Mark Rushton, Senior Product Marketing Manager, and Ryan Flavelle, Associate Product Owner, both at aPriori Technologies. For my PGGA partner Mark Reisig and me, it was an interesting discussion in a domain where, for once, the focus was not on product design.
aPriori
aPriori, according to their website, focuses on Digital Manufacturing, digitizing the entire manufacturing process from design to production, and is therefore able to assess environmental impact in a reliable manner.
It was an informative dialogue. Watch the 35-minute discussion here and learn how aPriori uniquely digitizes the manufacturing processes to support Sustainability.
Slides shown during the interview combined with additional company information can be found HERE.
What we have learned
- aPriori’s customers have pushed the company to provide faster, digital sustainability insights into their manufacturing processes, illustrating that companies are really acting to understand their environmental impact. To measure is to know.
- In this interview, we saw the concepts of the digital twin of manufacturing processes and the digital twin of a plant.
- aPriori uniquely starts its impact analysis from the 3D CAD geometry, which is more accurate than the BOM-based assessment most LCA tools perform.
Want to learn more?
Here are some links to the topics discussed in our meeting:
- How does aPriori software work – watch this demo
- aPriori’s Sustainability resources can be found here
- The blog Mark Reisig liked: the aPriori blog
- And the famous Manufacturing Insights Podcast
Conclusions
When it comes to sustainability in action, you need to be able to measure and understand your environmental impact. Where traditional PLM activities focus on the design phase, there is also a lot to learn during the manufacturing phase. aPriori is doing this in a unique manner, not just based on BOM analysis. In addition, companies like aPriori already have long-term experience with the virtual twin for manufacturing, originally used for cost and manufacturability analysis. Now it has been extended to sustainability, and their customers are actively working with it.
We are happy to start the year with the next round of the PLM Green Global Alliance (PGGA) series: PLM and Sustainability. This year, we will speak with some new companies, and we will also revisit some of our previous guests to learn about their progress.
While we have already talked with Aras, Autodesk, CIMdata, Dassault Systèmes, PTC, SAP, Sustaira and Transition Technologies PSC, there are still many software companies with an exciting portfolio related to sustainability.
Therefore, we are happy to talk this time with Makersite, a company whose AI-powered Product Lifecycle Intelligence software, according to their home page, brings together your cost, environment, compliance, and risk data in one place to make smarter, greener decisions, powered by the deepest understanding of your supply chain. Let’s explore.
Makersite
We were lucky to have a stimulating discussion with Neil D’Souza, Makersite’s CEO and founder, who has been active in the field of sustainability for almost twenty years, even before it became a cool (or disputed) profession.
It was an exciting dialogue where we enjoyed realistic answers without all the buzzwords and marketing terms often used in the new domain of sustainability. Enjoy the 39 minutes of interaction below:
Slides shown during the interview combined with additional company information can be found HERE.
What we have learned
- Makersite’s mission, to enable manufacturers to make better products, faster, initially applied to economic parameters, can be easily extended with sustainability parameters. The power of Makersite is that it connects to enterprise systems and sources, using AI, Machine Learning and algorithms to support reporting views on compliance, sustainability, costs and risk.
- Compliance and sustainability are the areas where I see a significant need for companies to invest. It is not a revolutionary business change but an extension of scope. We discussed this in the context of the stage-gate process, where sustainability parameters should be added at each gate.
- Neil has an exciting podcast, Five Lifes to Fifty, where he discusses the path to sustainable products with co-hosts Shelley Metcalfe and Jim Fava, and recently, they discussed sustainability in the context of the stage-gate process.
- Again, to move forward with sustainability, it is about creating the base and caring about the data internally to understand what is happening, and from there enabling value engineering, including your suppliers where possible (IP protection remains a topic). This confirms that digital transformation (the connected way of working) is needed for both business and sustainability.
Want to learn more?
Here are some links to the topics discussed in our meeting:
- The Website – Makersite.io
- Makersite data foundation – makersite-data-foundation
- Makersite demo video – makersite-platform-demo
- Neil’s LinkedIn – neilsaviodsouza
Conclusions
With Makersite, we discovered an experienced company that has used its expertise in cost, compliance and risk analysis, including supply chains, and extended it to the domain of sustainability. As their technology partners page shows, they can be complementary in many industries and enterprises.
We will see another complementary solution soon in our following interview. Stay tuned.
Two weeks ago, this post from Ilan Madjar drew my attention. He pointed to a demo movie, explaining how to support Smart Part Numbering on the 3DEXPERIENCE platform. You can watch the recording here.
I was surprised that Smart Part Numbering is still used, and if you read through the comments on the post, you see the various arguments that exist.
- “Many mid-market customers are still using it”
me: I think it is not only the mid-market; however, that argument is no reason to keep it alive.
- “The problem remains in the customer’s desire (or need or capability) for change.”
me: This is about taking the path of least resistance.
- “User resistance to change. Training and management sponsorship has proven to be not enough.”
me: Probably because the discussions are feature-oriented instead of starting from the business benefits.
- “Cost and effort – rolling this change through downstream systems. The cost and effort of changing PN in PLM, ERP, MES, etc., are high. Trying to phase it out across systems is a recipe for disaster.”
me: The hidden costs of maintaining Smart Numbers inside an organization are high and invisible, reducing the company’s competitiveness.
- “Existing users often complain that it takes seconds to minutes more for unintelligent PN vs. using intelligent PN.”
me: If we talk about a disconnected user without access to information, this could be true, provided the number of Smart Numbers to comprehend is low.
There were many other arguments for why you should not change. It reminded me of the image below:

Smart Numbers related to the Coordinated approach
Smart Part Numbers are a characteristic of best practices from the past, when people were working in different systems and moving information from one system to another was done manually.

For example, re-entering the Bill of Materials from the PDM system into the ERP system, or attaching drawings to materials/parts in the ERP system. In the latter case, the filename often reflects the material or part number.
The problems with the coordinated, smart numbering approach are:
- New people in the organization need to learn the meaning of the numbering scheme. This learning process reduces the flexibility of an organization and increases the risk of making errors.
- Typos go unnoticed when transferring numbers from one system to another and are only discovered late, when the cost of fixing the error might be 10 to 100 times higher.
- The argument that people will understand the meaning of a part is only partly valid. A person can make a good guess about the part based on the smart part number; however, the details can be different, unless you work every day with the same, small range of parts.
- Smart Numbers create a legacy. After Mergers and Acquisitions, there will be multiple part numbering schemes. Do you want to renumber old parts, meaning non-value-added, risky activities? Or do you want to continue with various numbering schemes, meaning people need to learn more than one scheme – a higher entry barrier and a higher risk of errors?

There were and still are many advanced smart numbering systems.
In one of my first PDM implementations in the Netherlands, I learned about the 12NC code system from Philips, introduced at Philips in 1963 and used to identify complete products, documentation, and bare components, up to the finest detail. Today, many companies in the Philips family (suppliers or offspring) still use this numbering system, illustrating that it is not only small and medium enterprises that are reluctant to change.
The costs of working with Smart Part Numbers are often unnoticed as they are considered a given.
From Coordinated to Connected
Digital transformation in the PLM domain means moving from coordinated practices toward practices that benefit from connected technology. In many of my blog posts, you can read why organizations need to learn to work in a connected manner, both for their business sustainability and for being able to deal with sustainability-related regulations in the short term.
GHG reporting, ESG reporting, material compliance, and the DPP are all examples of the outside world pushing companies to work connected. Besides the regulations, if you are in a competitive business, you must be more efficient, innovative and faster than your competitors.

In a connected environment, relations between artifacts (datasets) are maintained in an IT infrastructure without requiring manual data transformations or people to process the data. In a connected enterprise, this non-value-added work will be reduced.
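To make the difference with the coordinated approach a bit more tangible, here is a minimal sketch in Python – purely illustrative and not based on any specific PLM system or vendor API – of how relations between artifacts can be stored as explicit, machine-readable links instead of being encoded in smart numbers or matching filenames.

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class Artifact:
    """One dataset in the connected environment: a part, a drawing, a BOM line, etc."""
    kind: str                                   # e.g. "part", "drawing"
    properties: dict                            # meaning lives in properties, not in the identifier
    id: str = field(default_factory=lambda: str(uuid.uuid4()))
    links: list = field(default_factory=list)   # explicit, typed relations to other artifacts

# A part and its drawing are related through a stored link,
# not through a shared smart number or a matching filename.
part = Artifact(kind="part", properties={"description": "Bracket, stainless steel", "status": "Released"})
drawing = Artifact(kind="drawing", properties={"sheet": "A3", "revision": "B"})
part.links.append({"type": "defined-by", "target": drawing.id})

# Any connected application can resolve the relation without re-typing or decoding numbers.
print(part.id, "->", [link["target"] for link in part.links])
```

The names and attributes above are hypothetical; the point is simply that once relations are data, no human has to interpret or re-enter a number to keep systems aligned.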
How to move away from Smart Numbering systems?
Several comments related to the Smart Numbering discussion mentioned that changing the numbering system is too costly and risky to implement and that no business case exists to support it. This statement only makes sense if you want your business to become obsolete slowly. Modern best practices based on digitization should be introduced as fast as possible, allowing companies to learn and adapt. There is no need for a big bang.
- Start with mapping and prioritizing the value streams in your company. Where do we see the most significant business benefits related to cost of handling, speed, and quality?
Note: It is not necessary to start with engineering, even though they might be the creators of the data – start, for example, with the xBOM flow, where the xBOM can be a concept BOM, the engineering BOM, the Manufacturing BOM, and more. Building this connected data flow is an investment for every department; do not start from the systems.
- Next point: Do not rename or rework legacy data. These activities do not add value; they can only create problems. Instead, build new process definitions that do not depend on the smartness of the number.
Make sure these objects have, besides the part number, the right properties, the right status, and the right connections. In other words, create a connected digital thread – first internally in your company and next with your ecosystem (OEMs, suppliers, vendors).
- Next point: Give newly created artifacts a guaranteed unique ID, independent of others. Each artifact has its own status, properties and context. In this step, it is time to break any 1:1 relation between a physical part and a CAD part or drawing. If a document gets revised, it gets a new version, but the version change should not always lead to a part number change. You can find many discussions on why to decouple parts and documents and the flexibility it provides.
- Next point: Newly generated IDs are not necessarily created in a single system. The idea of a single source of truth is outdated. Build your infrastructure upon existing standards where possible. For example, the UID of the Digital Product Passport will be based on the ISO/IEC 15459 standard, similar to the UID for retail products managed by the GS1 standard. Or, probably closer to home, look into your computer’s registry, and you will discover a lot of software components with a unique ID that specific programs or applications can use in a shared manner (see the sketch below).
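As a small illustration of the last two points, the sketch below shows how any application can generate a guaranteed unique, non-intelligent identifier locally. It is my own simplification in Python, not tied to ISO/IEC 15459, GS1 or any vendor tool; the classification scheme and function name are made up for the example. The idea is that description, classification and status become explicit, searchable attributes instead of being squeezed into the number.

```python
import uuid

def new_part_record(description: str, classification: str, status: str = "In Work") -> dict:
    """Create a part record with a meaningless, globally unique identifier.

    The UUID can be generated by any system independently, so no central
    smart-number generator is needed; the 'intelligence' is carried by
    separate attributes that remain searchable and changeable.
    """
    return {
        "id": str(uuid.uuid4()),           # unique by construction, carries no meaning
        "description": description,
        "classification": classification,  # e.g. an internal taxonomy code (illustrative)
        "status": status,
    }

bracket = new_part_record("Bracket, stainless steel", "fastening/bracket")
print(bracket["id"])  # e.g. '8f14e5c3-...'; reclassifying the part never forces a renumbering
```

Whether you use UUIDs, a standard-based UID or a plain database sequence is an implementation choice; what matters is that the identifier no longer has to be decoded by people.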
When will it happen?
In January 2016, I wrote about “the impact of non-intelligent part numbers”, and surprisingly, almost 8 years later, we are still in the same situation.
I just read Oleg Shilovitsky’s post The Data Dilemma: Why Engineers and Manufacturing Companies Struggle to Find Time for Data Management where he mentions Legacy Systems and Processes, Overwhelming Workloads, Lack of (Data) Expertise, Short-Term Focus and Resource Constraints as inhibitors.
You probably all know the above cartoon. How can companies get out of this armor of habits? Will they be forced by the competition or by regulations? What do you think?
Conclusion
Despite proven business benefits and insights, it remains challenging for companies to move toward modern, data-driven practices where Smart Number generators are no longer needed. When talking one-on-one to individuals, they are convinced a change is necessary, and they are pointing to the “others”.
I wish you all a prosperous 2024 and the power to involve the “others”.