In the past months, I have had several discussions related to migrating PLM data, either from one system to another or by consolidating a collection of applications into a single environment. Does this sound familiar?

Let me share some experiences and lessons learned to avoid the Migration Migraine.

It is not a technical guide but a collection of experiences and thoughts you might have missed when treating migration as a purely technical exercise.

Halfway through, I realized I was too ambitious; therefore, another post will follow this introduction. Here, I will focus on the business side and the digital transformation journey.

 

Garbage Out – Garbage In

The Garbage Out – Garbage In approach is somehow the paradigm we are used to in our day-to-day lives. When you buy a new computer, you use backup and restore. Even easier: nowadays, the majority of the data is already in the cloud.

This simple scenario assumes that all professional systems should be easily upgradeable. We become unaware of the amount of data we store and its relevance.

This phenomenon already has a name: “Dark Data.” Dark Data consumes storage and energy in the cloud while no longer being visible. Please read all about it here: Dark Data.

TIP 1: Every migration is a moment to clean up your data. By dragging everything with you, the burden of migrating becomes bigger. Even in easy migrations, do a clean-up—it prevents more extensive issues in the future.

Never follow the Garbage Out – Garbage In principle, even if it is easy!
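To make TIP 1 tangible, here is a minimal sketch in Python of a dark-data inventory: it flags files that have not been touched for years and files that are byte-for-byte duplicates – typical clean-up candidates before a migration. The vault path and the ten-year threshold are my own assumptions, to be adapted to your environment.

```python
import hashlib
from datetime import datetime, timedelta
from pathlib import Path

VAULT = Path("/legacy/vault")            # hypothetical legacy file store
STALE_AFTER = timedelta(days=10 * 365)   # assumed threshold: untouched for ten years

seen_hashes: dict[str, Path] = {}
now = datetime.now()

for f in VAULT.rglob("*"):
    if not f.is_file():
        continue
    age = now - datetime.fromtimestamp(f.stat().st_mtime)
    # Hashing every file is fine for a sketch; a real vault scan would sample or batch.
    digest = hashlib.sha256(f.read_bytes()).hexdigest()
    if age > STALE_AFTER:
        print(f"STALE     {f} (untouched for {age.days} days)")
    if digest in seen_hashes:
        print(f"DUPLICATE {f} == {seen_hashes[digest]}")
    else:
        seen_hashes[digest] = f
```

Even such a trivial scan usually triggers the right discussion: who owns this data, and does it deserve to be migrated?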

 

Migrations in the PLM domain are different – setting the scene

Before discussing the various scenarios, let’s examine what companies are doing. For early PLM adopters in the Automotive, Aerospace, and Defense Industries, migrations from mainframes to modern infrastructures have become impossible. The real problem is not only the changing hardware but also the changing data and data models.

For these companies, the solution is often to build an entirely new PLM infrastructure on top of the existing infrastructure, where manageable data pieces are migrated to new environments using data lakes, dashboards, and custom apps to support modern users.

Migration in this case is a journey as long as the data lives – and we can learn from them!

 

Follow the money

From a business perspective, migrations are considered a negative distractor. Talking about them raises awareness of their complexity, which might jeopardize enthusiasm.

For the initiator, the PLM software vendor or implementer, it might endanger the sales deal.

Traditional IT organizations strive for simplification—one CAD, one PLM, one ERP system to manage. Although this argument makes sense, an analysis should always be done comparing the benefits against the (migration) costs and risks of reaching the ideal situation.

In those discussions, migrations are often downplayed.

Without naming companies, I have observed the downplaying several times, even at some prominent enterprises. So, if you recognize your company in this process, you are not alone.

TIP 2: Migrations are never simple. Make migration a serious topic of your PLM project – as important as the software. This means analyzing the potential migration risks and the mitigations needed.

Please read about the Xylem story in my recent post: The week after the PDSFORUM 2024

The Big Bang has the highest risk and might again lead to garbage out—garbage in.

 

You are responsible for your garbage.

It may sound disparaging, but it is not. Most companies are aware that people, tools and policies have changed over the years. Due to the coordinated approach to working, disciplines did not need to care about downstream or upstream usage of the data they initially created – Excel files and PDFs are the bridges between disciplines.

All the actual knowledge and context are stored in the heads of experienced employees who have gotten used to dealing with inconsistencies. And they will retire, so there is an urgent need for actual data quality and governance. Read more about the journey from Coordinated to Connected in these articles.

Even if you are not yet thinking about migrations, the digital transformation in the PLM domain is coming, and we should learn to work in a connected mode.

TIP 3: Create a team in your organization that assesses the current data quality and defines the potential future enterprise (data) architecture. Then, start improving the quality of the currently generated data. Just as the ISO 900x standards exist for quality management, the ISO 8000 standard already exists for data quality.
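What could such an assessment look like in practice? As a minimal sketch – the file name and column names below are my assumptions for a typical part-master export, not a standard – a few lines of Python already reveal the completeness and uniqueness issues that ISO 8000 is concerned with:

```python
import pandas as pd

# Hypothetical part-master export; adjust the file and columns to your own data model.
parts = pd.read_csv("part_master_export.csv")

report = {
    "total parts": len(parts),
    "duplicate part numbers": int(parts["part_number"].duplicated().sum()),
    "missing descriptions": int(parts["description"].isna().sum()),
    "missing material": int(parts["material"].isna().sum()),
}

for check, count in report.items():
    print(f"{check}: {count}")
```

The numbers themselves matter less than the trend: measure them regularly, and make someone accountable for improving them.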

The future is data-driven; prepare yourself for the future.

 

Migration scenarios and their best practices

Here are some migration scenarios – two in this post and more in an upcoming post.

 

From Relational to Object-oriented

One of my earlier projects, starting in 2010 with SmarTeam, was migrating a mainframe-based application for airplane certification to a modern Microsoft infrastructure.

The goal was to create a new environment that could be used both as a replacement for the mainframe application and as the design and validation environment to implement changes to the current airplanes during a maintenance or upgrade activity.

The need was high because detailed documentation about the logic of the current application did not exist, and only one person who understood the logic was partly available.

So, internally, the relational database was a black box. The tables in the database contained a mix of item data, document data, change status and versions. The documents were stored in directories with meaningful file names but disconnected from the application.
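To give an idea of the detective work involved: the only way to reconnect the orphaned documents was through their file naming conventions. A simplified sketch of such matching logic is shown below – the naming pattern is a made-up example; in reality, the conventions had to be reverse-engineered per document type.

```python
import re
from pathlib import Path

# Assumed convention: <item>_<revision>_<doctype>.<ext>, e.g. AC1234_B_CERT.pdf
PATTERN = re.compile(r"^(?P<item>[A-Z]{2}\d{4})_(?P<rev>[A-Z])_(?P<doctype>\w+)$")

def match_document(path: Path):
    """Derive the owning item, revision and document type from a legacy file name."""
    m = PATTERN.match(path.stem)
    if m is None:
        return None  # goes to the manual clean-up pile
    return m["item"], m["rev"], m["doctype"]

for doc in Path("/legacy/documents").rglob("*.pdf"):
    print(f"{doc.name} -> {match_document(doc) or 'UNMATCHED'}")
```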

The initial estimate was that the project would take two to three months, so a fixed price for two months was agreed upon. However, it became almost a two-year project, and in the end, the result seemed to be reliable (there was never mathematical proof).

The disadvantage was that SmarTeam ended up being so highly customized that automatic upgrades would not work for this version anymore—a new legacy was created with modern technology.

The same story, combined with the example of Ericsson’s migration attempt, is described in the 2016 post, The PLM Migration Dilemma. For me, the lesson learned from these examples leads to the following recommendation.

TIP 4: When there is a paradigm change in the data model, don’t migrate but establish a new (data-driven) infrastructure and connect to your legacy as much as possible in read-only mode.

The automotive and aerospace industries’ story is one of paradigm change.

Listen to the SharePLM podcast Revolutionizing PLM: Insights from Yousef Hooshmand, where Yousef also discusses how to address this transition process.

 

CAD/PDM to PLM

Another migration step happens when companies decide to implement a traditional PLM infrastructure as a System of Record, merging PDM data (mainly CAD) and ERP data (the BOM).

Some of these companies have been working file-based and have stored their final CAD files in folders; others might have a local PDM system native to the 3D CAD. The EBOM usually existed digitally in ERP, and most of the time, it is not a “pure” EBOM but more of a hybrid EBOM/MBOM.

The image above shows that this type of migration can be very challenging, as the source systems do not necessarily contain a consistent 3D CAD definition matching the BOM items. Because the systems were disconnected in the past, people have potentially added missing information or fixed information on the BOM side. And since in most companies the manufacturing definition is based on drawings, consistency with the 3D CAD definition is not guaranteed.

To address this challenge, companies need to assess the usability of the CAD and BOM data. Is it possible to populate the CAD files with properties that are necessary for an import? For example, does the file path contain helpful information?
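As a small illustration of this idea – the folder layout below is an assumption, not a recipe – extracting import properties from the file path can be as simple as:

```python
from pathlib import Path

def properties_from_path(cad_file: Path) -> dict:
    """Derive import properties from an assumed folder convention:
    /projects/<project>/<discipline>/<filename>"""
    parts = cad_file.parts
    return {
        "project": parts[-3],
        "discipline": parts[-2],
        "source_file": cad_file.name,
    }

print(properties_from_path(Path("/projects/P-2041/piping/PUMP-BRACKET.sldprt")))
# {'project': 'P-2041', 'discipline': 'piping', 'source_file': 'PUMP-BRACKET.sldprt'}
```

If the paths carry no usable information, the assessment should honestly conclude that an automated import will not add much value.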

I have experienced a situation where a company has poorly defined 3D parts and no properties, as all the focus was on using the 3D to generate the 2D drawing.

The relevant details for manufacturing were then added to the drawing and no longer to the parts or models – traceability was almost impossible.

In this situation, importing the 3D CAD structures into the new PLM system has limited value. An alternative is to describe and test procedures for handling legacy data when it is needed, either to implement a design change or a new order. Leave the legacy accessible, but do not migrate.

The BOM side is, in theory, stable for manufactured products, as the data should have gone through a release process. However, the company needs to revisit its part definition process for new designs and products.

Some points to consider:

  1. Meaningful identifiers are not desired in a PLM system as they create a legacy. Therefore, the import of parts with smart identifiers should map the ID to relevant part properties besides the ID itself. Splitting the ID into properties will create broader usage in the future – see the sketch after this list. Read more in Smart Part Numbers – do we need them?
  2. In addition, companies should try to avoid having logistics information, such as supplier-specific part numbers, come from the CAD system. Supplier parts in your CAD environment create inefficiencies when a supplier part becomes obsolete. Concepts such as the EBOM and MBOM, and potentially the SBOM, should be well understood during this migration.
  3. Concepts of EBOM and MBOM should also be introduced when moving from an ETO to a CTO approach or when modularity is a future business strategy.
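As mentioned in point 1, here is a minimal sketch of what “splitting the ID into properties” could mean during an import. The numbering scheme is invented for illustration; every company encodes different things in its smart numbers.

```python
import re

# Invented smart-number scheme: <family(2)><material(1)><sequence(4)>-<size>,
# e.g. "BRS0042-M8": BR = bracket, S = steel, 0042 = sequence, M8 = size.
SMART_ID = re.compile(r"^(?P<family>[A-Z]{2})(?P<material>[A-Z])(?P<seq>\d{4})-(?P<size>\w+)$")

FAMILIES = {"BR": "Bracket", "BO": "Bolt"}
MATERIALS = {"S": "Steel", "A": "Aluminium"}

def split_smart_id(part_number: str) -> dict:
    """Map a legacy smart identifier to separate, searchable part properties."""
    m = SMART_ID.match(part_number)
    if m is None:
        return {"part_number": part_number}  # no extra properties derivable
    return {
        "part_number": part_number,  # keep the original ID for traceability
        "family": FAMILIES.get(m["family"], m["family"]),
        "material": MATERIALS.get(m["material"], m["material"]),
        "size": m["size"],
    }

print(split_smart_id("BRS0042-M8"))
# {'part_number': 'BRS0042-M8', 'family': 'Bracket', 'material': 'Steel', 'size': 'M8'}
```

New parts then get a meaningless identifier, while the properties carry the meaning – searchable and changeable without renumbering.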

 

 

Conclusion

As every company is on its own PLM journey and technology is evolving, there will always be a migration discussion. Understanding and working towards the future should be the most critical driver for migration. Migrations in the PLM domain are often more than a data migration – new ways of working should be introduced in parallel. And for that reason, the “big bang” is often too costly and demotivating for the future.

 

 

I attended the PDSVISION forum for the first time, a two-day PLM event in Gothenburg organized by PTC’s largest implementer in the Nordics, also active in North America, the UK, and Germany.

The theme of the conference: Master your Digital Thread – a hot topic, as it has been discussed in various events, like the recent PLM Roadmap/PDT Europe conference in November 2023.

The event drew over 200 attendees, showing the commitment of participants, primarily from the Nordics, to knowledge sharing and learning.

The diverse representation included industry leaders like Vestas, pioneers in Sustainable Energy, and innovative startups like CorPower Ocean, who are dedicated to making wave energy reliable and competitive. Notably, the common thread among these diverse participants was their focus on sustainability, a growing theme in PLM conferences and an essential item on every board’s strategic agenda.

I enjoyed the structure and agenda of the conference. The first day was filled with lectures and inspiring keynotes. The second day was a day of interactive workshops divided into four tracks, which were of decent length so we could really dive into the topics. As you can imagine, I followed the sustainability track.

Here are some of my highlights of this conference.

 

Catching the Wind: A Digital Thread From Design to Service

Simon Saandvig Storbjerg, who unfortunately had to present remotely, gave an overview of the PLM-related challenges that Vestas is addressing. Vestas, the undisputed market leader in wind energy, is indirectly responsible for 231 million tonnes of CO2 per year.

One of the challenges of wind power energy is the growing complexity and need for variants. With continuous innovation and the increasing size of wind turbines, it is challenging to achieve the economic benefits of scale.

As an example, Simon shared data related to the Lost Production Factor, which was around 5% in 2009, was reduced to 2% in 2017, and is now growing again. This trend is valid not only for Vestas but for all wind turbine manufacturers, as variability is increasing.

Vestas is introducing modularity to address these challenges. I reported last year about their modularity journey related to the North European Modularity biannual meeting held at Vestas in Ringkøbing – you can read the post here.

Simon also addressed the importance of Model-Based Definition (MBD), which is crucial if you want to achieve digital continuity between engineering and manufacturing. In this industry in particular, it is a challenge to involve the entire value chain in MBD, despite the fact that the benefits are proven and known. Changing people’s skills and processes remains a challenge.

 

The Future of Product Design and Development

The session led by PTC’s Mark Lobo, General Manager for the PLM Segment, and Brian Thompson, General Manager of the CAD Segment, brought clarity to the audience on the joint roadmap of Windchill and Creo.

Mark and Brian highlighted the benefits of a Model-Based Enterprise and Model-Based Definition, which are musts if you want to be more efficient in your company and value chain.

The WHY is known – see the benefits described in the image – and it requires new ways of working, something organizations need to implement anyway when aiming to realize a digital thread or digital twin.

In addition, Mark addressed PTC’s focus on Design for Sustainability and their partner network. In relation to materials science, the partnership with Ansys Granta MI is essential. It was presented later by Ansys and discussed on day two during one of the sustainability workshops.

Mark and Brian elaborated on the PTC SaaS journey – the future Atlas platform and the current status of Windchill+ and Creo+ – addressing a smooth transition for existing customers to a new future architecture.

And, of course, there was the topic of Artificial Intelligence.

Mark explained that PTC is exploring AI in various areas of the product lifecycle: validating requirements, optimizing CAD models, and streamlining change processes on the design side, but also downstream activities like quality and maintenance predictions, improved operations, and streamlined field services and service parts – all part of the PTC Copilot strategy.

PLM combined with AI is certainly a topic where the applicability and benefits for improved decision-making can be high.

 

PLM Data Merge in the PTC Cloud: The Why & The How

Mikael Gustafson from Xylem, a leading Global Water Solutions provider, described their recently completed project: merging their on-premise Windchill instance TAPIR and their cloud Windchill XGV into a single environment.

TAPIR stands for Technical Administration, Part Information Repository; it is very much part-centric and used in one organization. XGV stands for Xylem Global Vault, and it is used in 28 organizations with more of a focus on CAD data (Creo and AutoCAD). Two different silos were to be joined in one instance to build a modern, connected, data-driven future or, as Mikael phrased it: “A step towards a more manageable Virtual Product”.

It was a demanding project involving a lot of resources and time, again showing the challenges of migrations. I am planning to publish a blog post with the draft title “Migration Migraine,” as this type of migration is prevalent in many places because companies want to implement a single PLM backbone beyond (mechanical) engineering.

What I liked about the approach was its focus on assessing the risks and prioritizing a mitigation strategy where necessary. As the list below shows, even the COVID-19 pandemic challenged the project.

Often, big migration projects fail due to optimism – assessing only some of the risks at the start and then giving it a go.

When failures happen, there is often the blame game: was it the software, the implementer, or the customer (past or present) that caused the trouble? Mediating in such environments has long been my mission as the “Flying Dutchman,” and from my experience, it is not about the blame game; most of the time, it is about too-high expectations and not enough time or resources to fully control the journey.

As Mikael said, Xylem was successful, and during the go-live, only a few non-critical issues popped up.

When asked what he would do differently with hindsight, Mikael mentioned he would run the migration not as one big project but as a series of smaller projects.

I can relate a lot to this answer as, in my experience, “one-time” migration projects have created a lot of stress for companies, and only a few of them were successful.

 

Starting coordinated and then becoming connected

Several sessions were held where companies shared their PLM journey, to be mapped along the maturity slide (slide 8) I shared in my session: The Why, What and How of Digital Transformation in the PLM domain. You can review the content here on SlideShare.

There was Evolabel, a company starting its PLM journey because it suffers from ineffective work procedures, information islands and the increasing complexity of its products.

Evolabel realized it needed PLM to realize its market ambition: to be a market leader within five years. For Evolabel, PLM is a must to make work repeatable and integrated internally.

They shared how they first defined the required understanding and mindset for the needed capabilities before implementing them. In my terminology, they started to implement a coordinated PLM approach.

Teddy Svenson from JBT, a well-known manufacturer of food-tech solutions, described their next step in PLM: from an old AS/400 system with very little integration to PDM, to a complete PLM system with parts, configurations, and change management.

It is not an easy task but a vital stepping stone for future development and a complete digital thread, from sales to customer care. In my terminology, they were upgrading their technology to improve their coordinated approach to be ready for the next digital evolution.

There were several other presentations on Day One – see the agenda here. I cannot cover them all given the limited size of this blog post.

 

The Workshops

As I followed the Sustainability track, I cannot comment much on the other tracks; however, judging by the presenters and the topics, they all appeared to be very pragmatic and interactive, given the format.

Achieving sustainability goals by integrating material intelligence into the design process

In the sustainability track, we started with Manuelle Clavel from Ansys Granta, who explained in detail how material data and its management are crucial for designing better-performing, more sustainable, and compliant products.

With the importance of compliance with (upcoming) regulations and the usage of material characteristics in the context of more sustainable products and being able to perform a Life Cycle Assessment, it is crucial to have material information digitally available, both in the CAD design environment as well as in the PLM environment.

For me, a dataset of material properties is an excellent example of how data should be used in a connected enterprise. You do not want to copy the information from system to system; it needs to be connected and available in real time.

How can we design more sustainable products?

Together with Martin Lundqvist from QCM, I conducted an interactive session. We started with the need for digitalization, then looked at RoHS and REACH compliance and discussed the upcoming requirements of the Digital Product Passport.

We closed the session with a dialogue on the circular economy.

From the audience, we learned that many companies are still early in understanding the implementation of sustainability requirements and new processes. However, some were already quite advanced and acting. In particular, it is essential to know if your company is involved with batteries (DPP #1) or is close to consumers.

 

Conclusion

The PDSFORUM was an interesting experience for me, meeting companies at all the different stages of their PLM journey. All sessions I attended were realistic, and the solutions were often pragmatic. In my day-to-day life, inspiring companies to understand a digital and sustainable future, I sometimes forget the journey everyone is going through.

Thanks, PDSVISION, for inviting me to speak and learn at this conference.

and some sad news …..

I was sorry to learn that Dr. Ken Versprille suddenly passed away last week. I knew Ken, as shown in the picture, as a passionate moderator and timekeeper of the PLM Roadmap / PDT conferences, always well prepared on the details. May his spirit live on through future conferences – the next one already on May 8-9 in Washington, DC.


Our recent interviews this year, with aPriori and SAP, were with companies that focus less on the traditional product design process and more on the (circular) manufacturing process. In these interviews, we discussed the importance of working with connected data in a shared (digital) thread.

This time, we, Mark Reisig and Jos Voskuil, were excited to talk with Siemens, not only a well-known PLM vendor but also a manufacturer of products, therefore with a close understanding of what is needed and what can be achieved with their software solutions.

Siemens

As Siemens is such a broad enterprise, we were happy to speak with Ryan R. Rochelle, who focuses on Sustainable Production, Sustainable Manufacturing and Sustainable Industry within Siemens. In the interview, we discussed the importance of digital twins and the feedback loops between design and manufacturing. Despite some flaws in the network connection, we are happy to share an informative interview.

Enjoy listening and watching the next 33 minutes, talking with Ryan Rochelle.

You can download the images shown during the interview HERE

 

What I have learned

  • Like all PLM vendors in this domain, Siemens talks about the importance of a circular economy and the need for digital threads and digital twins, confirming the need for all of us to invest in the digitization of the product lifecycle.
  • Siemens is in a unique position as both the industrial user and the software provider of its PLM suite, therefore having a unique feedback loop on the usability and applicability of its software in its industry.
  • In the area of sustainability, they learn from both external and internal customers. They are customer zero. Here, they observe a “shift of engineering activities to the left” to optimize processes, the supply chain and manufacturing earlier. (PGGA: this aligns with our aPriori and Makersite interviews.)
  • Siemens’ SiGreen solution is an example of this unique position, being able to track the carbon footprint of products across the supply chain.

Want to learn more?


Conclusion

We have been discussing the relationship between PLM and sustainability with relevant software vendors for over two years now. As we saw initially in 2022, a few companies were exploring the possibilities.

Now, with further regulations and advanced software capabilities, companies are starting to implement new capabilities to make their product development process and products more sustainable. Siemens, as a software provider and an industrial user of its tools, is leading this journey—is it time for your company to step up, too?

 

Our first PGGA interview with PLM-related software vendors was two years ago, with SAP. At that time, sustainability was becoming more visible in corporate strategies, and regulations were imminent.

This time, Klaus Brettschneider and I wanted to learn what has happened since then related to sustainability. Is there visible progress in their organization and customer base? And what is hot now?

And we were positively surprised by a conversation going in many directions.

SAP

The interview was again with Darren West. Darren is the product expert for SAP’s Circular Economy solutions, and this time, Stephan Fester supported him. Stephan is co-leading the SAP Global Circular Manufacturing Practice and is therefore well-connected to the field. In the past year in particular, he has been working in discrete manufacturing and discussing circular manufacturing.

Thanks to the expertise of our guests, the discussion went in various directions, with circularity as the central theme.

We discussed the progress of the Responsible Design & Production module that was launched two years ago. We discussed the Green Ledger and Carbon Accounting, of course, in the context of circular manufacturing.

But we also discussed the Digital Product Passport and Catena-X: what is it, and what is it targeting?

And we discussed how to deal with the scarcity of materials and materials harvesting. The interview could not be complete without mentioning AI.

Enjoy the 35-minute interview with Darren and Stephan on our YouTube channel.

The slides shown in this recording can be found here: PGGA talking again with SAP.

 

What we have learned

  • Regulations heavily push SAP customers and require adequate reporting tools, not only for finance and material use but also for sustainability KPIs.
  • The Responsible Design & Production module launched two years ago is already in use with 60+ customers, showing the importance of having data-driven decision support for plastic packaging – to be extended to the product. Of course, as a PLM community, we are interested in understanding the next steps toward the product.
  • The insights from Stephan Fester show how circular manufacturing can be a logical evolution of the linear product process, as Stephan’s image illustrates.
  • Great insights on Catena-X as an independent network for global data sharing.

 

Want to learn more?


 

Conclusion

It was a great discussion with a company that is quite active in supporting its customers on a sustainable journey. The journey is complex and has many aspects, as Darren and Stephan shared in this dialogue. The good news is that SAP’s customers are actively implementing measures and processes – going circular is happening!

 

Join the PDSFORUM next month and join me to get inspired and participate in a Think Tank session on day 2 related to designing more sustainable products. Will we meet there?

 

Last week, I participated in the annual 3DEXPERIENCE User Conference, organized by the ENOVIA and NETVIBES brands. With approximately 250 attendees, the 2-day conference on the High-Tech Campus in Eindhoven was fully booked.

My PDM/PLM career started in 1990 in Eindhoven.

First, I spent a significant part of my school life there, and later, I became a physics teacher in Eindhoven. Then, I got infected by CAD and data management, discovering SmarTeam, and the rest is history.

As I wrote in last year’s post, the 3DEXPERIENCE conference always feels like a reunion, as I have worked most of my time in the SmarTeam, ENOVIA, and 3DEXPERIENCE ecosystem.

 

Innovation Drivers in the Generative Economy

Stephane Declee and Morgan Zimmerman kicked off the conference with their keynote, talking about the business theme for 2024: the Generative Economy. Where the initial focus was on the Experience Economy and emotion, the Generative Economy includes Sustainability. It is a clever move as the word Sustainability, like Digital Transformation, has become such a generic term. The Generative Economy clearly explains that the aim is to be sustainable for the planet.

Stephane and Morgan talked about the importance of the virtual twin, which is different from digital twins. A virtual twin typically refers to a broader concept that encompasses not only the physical characteristics and behavior of an object or system but also its environment, interactions, and context within a virtual or simulated world. Virtual Twins are crucial to developing sustainable solutions.

Morgan concluded the session by describing the characteristics of the data-driven 3DEXPERIENCE platform and its AI fundamentals, illustrating all the facets of the mix of a System of Record (traditional PLM) and Systems of Engagement (MODSIM).

 

3DEXPERIENCE for All at automation.eXpress

Daniel Schöpf, CEO and founder of automation.eXpress GmbH, told a passionate story about why, for his business, the 3DEXPERIENCE platform is the only environment for product development, collaboration and sales.

Automation.eXpress is a young but typical Engineering To Order company building special machinery and services in dedicated projects, which means that every project, from sales to delivery, requires a lot of communication.

For that reason, Daniel insisted that all employees communicate using the 3DEXPERIENCE platform on the cloud. So, there are no separate emails, chats, or other siloed systems.

Everyone should work connected to the project and the product as they need to deliver projects as efficiently and fast as possible.

Daniel made this decision based on his 20 years of experience with traditional ways of working—the coordinated approach. Now, starting from scratch in a new company without a legacy, Daniel chose the connected approach, an ideal fit for his organization, using the cloud solution for its scalability, an essential criterion for a startup company.

My conclusion: this example shows the unique situation of an inspired leader with 20 years of experience in this business who does not choose the ways of working from the past but starts a new company in the same industry based on a modern platform approach instead of individual traditional tools.

 

 

Augment Me Through Innovative Technology

Dr. Cara Antoine gave an inspiring keynote based on her own life experience and lessons learned from working in various industries, a major oil & gas company and major high-tech hardware and software brands. Currently, she is an EVP and the Chief Technology, Innovation & Portfolio Officer at Capgemini.

She explained how a life-threatening infection that caused blindness in one of her eyes inspired her to find ways to augment herself to keep on functioning.

With that, she drew a parallel with humanity, which has continuously been augmenting itself from prehistoric times until now, at an ever-increasing speed of change.

The current augmentation is the digital revolution. Digital technology is coming, and you need to be prepared to survive – it is Innovate or Abdicate.

Dr. Cara continued by expressing the need to invest in innovation (me: it was not better in the past 😉) – and, of course, with an economic purpose; however, it should go hand in hand with social progress (gender diversity) and creating a sustainable planet (innovation is needed here).

Besides the focus on innovation drivers, Dr. Cara always connected her message to personal interaction. Her recently published book Make it Personal describes the importance of personal interaction, even if the topics can be very technical or complex.

I read the book with great pleasure, and it was one of the cornerstones of the panel discussion next.

 

It is all about people…

It might be strange to have a session like this in an ENOVIA/NETVIBES User Conference; however, it is another illustration that we are not just talking about technology and tools.

I was happy to introduce and moderate this panel discussion, also using the iconic Share PLM image, which is close to my heart.

The panelists, Dr. Cara Antoine, Daniel Schöpf, and Florens Wolters, have each actively led transformational initiatives within their companies.

We discussed questions related to culture, personal leadership and involvement, and concluded with many insights, including: create chemistry, identify a passion, empower diversity, and make a connection – it could make or break your relationship.

 

And it is about processes.

Another trend I discovered is that cloud-based business platforms, like the 3DEXPERIENCE platform, switch the focus from discussing functions and features in tools to establishing platform-based environments, where the focus is more on data-driven and connected processes.

Some examples:

Data Driven Quality at Suzlon Energy Ltd.

Florens Wolters, who also participated in the panel discussion “It is all about people …”, explained how he took the lead in reimagining the Suzlon Energy Quality Management System using the 3DEXPERIENCE platform and ENOVIA: from a disconnected, fragmented, document-driven Quality Management System with many findings in 2020 to a fully integrated, data-driven management system with zero findings in 2023.

It is an illustration that a modern data-driven approach in a connected environment brings higher value to the organization, as all stakeholders in the addressed solution work within an integrated, real-time environment. No time is wasted searching for related information.

Of course, organizational change management is needed to convince people not to work in their favorite siloed systems, which might be dedicated to the job but are not designed for a connected future.

The image to the left was also part of the “It is all about people” session.

 

Enterprise Virtual Twin at Renault Group

Renault’s presentation was also an exciting surprise. Last year, they shared the scope of the Renaulution project at the conference (see also my post: The week after the 3DEXPERIENCE conference 2023).

Here, Renault mentioned that they would start using the 3DEXPERIENCE platform as an enterprise business platform instead of a traditional engineering tool.

Their presentation today, related to their Engineering Virtual Twin, was an example of that. Instead of using their document-based SCR (Système de Conception Renault – the Renault Design System), with over 1000 documents describing processes connected to over a hundred KPIs, they have been modeling their whole business architecture and processes in UAF using a Systems of Systems approach.

The image above shows Franck Gana, Renault’s engineering transformation chief officer, explaining the approach. We could write an entire article about the details of how, again, the 3DEXPERIENCE platform can be used to provide a real-time virtual twin of the actual business processes, ensuring everyone is working on the same referential.

 

Bringing Business Collaboration to the Next Level with Business Experiences

To conclude this section about the shifting focus toward people and processes instead of system features, Alizée Meissonnier Aubin and Antoine Gravot introduced a new offering from 3DS, the marketplace for Business Experiences.


According to an HBR article, workers switch between applications an average of 1200 times per day, spending 9% of their time reorienting themselves after toggling.
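To put that 9% in perspective: on an 8-hour working day, 0.09 × 480 minutes is roughly 43 minutes per day – over three and a half hours per week – spent just reorienting.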

1200 is a high number and a plea for working in a collaboration platform instead of siloed systems (the Systems of Engagement, in my terminology – data-driven, real-time connected). The story has been told before by Daniel Schöpf, Florens Wolters and Franck Gana, who shared the benefits of working in a connected collaboration environment.

The announced marketplace will be a place where customers can download Business Experiences.

There was more …

There were several engaging presentations and workshops during the conference. But, as we reach 1500 words, I will mention just two of them, which I hope to come back to in a later post with more detail.

  • Delivering Sustainable & Eco Design with the 3DS LCA Solution

    Valentin Tofana from Comau, an Italian multinational company in automation committed to more sustainable products, shared his experiences and lessons learned from starting to use the 3DS LifeCycle Assessment tools on the 3DEXPERIENCE platform.
    This session gave such a clear overview that we will come back with the PLM Global Green Alliance in a separate interview.
  • Beyond PLM. Productivity is the Key to Sustainable Business
    Neerav Mehta from L&T Energy Hydrocarbon demonstrated how they have implemented a virtual twin of the plant, allowing everyone to navigate, collaborate and explore all activities related to the plant. I was promoting this concept for Oil & Gas EPC companies back in 2013 (PLM for all industries) – at that time, an immense performance and integration challenge. Now, ten years later, thanks to the capabilities of the 3DEXPERIENCE platform, it has become a workable reality. Impressive.

 

Conclusion

Again, I learned a lot during these days, seeing the architecture of the 3DEXPERIENCE platform growing (image below). In addition, more and more companies are shifting their focus to real-time collaboration processes in the cloud on a connected platform. Their testimonies illustrate that to be sustainable in business, you have to augment yourself with digital.

Note: Dassault Systèmes did not cover any of the costs for me to attend this conference. I picked the topics close to my heart and was encouraged by all the conversations I had.

 

This post shares our second interview this year in the PLM Global Green Alliance series, where we talk with PLM-related software vendors about their activities related to sustainability. Last year, we spoke mainly with the more traditional PLM vendors, but this year, we started with Makersite, a company specializing in Product Lifecycle Intelligence supporting sustainability analysis.

And now we were happy to talk with Mark Rushton, Senior Product Marketing Manager, and Ryan Flavelle, Associate Product Owner, both at aPriori Technologies. For my PGGA partner Mark Reisig and me, it was an interesting discussion in a domain where, this time, the focus was not on product design.

 

aPriori

aPriori, according to their website, focuses on Digital Manufacturing, digitizing the entire manufacturing process from design to production, and is therefore able to assess environmental impact in a reliable manner.

It was an informative dialogue. Watch the 35-minute discussion here and learn how aPriori uniquely digitizes the manufacturing processes to support Sustainability.

Slides shown during the interview combined with additional company information can be found HERE.

 

What we have learned

  • aPriori’s customers have pushed the company to provide faster, digital sustainability insights into their manufacturing processes, illustrating that companies are really acting to understand their environmental impact. To measure is to know.
  • In this interview, we saw the concepts of the digital twin of manufacturing processes and the digital twin of a plant.
  • aPriori uniquely starts their impact analysis from the 3D CAD geometry, which is more accurate than the BOM-based assessment most LCA tools perform.

Want to learn more?

Here are some links to the topics discussed in our meeting:

 

Conclusions

When it comes to sustainability in action, you need to be able to measure and understand your environmental impact. Where traditional PLM activities focus on the design phase, there is also a lot to learn during the manufacturing phase. aPriori is doing this in a unique manner, not just based on BOM analysis. In addition, companies like aPriori already have longer-term experience with the virtual twin for manufacturing, originally used for cost and manufacturability analysis. Now it is extended to sustainability, and their customers are working with it.

Next week more about the 3DEXPERIENCE conference – did I see you there?


We are happy to start the year with the next round of the PLM Global Green Alliance (PGGA) series: PLM and Sustainability. This year, we will speak with some new companies, and we will also revisit some of our previous guests to learn about their progress.

While we have already talked with Aras, Autodesk, CIMdata, Dassault Systèmes, PTC, SAP, Sustaira and Transition Technologies PSC, there are still a lot of software companies with an exciting portfolio related to sustainability.

Therefore, we are happy to talk this time with Makersite, a company whose AI-powered Product Lifecycle Intelligence software, according to their home page, brings together your cost, environment, compliance, and risk data in one place to make smarter, greener decisions powered by the deepest understanding of your supply chain. Let’s explore.

Makersite

We were lucky to have a stimulating discussion with Neil D’Souza, Makersite’s CEO and founder, who has been active in the field of sustainability for almost twenty years, even before it became a cool (or disputed) profession.

It was an exciting dialogue where we enjoyed realistic answers without all the buzzwords and marketing terms often used in the new domain of sustainability. Enjoy the 39 minutes of interaction below:

 

Slides shown during the interview combined with additional company information can be found HERE.

 

What we have learned

  • Makersite’s mission, to enable manufacturers to make better products, faster, initially applied to economic parameters, can easily be extended with sustainability parameters. The power of Makersite is that it connects to enterprise systems and sources using AI, Machine Learning and algorithms to support reporting views on compliance, sustainability, costs and risk.
  • Compliance and sustainability are the areas where I see a significant need for companies to invest. It is not a revolutionary business change but an extension of scope. We discussed this in the context of the stage-gate process, where sustainability parameters should be added at each gate.
  • Neil has an exciting podcast, Five Lifes to Fifty, where he discusses the path to sustainable products with co-hosts Shelley Metcalfe and Jim Fava; recently, they discussed sustainability in the context of the stage-gate process.
  • Again, to move forward with sustainability, it is about creating the base and caring about the data internally to understand what’s happening, and from there, enabling value engineering, including your suppliers where possible (IP protection remains a topic) – confirming that digital transformation (the connected way of working) is needed for business and sustainability.

 

Want to learn more?

Here are some links to the topics discussed in our meeting:

Conclusions

With Makersite, we discovered an experienced company that used its experience in cost, compliance and risk analysis, including supply chains, to extend it to the domain of sustainability. As their technology partners page shows, they can be complementary in many industries and enterprises.

We will see another complementary solution soon in our following interview. Stay tuned.


One year ago, I wrote the post: Time to Split PLM, which reflected a noticeable trend in 2022 and 2023.

If you still pursue the Single Source of Truth or think that PLM should be supported by a single system, the best you can buy, then you are living in the past.

It is now time to move from a Monolithic PLM Landscape to a Federated Domain and Data Mesh (Yousef Hooshmand) or the Heliple Federated PLM project (Erik Herzog) – you may have read about these concepts.

When moving from a traditional coordinated, document-driven business towards a modern, connected, and data-driven business, there is a paradigm shift. In many situations, we still need the document-driven approach to manage baselines for governance and traceability, where we create the required truth for manufacturing, compliance, and configuration management.

However, we also need an infrastructure for multidisciplinary collaboration nowadays. Working with systems, a combination of hardware and software, requires a model-based approach and multidisciplinary collaboration. This infrastructure needs to be data-driven to remain competitive despite more complexity, connecting stakeholders along value streams.

Traditional PLM vendors still push all functionality into one system, often leading to frustration among the end-users, complaining about lack of usability, bureaucracy, and the challenge of connecting to external stakeholders, like customers, suppliers, design or service partners.

 

Systems of Engagement

It is in modern PLM infrastructures that I started positioning the Systems of Record (the traditional enterprise silos – PDM/PLM, ERP, CM) and the Systems of Engagement (modern environments designed for close to real-time collaboration between stakeholders within a domain/value stream). In the Heliple project (image below), the Systems of Record are the vertical bars, and the Systems of Engagement are the horizontal bars.

The Heliple PLM Approach

The power of a System of Engagement is the data-driven connection between stakeholders, even when they work in different enterprises. Last year, I had discussions with Andre Wegner from Authentise, MJ Smith from CoLab, and Oleg Shilovitsky from OpenBOM.

They all focus on modern, data-driven, multidisciplinary collaboration. You can find the discussion here: The new side of PLM? Systems of Engagement!

Where is the money?

Business leaders usually are not interested in a technology or architecture discussion – too many details and too much complexity; they look for the business case. Look at this recent post and comments on LinkedIn – “When you try to explain PLM to your C-suite, and they just don’t get it.”

It is hard to build evidence for the need for systems of engagement, as the concepts are relatively new and experiences from the field are bubbling up slowly. With the Heliple team, we are now working on building the business case for Federated PLM in the context of the Heliple project scope.

Therefore, I was excited to read the results of this survey: Quantifying the impact of design review methods on NPD, a survey among 250 global engineering leaders initiated by CoLab.

CoLab is one of those companies that focus on providing a System of Engagement, and their scope is design collaboration. In this post, I am discussing the findings of this survey with Taylor Young, Chief Strategy Officer of CoLab.

CoLab – the company /the mission

Taylor, thanks for helping me explain the complementary value of CoLab based on some of the key findings from the survey. But first of all, can you briefly introduce CoLab as a company and the unique value you are offering to your clients?

Hi Jos, CoLab is a Design Engagement System – we exist to help engineering teams make design decisions.

Product decision-making has never been more challenging – or more essential – to get right – that’s why we built CoLab. In today’s world of product complexity, excellent decision-making requires specialized expertise. That means decision-making is no longer just about people engaging with product data – it’s about people engaging with other people.

PLM provides a strong foundation where product data is controlled (and therefore reliable). But PLM has a rigid architecture that’s optimized for data (and for human-to-data interaction). To deal with increased complexity in product design, engineers now need a system that’s built for human-to-human interaction, complementary to PLM.

CoLab allows you to interrogate a rich dataset, even with an extended team outside your company borders, in real time or asynchronously. With CoLab, decisions are made with context, with input from the right people, and as early as possible in the process. Reviews and decision-making get tracked automatically and can be synced back to PLM. Engineers can do what they do best, and CoLab will support them by documenting everything in the background.

Design Review Quality

It is known that late-stage design errors are very costly, both impacting product launches and profitability. The report shows design review quality has been rated as the #1 most important factor affecting an engineering team’s ability to deliver projects on time.

Is it a severe problem for companies, and what are they trying to do to improve the quality of design reviews? Can you quantify the problem?

Our survey report demonstrated that late-stage design errors delay product launches for 90% of companies. The impact varies significantly from organization to organization, but we know that for large manufacturing companies, just one late-stage design error can be a six to seven-figure problem.

There are a few factors that lead to a “quality” design review – some of the most important ones we see leading manufacturing companies doing differently are:

  • Who they include – the highest performing teams include manufacturing, suppliers, and customers within the proper context.
  • When they happen – the highest performing teams focus on getting CAD to these stakeholders early in the process (during detailed design) and paralleling these processes (i.e., they don’t wait for one group to sign off before getting early info to the next)
  • Rethinking the Design Review meeting – the highest performing teams aren’t having issue-generation meetings – they have problem-solving meetings. Feedback is collected from a broad audience upfront, and meetings are used to solve nuanced problems – not to get familiar with the project for the first time.

Multidisciplinary collaboration

An interesting observation is that providing feedback to engineers mainly comes from peers or within the company. Having suppliers or customers involved seems to be very difficult. Why do you think this is happening, and how can we improve their contribution?

When we talk to manufacturing companies about “collaboration” or how engineers engage with other engineers – however good or bad the processes are internally, it almost always is significantly more challenging/less effective when they go external. External teams often use different CAD systems, work in different time zones, speak other first languages, and have varying levels of access to core engineering information.

However, as we can read from the HBR article What the Most Productive Companies Do Differently, we know that the most productive manufacturing companies “collaborate with suppliers and customers to form new ecosystems that benefit from agglomeration effects and create shared pools of value”.

They leverage the expertise of their suppliers and customers to do things more effectively. But manufacturing companies struggle to create engaging, high-value, external collaboration and ‘co-design’ without the tools purpose-built for person-to-person external communication.

Traceability and PLM

One of the findings is that keeping track of the feedback and design issues is failing in companies. One of my recommendations from the past was to integrate Issue management into your PLM systems – why is this not working?

We believe that the task of completing a design review and the task of documenting the output of that review should not be two separate efforts. Suppose we are to reduce the amount of time engineers spend on admin work and decrease the number of issues that are never tracked or documented (43%, according to our survey).

In that case, we need to introduce a purpose-built, engaging design review system that is self-documenting. It is crucial for the quality of design reviews that issues aren’t tracked in a separate system from where they are raised/developed, but that instead, they are automatically tracked just by doing the work.

Learn More?

Below is the recording of a recent webinar, where Taylor said that your PLM alone isn’t enough: Why you need a Design Engagement System during product development.

  • A traditional PLM system is the system of record for product data – from ideation through sustaining engineering.
  • However, one set of critical data never makes it to the PLM system. For many manufacturing companies today, design feedback and decisions live almost exclusively in emails, spreadsheets, and PowerPoint decks. At the same time, 90% of engineering teams state that product launches are delayed due to late-stage changes.
  • Engineering teams need to implement a true Design Engagement System (DES) for more effective product development and a more holistic digital thread.

Conclusion

Traditional PLM systems have always been used to provide quality and data governance along the product lifecycle. However, most end users dislike the PLM system as it is a bureaucratic overhead to their ways of working. CoLab, with its DES solution, provides a System of Engagement focusing on design reviews, speed, and quality of multidisciplinary collaboration complementary to the PLM System of Record – a modern example of how digitization is transforming the PLM domain.

Next upcoming event – will we meet there?

Another year passed, and as usual, I took the time to look back. I always feel that things are going so much slower than expected. But that’s reality – there is always friction, and in particular, in the PLM domain, there is so much legacy we cannot leave behind.

It is better to plan what we can do in 2024 to be prepared for the next steps or, if lucky, even have the next steps in progress.

In this post, I will discuss four significant areas of attention (AI – DATA – PEOPLE – SUSTAINABILITY) in alphabetical order, not prioritized.

Here are some initial thoughts. In the upcoming weeks I will elaborate further on them and look forward to your input.

 

AI (Artificial Intelligence)

Where would I be without talking about AI?

When you look at the image below, the Gartner Hype Cycle for AI in 2023, you see the potential coming on the left, with Generative AI at the peak.

Part of the hype comes from the availability of generative AI tools in the public domain, allowing everyone to play with them or use them. Some barriers are gone, but what does it mean? Many AI tools can make our lives easier, and there is for sure no threat if our job does not depend on standard practices.

 

AI and People

When I was teaching physics in high school, it was during the introduction of the pocket calculator, which replaced the slide rule. You needed to be skilled to use the slide rule; now there was a device that gave immediate answers. Was this bad for the pupils?

If you do not know the slide rule: it is an example of new technology replacing old tools, providing more time for other activities. Click on the image or read more about the slide rule here on Wiki.

Or would you, today, ask ChatGPT about the slide rule? Does generative AI mean the end of Wikipedia? Or does generative AI need the common knowledge of sites like Wikipedia?

AI can empower people in legacy environments when working with disconnected systems. AI will be a threat to people and companies that rely on people and processes to bring information together without adding value. These activities will disappear soon, and you must consider using this innovative approach.

During the recent holiday period, there was an interesting discussion about why companies are reluctant to change and implement better solution concepts. Initially launched by Alex Bruskin here on LinkedIn, the debate spilled over into the topic of TECHNICAL DEBT, well addressed here by Lionel Grealou.

Both articles and the related discussions in the comments are recommended reading to follow and learn from.

 

AI and Sustainability

Similar to the introduction of Bitcoin using blockchain technology, some people are warning about the vast energy consumption required for training and interaction with Large Language Models (LLM), as Sasha Luccioni explains in her interesting TED talk when addressing sustainability.

She proposes that tech companies should be more transparent on this topic; the size and the type of the LLM matter, as the indicative picture below illustrates.

Carbon Emissions of LLMs compared

In addition, I found an interesting article discussing the pros and cons of AI related to Sustainability. The image below from the article Risks and Benefits of Large Language Models for the Environment illustrates nicely that we must start discussing and balancing these topics.

To conclude the discussion of AI related to sustainability, I see a significant advantage in using generative AI for ESG reporting.

ESG reporting is currently a very fragmented activity for organizations, based on (marketing) people’s goodwill, and these reports are not always evidence-based.

 

Data

The transformation from a coordinated, document-driven enterprise towards a hybrid coordinated/connected enterprise using a data-driven approach became increasingly visible in 2023. I expect this transformation to grow faster in 2024 – the momentum is here.

We saw last year that the discussions related to Federated PLM nicely converged at the PLM Roadmap / PDT Europe conference in Paris. I shared most of the topics in this post: The week after PLM Roadmap / PDT Europe 2023. In addition, there is now the Heliple Federated PLM LinkedIn group with regular discussions planned.

In addition, if you read Jan Bosch’s reflection on 2023 here, he mentions (quote):

… 2023 was the year where many of the companies in the center became serious about the use of data. Whether it is historical analysis, high-frequency data collection during R&D, A/B testing or data pipelines, I notice a remarkable shift from a focus on software to a focus on data. The notion of data as a product, for now predominantly for internal use, is increasingly strong in the companies we work with

I am a big fan of Jan’s postings; coming from the software world, he describes the same issues that we have in the PLM world, except his domain does not carry as much hardware legacy and can therefore act faster than we do.

An interesting illustration of the slow pace to a data-driven environment is the revival of the PLM and ERP integration discussion. Prof. Jörg Fischer and Martin Eigner contributed to the broader debate of a modern enterprise infrastructure, not based on systems (PLM, ERP, MES, ….) but more on the flow of data through the lifecycle and an organization.

It is a great restart of the debate, showing we should care more about data semantics and the flow of information.

The articles The Future of PLM & ERP: Bridging the Gap. An Epic Battle of Opinions! and Is part master in PLM and ERP equal or not?, combined with the comments to these posts, are a must-read to follow this change towards a more connected flow of information.

While writing this post, Andreas Lindenthal expanded the discussion with his post: PLM and Configuration Management Best Practices: Part Traceability and Revisions. Again, thanks to data-driven approaches, there is expanding support for the entire product lifecycle. Product Lifecycle Management, Configuration Management and AIM (Asset Information Management) have come together.

PLM and CM overlap more and more, as I discussed some time ago with Martijn Dullaart, Maxime Gravel and Lisa Fenwick in The future of Configuration Management. This topic will be “hot” in 2024.

 

People

From the people’s perspective towards AI, DATA and SUSTAINABILITY, there is a noticeable divide between generations.  Of course, for the sake of the article, I am generalizing, assuming most people do not like to change their habits or want to reprogram themselves.

Unfortunately, we have to adapt our skills as our environment changes. Most of my generation was brought up with the single-source-of-truth idea, documented and supported by scientific papers.

In my terminology, information processing takes place in our heads by combining all the information we learned or collected through documents/books/newspapers – the coordinated approach.

For people living in this mindset, AI can become a significant threat, as their brain is no longer needed to make a judgment, and they are not used to differentiating between facts and fake news, as they were never trained to do so.

The same is valid for practices like the model-based approach, working data-centric, or considering sustainability. It is not in the DNA of the older generations and, therefore, hard to change.

The older generation is mostly part of an organization’s higher management, so we are returning to the technical debt discussion.

Later generations, who grew up as digital natives, are used to almost real-time interaction; when this is applied consistently in a digital enterprise, people will benefit from the information available to them in a rich context – in my terminology, the connected approach.

AI is a blessing for people living in this mindset as they do not need to use old-fashioned methods to acquire information.

“Let ChatGPT write my essay.”

However, their challenge could be what I would call “processing time”. Because data is available, it does not necessarily mean it is the correct information. For that reason, it remains important to spend time digesting the impact of the information you are reading – don’t click “Like” based on the title; read the full article and then decide.

Experience is what you get when you don’t get what you expect.

meaning you only become experienced if you learn from failures.

 

Sustainability

Unfortunately, sustainability is not only the last topic in alphabetical order; as the image below shows, discussions related to sustainability are also in slight decline at C-level at the moment.

I recognize this observation from my own engagements when discussing sustainability with companies.

The PLM software and services providers are all on a trajectory of providing tools and an infrastructure to support a transition to a more circular economy and better traceability of materials and carbon emissions.

In the PLM Global Green Alliance, we talked with Aras, Autodesk, Dassault Systèmes, PTC, SAP, Sustaira, TTPSC (Green PLM) and more to come in 2024. The solution offerings in the PLM domain are available to start with; now it is up to the people and processes.

For sure, AI tools will help companies get a better understanding of their sustainability efforts. As mentioned before, AI could help companies understand their environmental impact and build more accurate ESG reports.

Next, being DATA-driven will be crucial, as discussed during the latest PLM Roadmap/PDT Europe conference: The Need for a Governance Digital Thread.

And regarding PEOPLE, the good news is that younger generations want to take care of their future. They are in a position to choose the company to work for or influence companies by their consumer behavior. Unfortunately, climate disasters will remind us continuously in the upcoming decades that we are in a critical phase.

With the PLM Global Green Alliance, we strive to bring people together with a PLM mindset, sharing news and information on how to move forward to a sustainable future.

Mark Reisig (CIMdata – moderator for Sustainability & Energy) and Patrice Quencez (CIMPA – moderator for the Circular Economy) joined the PGGA last year and you will experience their inputs this year.

 

Conclusion

As you can see from this long post, there is so much to learn. The topics described are all current, and each requires education and experience (successes & failures) combined with an understanding of the technology concepts. Make sure you consider all of them, as focusing on a single topic will not make you move forward faster – they are all related. Please share your experiences this year—Happy New Year of Learning.

 

Two weeks ago, this post from Ilan Madjar drew my attention. He pointed to a demo movie, explaining how to support Smart Part Numbering on the 3DEXPERIENCE platform. You can watch the recording here.

I was surprised that Smart Part Numbering is still used, and if you read through the comments on the post, you see the various arguments that exist.

  • “Many mid-market customers are still using it”
    me: I think it is not only the mid-market – however, the argument is no reason to keep it alive.
  • “The problem remains in the customer’s desire (or need or capability) for change.”
    me: This is part of the path of least resistance.
  • “User resistance to change. Training and management sponsorship has proven to be not enough.”
    me: probably because discussions are feature-oriented, not starting from the business benefits.
  • “Cost and effort – rolling this change through downstream systems. The cost and effort of changing PN in PLM, ERP, MES, etc., are high. Trying to phase it out across systems is a recipe for disaster.”
    me: The hidden costs of maintaining Smart Numbers inside an organization are high and invisible, reducing the company’s competitiveness.
  • “Existing users often complain that it takes seconds to minutes more for unintelligent PN vs. using intelligent PN.”
    me: If we talk about a disconnected user without access to information, it could be true if the number of Smart Numbers to comprehend is low.

There were many other arguments for why you should not change. It reminded me of the image below:

Smart Numbers related to the Coordinated approach

Smart Part Numbers are a characteristic of best practices from the past. When people worked in different systems, information was moved from one system to another manually.

For example, re-entering the Bill of Materials from the PDM system into the ERP system, or attaching drawings to materials/parts in the ERP system. In the latter case, the filename often reflects the material or part number.

The problems with the coordinated, smart numbering approach are listed below; a short sketch after the list shows how brittle decoding such numbers can be:

  • New people in the organization need to learn the meaning of the numbering scheme. This learning process reduces the flexibility of an organization and increases the risk of making errors.
  • Typos go unnoticed when transferring numbers from one system to another and only get noticed late, when the cost of fixing the error might be 10 to 100-fold.
  • The argument that people will understand the meaning of a part is only partly valid. A person can make a good guess about the part based on the smart part number; however, the details can be different unless you work every day with the same, small range of parts.
  • Smart Numbers create a legacy. After Mergers and Acquisitions, there will be multiple part number schemes. Do you want to renumber old parts, meaning non-value-added, risky activities? Or do you want to continue with various numbering schemes, meaning people need to learn more than one numbering scheme – a higher entry barrier and risk of errors?
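To make this concrete, here is a minimal Python sketch of a decoder for a hypothetical smart numbering scheme – the scheme, the codes and the part numbers are all invented for illustration. It works for numbers created under the home-grown scheme and breaks the moment a number arrives from an acquired company with a different scheme.

```python
# A decoder for a hypothetical "smart" numbering scheme.
# The scheme, the codes and the part numbers are invented for illustration.

SMART_SCHEME = {
    "family": {"10": "fastener", "20": "bracket", "30": "housing"},
    "material": {"AL": "aluminium", "ST": "steel", "PL": "plastic"},
}

def decode_smart_number(pn: str) -> dict:
    """Decode e.g. '20-ST-00417-B' into family, material, sequence, revision."""
    family, material, seq, rev = pn.split("-")
    return {
        "family": SMART_SCHEME["family"][family],        # KeyError once the scheme evolves
        "material": SMART_SCHEME["material"][material],
        "sequence": int(seq),
        "revision": rev,
    }

print(decode_smart_number("20-ST-00417-B"))   # works for home-grown numbers

# A part inherited through a merger uses a different scheme and breaks the decoder:
# decode_smart_number("BRK-ALU-417/B")  -> raises ValueError (wrong number of fields)
```

An opaque, meaning-free identifier avoids this class of failure entirely: there is nothing to decode.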

There were and still are many advanced smart numbering systems.

In one of my first PDM implementations in the Netherlands, I learned about the 12NC code system from Philips – introduced at Philips in 1963 and used to identify complete products, documentation, and bare components, up to the finest detail. At this moment, many companies in the Philips family (suppliers or offspring) still use this numbering system, illustrating that it is not only the small & medium enterprises that are reluctant to change their numbering system.

The costs of working with Smart Part Numbers are often unnoticed as they are considered a given.

 

From Coordinated to Connected

Digital transformation in the PLM domain means moving from coordinated practices toward practices that benefit from connected technology. In many of my blog posts, you can read why organizations need to learn to work in a connected manner – both for their business sustainability and for being able to deal with regulations related to sustainability in the short term.

GHG reporting, ESG reporting, material compliance, and the DPP are all examples of the outside world pushing companies to work connected. Besides the regulations, if you are in a competitive business, you must be more efficient, innovative and faster than your competitors.

In a connected environment, relations between artifacts (datasets) are maintained in an IT infrastructure without requiring manual data transformations and people to process the data. In a connected enterprise, this non-value-added work will be reduced.
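A minimal sketch of this idea in Python – the dataset IDs, kinds and relation types below are invented for illustration, not taken from any specific PLM system. Relations are stored once, and any consumer navigates them instead of re-typing numbers between systems:

```python
# Datasets and their relations in a tiny in-memory "connected" store.
datasets = {
    "req-001":  {"kind": "requirement", "text": "Max weight 2 kg"},
    "part-417": {"kind": "part", "description": "Mounting bracket"},
    "mbom-009": {"kind": "mbom-line", "quantity": 4},
}

# Relations are stored once as (source, relation, target) triples.
relations = [
    ("part-417", "satisfies", "req-001"),
    ("mbom-009", "consumes", "part-417"),
]

def neighbours(dataset_id: str):
    """Navigate the thread from one dataset to every directly related dataset."""
    for src, rel, tgt in relations:
        if src == dataset_id:
            yield rel, datasets[tgt]
        if tgt == dataset_id:
            yield "is target of: " + rel, datasets[src]

# Manufacturing can trace a BOM line back to the requirement it serves,
# without anyone re-entering data between systems:
for rel, ds in neighbours("part-417"):
    print(rel, "->", ds["kind"])
```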

How to move away from Smart Numbering systems?

Several comments related to the Smart Numbering discussion mentioned that changing the numbering system is too costly and risky to implement and that no business case exists to support it. This statement only makes sense if you want your business to become obsolete slowly. Modern best practices based on digitization should be introduced as fast as possible, allowing companies to learn and adapt. There is no need for a big bang.

  • Start with mapping and prioritizing the value streams in your company. Where do we see the most significant business benefits related to cost of handling, speed, and quality?

Note: It is not necessary to start with engineering, even though they might be the creators of the data – start, for example, with the xBOM flow, where the xBOM can be a concept BOM, the engineering BOM, the manufacturing BOM, and more. Building this connected data flow is an investment for every department; do not start from the systems.

  • Next point: Do not rename or rework legacy data. These activities do not add value; they can only create problems. Instead, build new process definitions that do not depend on the smartness of the number.

Make sure these objects have, besides the part number, the right properties, the right status, and the right connections. In other words, create a connected digital thread – first internally in your company and next with your ecosystem (OEMs, suppliers, vendors).

  • Next point: Give newly created artifacts a guaranteed unique ID, independent of others. Each artifact has its own status, properties and context. In this step, it is time to break any 1:1 relation between a physical part and a CAD part or drawing. If a document gets revised, it gets a new version, but the version change should not always lead to a part number change. You can find many discussions on why to decouple parts and documents and the flexibility it provides (see the sketch after this list).
  • Next point: Newly generated IDs are not necessarily created in a single system. The idea of a single source of truth is outdated. Build your infrastructure upon existing standards where possible. For example, the UID of the Digital Product Passport will be based on the ISO/IEC 15459 standard, similar to the UID for retail products managed by the GS1 standard. Or, probably closer to home, look into your computer’s registry, and you will discover a lot of software components with a unique ID that specific programs or applications can use in a shared manner.
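To illustrate the two points above, here is a minimal sketch of meaning-free, decoupled identifiers in Python – the in-memory model and all names are illustrative assumptions, not a specific PLM system’s API. The part and the document each get their own unique ID; revising the document does not touch the part’s identity.

```python
import uuid
from dataclasses import dataclass, field
from typing import Optional

def new_id() -> str:
    # uuid4 yields a guaranteed-unique, meaning-free identifier,
    # independent of which system or site created the artifact.
    return str(uuid.uuid4())

@dataclass
class Part:
    id: str = field(default_factory=new_id)
    status: str = "in work"
    properties: dict = field(default_factory=dict)

@dataclass
class Document:
    id: str = field(default_factory=new_id)
    revision: str = "A"
    describes: Optional[str] = None   # a relation to a Part id, not a shared number

bracket = Part(properties={"description": "Mounting bracket", "material": "steel"})
drawing = Document(describes=bracket.id)

# Revising the drawing touches only the document, not the part's identity:
drawing.revision = "B"
assert drawing.describes == bracket.id   # the relation carries the meaning
```

The design choice here is that meaning lives in the properties, status and relations of an artifact, not in the identifier itself.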

When will it happen?

In January 2016, I wrote about “the impact of non-intelligent part numbers” and, surprisingly, almost eight years later we are still in the same situation.

I just read Oleg Shilovitsky’s post The Data Dilemma: Why Engineers and Manufacturing Companies Struggle to Find Time for Data Management where he mentions Legacy Systems and Processes, Overwhelming Workloads, Lack of (Data) Expertise, Short-Term Focus and Resource Constraints as inhibitors.

You probably all know the above cartoon. How can companies get out of this armor of habits? Will they be forced by the competition or by regulations? What do you think?

 

Conclusion

Despite proven business benefits and insights, it remains challenging for companies to move toward modern, data-driven practices where Smart Number generators are no longer needed. When talking one-on-one with individuals, they are convinced a change is necessary, yet they point to the “others”.

I wish you all a prosperous 2024 and the power to involve the “others”.
