
Two weeks ago, I shared my first post about PDM/PLM migration challenges on LinkedIn: How to avoid Migration Migraine – part 1. Most of the content discussed was about data migrations.

It started with moving data stored in relational databases to modern object-oriented environments – the technology upgrade. It also covered the challenges a company can face when merging different data silos (CAD & BOM related) into a single PLM backbone to extend the support of product data beyond engineering.

Luckily, the post generated a lot of reactions and feedback through LinkedIn and personal interactions last week.

The amount of interaction illustrated the relevance of the topic for people; they recognized the elephant in the room, too.

 

Working with a partner

Data migration and consolidation are typically not part of a company’s core business, so it is crucial to find the right partner for a migration project. The challenge with migrations is that there is potentially a lot to do technically, but only your staff can assess the quality and value of the migrated data.

Therefore, when planning a migration, make sure you work on it iteratively with an experienced partner who can provide a set of tools and best practices. Often, vendors or service partners have migration tools that still need to be tuned to your As-Is and To-Be environment.

To get an impression of what a PLM service partner can do and which topics or tools are relevant in the context of mid-market PLM, you can watch this xLM webinar on YouTube. So make sure you select a partner who is familiar with your PDM/PLM infrastructure and who has the experience to assess complexity.

 

Migration lessons learned

In my PLM coaching career, I have seen many migrations. In the early days, they were mostly related to technology upgrades, data consolidation and system replacements. Nowadays, the challenges are more related to becoming data-driven. Here are five lessons that I learned in the past twenty years:

  1. A fixed price for the migration can be a significant risk, as the quality of the data and the result are hard to assess upfront. In case of a fixed price, either you pay a premium (the partner prices in all the risk), or your service partner loses a lot of money. In a sustainable business model, there should be no losers.
  2. Start (even now) with checking and fixing your data quality. For example, when you are aware of a mismatch between CAD assemblies and BOM data, analyze and fix the discrepancies even before the migration (see the sketch after this list).
  3. One immediate action to take when moving from CAD assemblies to BOM structures is to check or fill the properties in the CAD system to support a smooth transition. Filling properties might be a temporary action, as later, when becoming more data-driven, some of these properties, e.g., material properties or manufacturer part numbers, should not be maintained in the CAD system anymore. However, they might help migration tools to extract a richer dataset.
  4. Focus on implementing an environment ready for the future. Don’t let the quality of your legacy data compromise the new environment. Instead, learn to live with legacy issues that will be fixed only when needed. A 100 % matching migration is not likely to happen anyway, because the source data might also be incorrect, even after further analysis.
  5. The product should probably not be configured in the CAD environment, even if the CAD tool allows it. I had this experience with SolidWorks in the past. PDM became the enemy because the users managed all configuration options in the assembly files, making them hard to use at the BOM or product level (the connected digital thread).
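
To make lesson 2 tangible, below is a minimal sketch of such a discrepancy check, assuming both the CAD assembly structure and the released BOM have been exported to CSV files with parent, child and quantity columns – a made-up export format; real exports and column names will differ per system:

```python
import csv
from collections import defaultdict

def load_structure(path):
    """Load a parent/child/quantity export into a nested dictionary.

    Assumes a CSV with 'parent', 'child' and 'qty' columns - an invented
    format; adapt to the actual export of your PDM or ERP system."""
    structure = defaultdict(dict)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            structure[row["parent"]][row["child"]] = float(row["qty"])
    return structure

def compare(cad, bom):
    """Print children missing on either side and quantity mismatches."""
    for parent in sorted(set(cad) | set(bom)):
        cad_children, bom_children = cad.get(parent, {}), bom.get(parent, {})
        for child in sorted(set(cad_children) | set(bom_children)):
            qty_cad, qty_bom = cad_children.get(child), bom_children.get(child)
            if qty_cad is None:
                print(f"{parent} -> {child}: only in BOM")
            elif qty_bom is None:
                print(f"{parent} -> {child}: only in CAD")
            elif qty_cad != qty_bom:
                print(f"{parent} -> {child}: qty CAD={qty_cad} BOM={qty_bom}")

# Placeholder file names - point these at your own exports.
compare(load_structure("cad_structure.csv"), load_structure("bom_structure.csv"))
```

Even such a crude comparison quickly shows where the two worlds diverge and where cleanup is needed before any migration starts.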

 

The future is data-driven

In addition, these migration discussions made me aware again that so many companies are still in the early phases of creating a unified PLM infrastructure and implementing the coordinated approach – an observation I shared in my report on the PDSFORUM 2024 conference.

Due to sustainability-related regulations and the need to understand product behavior in the field (Digital Twin / Product As A Service), becoming data-driven is an unavoidable target in the near future. Implementing a connected digital thread is crucial to remaining competitive and sustainable in business.

However, the first step is to gain insights about the available data (formats and systems) and its quality. Therefore, implementing a coordinated PLM backbone should immediately contain activities to improve data quality and implement a data governance policy to avoid upcoming migration issues.

Data-driven environments, the Systems of Engagement, bring the most value when connected through a digital thread with the Systems of Record (PLM, ERP and others). Therefore, design your processes, even the current ones, user-centric and data-centric, and build for change (see Yousef Hooshmand‘s story in this post – also the image below).

 

The data-driven future is not a migration.

The last part of this article will focus on what I believe is a future PLM architecture for companies. To be more precise, it is not only a PLM architecture anymore. It should become a business architecture based on connected platforms (the systems of record) and inter-platform connected value streams (the systems of engagement).

The discussion is ongoing, and from the technical and business side, I recommend reading Prof. Dr. Jörg Fischer’s recent articles, for example, The Crisis of Digitalization – Why We All Must Change Our Mindset! or The MBOM is the Steering Wheel of the Digital Supply Chain! A lot of academic work has been done in the context of Teamcenter and SAP.

Also, Martin Eigner recently described in The Constant Conflict Between PLM and ERP a potential digital future of enterprise within the constraints of existing legacy systems.

In my terminology, they are describing a hybrid enterprise dominated by major Systems of Record complemented by Systems of Engagement to support optimized digital value streams.

Oleg Shilovitsky, on the other hand, coming from the System of Engagement side with OpenBOM, describes the potential technologies to build a digital enterprise, as you can read in one of his recent posts: How to Unlock the Future of Manufacturing by Opening PLM/ERP to Connect Processes and Optimize Decision Support.

All three thought leaders talk about the potential of connected aspects in a future enterprise. For those interested in the details there is a lot to learn and understand.

For the sake of the migration story, I stay out of the details. However, it is interesting to note that they also do not mention data migration – is it the elephant in the room?

I believe moving from a coordinated enterprise to an integrated (coordinated and connected) enterprise is not a migration, as we are no longer talking about a single system that serves the whole enterprise.

The future of a digital enterprise is a federated environment where existing systems need to become more data-driven, and additional collaboration environments will have their internally connected capabilities to support value streams.

With this in mind, you can understand the 2017 McKinsey article – Toward an integrated technology operating model – and its leading image below:

And when it comes to the realization of such a concept, I have described the Heliple-2 project a few times before as an example of such an environment, where the target is to have a connection between the two layers through standardized interfaces, starting with OSLC. Or visit the Heliple Federated PLM LinkedIn group.

Data architecture and governance are crucial.

The image above generalizes the federated PLM concept and illustrates the two different systems connected through data bridges. As data must flow between the two sides without human intervention, the chosen architecture must be well-defined.

Here, I want to use a famous quote from Yousef Hooshmand’s paper From a Monolithic PLM Landscape to a Federated Domain and Data Mesh. Click on the image to listen to the Share PLM podcast with Yousef.

From a Single Source of Truth towards a principle of the Nearest Source of Truth based on a Single Source of Change

  • If you agree with this quote, you have a future mindset of federated PLM.
  • If you still advocate the Single Source of Truth, you are still in the Monolithic PLM phase.

It’s not a problem if you are aware that the next step should be federated and you are not ready yet.

However, environmental regulations and sustainability initiatives in particular can only be addressed in data-driven, federated environments. Think about the European Green Deal with its upcoming Ecodesign for Sustainable Products Regulation (ESPR), which demands digital traceability of products, their environmental impact, and reuse/recycle options, expressed in the Digital Product Passport.

Greenhouse Gas reporting and ESG reporting are becoming more and more mandatory for companies, driven either by regulations or by their customers. Only a data-driven, connected infrastructure can deal with this efficiently. Sustaira, a company we interviewed with the PLM Green Global Alliance last year, delivers such a connected infrastructure.

Read about the challenges they encounter in their blog post: Is inaccurate sustainability data holding you back?

Finally, to perform Life Cycle Assessments for design options or Life Cycle Analyses for operational products, you need connections to data sources in real-time. The virtual design twin or the digital twin in operation does not run on documents.

 

Conclusion

Data migration and consolidation to modern systems are probably painful and challenging processes. However, the good news is that with the right mindset, and with a focus on data quality and governance, the next step to an integrated (coordinated and connected) enterprise will not be that painful. It can be an evolutionary process, as the McKinsey article describes.

In the past months, I have had several discussions related to migrating PLM data, either from one system to another or from consolidating a collection of applications into a single environment. Does this sound familiar?

Let me share some experiences and lessons learned to avoid the Migration Migraine.

It is not a technical guide but a collection of experiences and thoughts that you might have missed when focusing only on the technical solution.

Halfway through, I realized I was too ambitious; therefore, another post will follow this introduction. Here, I will focus on the business side and the digital transformation journey.

 

Garbage Out – Garbage In

The Garbage Out – Garbage In statement is the paradigm we are used to in our day-to-day lives. When you buy a new computer, you use backup and restore. Even easier, nowadays, the majority of the data is already in the cloud.

This simple scenario assumes that all professional systems should be easily upgradeable. We remain unaware of the amount of data we store and of its relevance.

This phenomenon already has a name: “Dark Data.” Dark Data consumes storage and energy in the cloud while no longer being visible or used. Please read all about it here: Dark Data.

TIP 1: Every migration is a moment to clean up your data. By dragging everything with you, the burden of migrating becomes bigger. Even in easy migrations, do a clean-up – it prevents more extensive issues in the future.

Never follow the Garbage Out – Garbage in principle, even if it is easy!

 

Migrations in the PLM domain are different – setting the scene.

Before discussing the various scenarios, let’s examine what companies are doing. For early PLM adopters in the Automotive, Aerospace, and Defense Industries, migrations from mainframes to modern infrastructures have become impossible. The real problem is not only the changing hardware but also the changing data and data models.

For these companies, the solution is often to build an entirely new PLM infrastructure on top of the existing infrastructure, where manageable data pieces are migrated to new environments using data lakes, dashboards, and custom apps to support modern users.

Migration in this case is a journey as long as the data lives – and we can learn from them!

 

Follow the money

From a business perspective, migrations are considered a negative distraction. Talking about them raises awareness of their complexity, which might jeopardize enthusiasm.

For the initiator, the PLM software vendor or implementer, it might endanger the sales deal.

Traditional IT organizations strive for simplification – one CAD, one PLM, or one ERP system to manage. Although this argument makes sense, an analysis should always be done comparing the benefits with the (migration) costs and risks of reaching the ideal situation.

In those discussions, migrations are often downplayed.

Without naming companies, I have observed the downplaying several times, even at some prominent enterprises. So, if you recognize your company in this process, you are not alone.

TIP 2: Migrations are never simple. Make migration a serious topic of your PLM project – as important as the software. This means analyzing the potential migration risks and their mitigation upfront.

Please read about the Xylem story in my recent post: The week after the PDSFORUM 2024

The Big Bang has the highest risk and might again lead to garbage out – garbage in.

 

You are responsible for your garbage.

It may sound disparaging, but it is not. Most companies are aware that people, tools and policies have changed over the years. Due to the coordinated way of working, disciplines did not need to care about the downstream or upstream usage of their initially created data – Excel and PDFs are the bridges between disciplines.

All the actual knowledge and context are stored in the heads of experienced employees who have gotten used to dealing with inconsistencies. And they will retire, so there is an urgent need for actual data quality and governance. Read more about the journey from Coordinated to Connected in these articles.

Even if you are not yet thinking about migrations, the digital transformation in the PLM domain is coming, and we should learn to work in a connected mode.

TIP 3: Create a team in your organization that assesses the current data quality and defines the potential future enterprise (data) architecture. Then, start improving the quality of the currently generated data. Like the ISO 900x standards for quality management, the ISO 8000 standard already exists for data quality.
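
As a first, very low-tech step for such a team, even a simple completeness report over an exported set of part records shows where the gaps are. Below is a minimal sketch, assuming a CSV export of part records; the required attribute names are invented for illustration:

```python
import csv

# Hypothetical required attributes - adapt to your own data model.
REQUIRED = ["part_number", "description", "material", "uom"]

def completeness_report(path):
    """Report, per required attribute, how many part records have a non-empty value."""
    counts = {attr: 0 for attr in REQUIRED}
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            for attr in REQUIRED:
                if (row.get(attr) or "").strip():
                    counts[attr] += 1
    for attr, filled in counts.items():
        pct = 100 * filled / total if total else 0.0
        print(f"{attr}: {filled}/{total} filled ({pct:.1f}%)")

completeness_report("parts_export.csv")  # placeholder file name
```

Such a report does not replace a data governance policy, but it makes the discussion about data quality concrete instead of anecdotal.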

The future is data-driven; prepare yourself for the future.

 

Migration scenarios and their best practices

Here are some migration scenarios – two in this post and more in an upcoming post.

 

From Relational to Object-oriented

One of my earlier projects, starting in 2010 with SmarTeam, was migrating a mainframe-based application for airplane certification to a modern Microsoft infrastructure.

The goal was to create a new environment that could be used both as a replacement for the mainframe application and as the design and validation environment to implement changes to the current airplanes during a maintenance or upgrade activity.

The need was high because detailed documentation about the logic of the current application did not exist, and only one person who understood the logic was partly available.

So, internally, the relational database was a black box. The tables in the database contained a mix of item data, document data, change status and versions. The documents were stored in directories with meaningful file names but disconnected from the application.

The initial estimate was that the project would take two to three months, so a fixed price for two months was agreed upon. However, it became almost a two-year project, and in the end, the result seemed to be reliable (there was never mathematical proof).

The disadvantage was that SmarTeam ended up being so highly customized that automatic upgrades would not work for this version anymore—a new legacy was created with modern technology.

The same story, combined with the example of Ericsson’s migration attempt, is described in the 2016 post, The PLM Migration Dilemma. For me, the lesson learned from these examples leads to the following recommendation.

TIP 4: When there is a paradigm change in the data model, don’t migrate but establish a new (data-driven) infrastructure and connect to your legacy as much as possible in read-only mode.

The automotive and aerospace industries’ story is one of paradigm change.

Listen to the SharePLM podcast Revolutionizing PLM: Insights from Yousef Hooshmand, where Yousef also discusses how to address this transition process.

 

CAD/PDM to PLM

Another migration step happens when companies decide to implement a traditional PLM infrastructure as a System of Record, merging PDM data (mainly CAD) and ERP data (the BOM).

Some of these companies have been working file-based and have stored their final CAD files in folders; others might have a local PDM system native to the 3D CAD. The EBOM usually already existed digitally in ERP, and most of the time it was not a “pure” EBOM but more of a hybrid EBOM/MBOM.

The image above shows that this type of migration can be very challenging as, in the source systems, there is not necessarily a consistent 3D CAD definition matching the BOM items. As the systems were disconnected in the past, people have potentially added missing information or fixed information on the BOM side. In most companies, the manufacturing definition is based on drawings, and consistency with the 3D CAD definition is not guaranteed.

To address this challenge, companies need to assess the usability of the CAD and BOM data. Is it possible to populate the CAD files with properties that are necessary for an import? For example, does the file path contain helpful information?
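
For example, if the folder convention encodes project and part information, a small script can harvest it as candidate import properties. A sketch, assuming an invented convention like vault/<project>/<6-digit part number>_<description>.<ext>:

```python
import re

# Invented folder convention: vault/P<4 digits>/<6-digit part number>_<description>.<ext>
PATTERN = re.compile(
    r"(?P<project>P\d{4})[/\\](?P<part_number>\d{6})_(?P<description>[^.\\/]+)\.\w+$"
)

def properties_from_path(file_path):
    """Derive candidate import properties from a CAD file path; None if the path doesn't match."""
    match = PATTERN.search(file_path)
    return match.groupdict() if match else None

print(properties_from_path("vault/P1234/678901_bracket-left.sldprt"))
# -> {'project': 'P1234', 'part_number': '678901', 'description': 'bracket-left'}
```

The paths that do not match the pattern are just as valuable: they show which part of the legacy needs manual attention or should perhaps not be migrated at all.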

I have experienced a situation where a company had poorly defined 3D parts and no properties, as all the focus was on using the 3D model to generate the 2D drawing.

The relevant details for manufacturing were then added to the drawing and no longer to the parts or models – traceability was almost impossible.

In this situation, importing the 3D CAD structures into the new PLM system has limited value. An alternative is to describe and test procedures for handling legacy data when it is needed, either to implement a design change or a new order. Leave the legacy accessible, but do not migrate.

The BOM side is, in theory, stable for manufactured products, as the data should have gone through a release process. However, the company needs to revisit its part definition process for new designs and products.

Some points to consider:

  1. Meaningful identifiers are not desired in a PLM system as they create a legacy. Therefore, the import of parts with smart identifiers should map the encoded information to relevant part properties besides the ID (see the sketch after this list). Splitting the ID into properties will create broader usage in the future. Read more in Smart Part Numbers – do we need them?
  2. In addition, companies should try to avoid having logistics information, such as supplier-specific part numbers, come from the CAD system. Supplier parts in your CAD environment create inefficiencies when a supplier part becomes obsolete. Concepts such as EBOM and MBOM, and potentially the SBOM, should be well understood during this migration.
  3. Concepts of EBOM and MBOM should also be introduced when moving from an ETO to a CTO approach or when modularity is a future business strategy.
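
To illustrate point 1 above, here is a minimal sketch of splitting a smart identifier into separate, searchable properties during import, so the intelligence moves from the number into attributes. The numbering scheme is invented for illustration:

```python
import re

# Invented smart-number scheme: <type 2 letters>-<sequence 5 digits>-<material 2 letters><revision digit>
# e.g. "BR-00042-AL2" = bracket, sequence 42, aluminium, revision 2
SMART_ID = re.compile(r"^(?P<type>[A-Z]{2})-(?P<seq>\d{5})-(?P<material>[A-Z]{2})(?P<rev>\d)$")

def split_smart_number(part_id):
    """Map a legacy smart part number onto separate part properties for the import."""
    match = SMART_ID.match(part_id)
    if not match:
        raise ValueError(f"{part_id} does not follow the legacy scheme")
    props = match.groupdict()
    props["legacy_id"] = part_id  # keep the original number for traceability
    return props

print(split_smart_number("BR-00042-AL2"))
# -> {'type': 'BR', 'seq': '00042', 'material': 'AL', 'rev': '2', 'legacy_id': 'BR-00042-AL2'}
```

Keeping the legacy ID as a plain property preserves traceability to old drawings and documents, while the new system can issue neutral identifiers going forward.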

 

 

Conclusion

As every company is on its own PLM journey and technology keeps evolving, there will always be a migration discussion. Understanding and working towards the future should be the most critical driver for a migration. Migrations in the PLM domain are often more than a data migration – new ways of working should be introduced in parallel. For that reason, the “big bang” is often too costly and demotivating.

 

 
