
Two weeks ago, I shared my first post about PDM/PLM migration challenges on LinkedIn: How to avoid Migration Migraine – part 1. Most of the content discussed was about data migrations.

It started with moving data stored in relational databases to modern object-oriented environments – the technology upgrade. But the post also covered the challenges a company can have when merging different data silos (CAD- and BOM-related) into a single PLM backbone to extend the support of product data beyond engineering.

Luckily, the post generated a lot of reactions and feedback through LinkedIn and personal interactions last week.

The amount of interaction illustrated the relevance of the topic for people; they recognized the elephant in the room, too.

 

Working with a partner

Data migrations and consolidation are typically not part of a company’s core business, so it is crucial to find the right partner for a migration project. The challenge with migrations is that there is potentially a lot to do technically, but only your staff can assess the quality and value of migrations.

Therefore, when planning a migration, make sure you work on it iteratively with an experienced partner who can provide a set of tools and best practices. Often, vendors or service partners have migration tools that still need to be tuned to your As-Is and To-Be environment.

To get an impression of what a PLM service partner can do and which topics or tools are relevant in the context of mid-market PLM, you can watch this xLM webinar on YouTube. Make sure you select a partner who is familiar with your PDM/PLM infrastructure and who has the experience to assess the complexity.

 

Migration lessons learned

In my PLM coaching career, I have seen many migrations. In the early days, they were mostly related to technology upgrades, consolidation of data and system replacements. Nowadays, the challenges are more related to becoming data-driven. Here are five lessons I have learned in the past twenty years:

  1. A fixed price for the migration can be a significant risk, as the quality of the data and the result are hard to assess upfront. In the case of a fixed price, either you pay for the moon (the partner prices in all the risk), or your service partner loses a lot of money. In a sustainable business model, there should be no losers.
  2. Start (even now) with checking and fixing your data quality. For example, when you are aware of a mismatch between CAD assemblies and BOM data, analyze and fix the discrepancies even before the migration (see the sketch after this list).
  3. One immediate action to take when moving from CAD assemblies to BOM structures is to check or fill the properties in the CAD system to support a smooth transition. Filling properties might be a temporary action, as later, when becoming more data-driven, some of these properties, e.g., material properties or manufacturer part numbers, should not be maintained in the CAD system anymore. However, they might help migration tools to extract a richer dataset.
  4. Focus on implementing an environment that is ready for the future. Don’t let past data quality add complexity to the new environment. In such a case, learn to live with legacy issues that will be fixed only when needed. A 100 % matching migration is not likely to happen, because the source data might also be incorrect, even after further analysis.
  5. The product should probably not be configured in the CAD environment just because the CAD tool allows it. I had this experience with SolidWorks in the past. PDM became the enemy because the users managed all configuration options in the assembly files, making them hard to use at the BOM or product level (the connected digital thread).
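To make lesson 2 more tangible, here is a minimal sketch of such a pre-migration consistency check. It assumes hypothetical CSV exports (cad_structure.csv, bom_lines.csv) and column names; adapt them to whatever your PDM and ERP systems actually export.

```python
# Minimal pre-migration consistency check: compare the parts used in
# exported CAD assembly structures against the engineering BOM lines.
# File names and column names are hypothetical - adapt them to the
# exports your PDM/ERP systems actually produce.
import csv
from collections import defaultdict

def load_usage(path, parent_col, child_col):
    """Read a parent/child structure export into {parent: {children}}."""
    usage = defaultdict(set)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            usage[row[parent_col].strip()].add(row[child_col].strip())
    return usage

cad = load_usage("cad_structure.csv", "assembly", "component")
bom = load_usage("bom_lines.csv", "parent_item", "child_item")

# Report discrepancies per assembly so they can be fixed *before* migration.
for parent in sorted(set(cad) | set(bom)):
    only_cad = cad[parent] - bom[parent]
    only_bom = bom[parent] - cad[parent]
    if only_cad or only_bom:
        print(f"{parent}: in CAD only {sorted(only_cad)}, "
              f"in BOM only {sorted(only_bom)}")
```

Even a simple report like this usually surfaces the assemblies that need human attention long before any migration tool touches them.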

 

 The future is data-driven

In addition, these migration discussions made me aware again that many companies are still in the early phases of creating a unified PLM infrastructure and implementing the coordinated approach – an observation I shared in my report on the PDSFORUM 2024 conference.

Due to sustainability-related regulations and the need to understand product behavior in the field (Digital Twin / Product As A Service), becoming data-driven is an unavoidable target in the near future. Implementing a connected digital thread is crucial to remaining competitive and sustainable in business.

However, the first step is to gain insights about the available data (formats and systems) and its quality. Therefore, implementing a coordinated PLM backbone should immediately contain activities to improve data quality and implement a data governance policy to avoid upcoming migration issues.

Data-driven environments, the Systems of Engagement, bring the most value when connected through a digital thread with the Systems of Record (PLM, ERP and others). Therefore, design your processes, even the current ones, user-centric and data-centric, and build for change (see Yousef Hooshmand‘s story in this post – also the image below).

 

The data-driven future is not a migration.

The last part of this article will focus on what I believe is a future PLM architecture for companies. To be more precise, it is not only a PLM architecture anymore. It should become a business architecture based on connected platforms (the systems of record) and inter-platform connected value streams (the systems of engagement).

The discussion is ongoing, and from the technical and business side, I recommend reading Prof. Dr. Jörg Fischer’s recent articles, for example, The Crisis of Digitalization – Why We All Must Change Our Mindset! or The MBOM is the Steering Wheel of the Digital Supply Chain! A lot of academic work has been done in the context of Teamcenter and SAP.

Also, Martin Eigner recently described in The Constant Conflict Between PLM and ERP a potential digital future of enterprise within the constraints of existing legacy systems.

In my terminology, they are describing a hybrid enterprise dominated by major Systems of Record complemented by Systems of Engagement to support optimized digital value streams.

Oleg Shilovitsky, on the other hand, coming from the System of Engagement side with OpenBOM, describes the potential technologies to build a digital enterprise, as you can read in one of his recent posts: How to Unlock the Future of Manufacturing by Opening PLM/ERP to Connect Processes and Optimize Decision Support.

All three thought leaders talk about the potential of connected aspects in a future enterprise. For those interested in the details, there is a lot to learn and understand.

For the sake of the migration story, I will stay out of the details. However, it is interesting to note that they do not mention data migration either – is it the elephant in the room?

I believe moving from a coordinated enterprise to an integrated (coordinated and connected) enterprise is not a migration, as we are no longer talking about a single system that serves the whole enterprise.

The future of a digital enterprise is a federated environment where existing systems need to become more data-driven, and additional collaboration environments will have their internally connected capabilities to support value streams.

With this in mind, you can understand the 2017 McKinsey article Toward an integrated technology operating model – the leading image below:

And when it comes to the realization of such a concept, I have described the Heliple-2 project a few times before as an example of such an environment, where the target is to have a connection between the two layers through standardized interfaces, starting from OSLC. Or visit the Heliple Federated PLM LinkedIn group.

Data architecture and governance are crucial.

The image above generalizes the federated PLM concept and illustrates the two different systems connected through data bridges. As data must flow between the two sides without human intervention, the chosen architecture must be well-defined.
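As an illustration only (none of the system or field names below come from a real product): at its core, such a data bridge is a declarative field mapping plus an automated transformation, so records can flow between the two sides without a human re-typing them. A minimal sketch in Python:

```python
# Illustrative sketch of a "data bridge": a declarative field mapping that
# lets records flow between two systems without human intervention.
# The system names, field names, and transforms are hypothetical.

# Mapping: target field -> (source field, transform function)
ITEM_MAPPING = {
    "part_number": ("ItemID",   str.strip),
    "description": ("ItemName", str.strip),
    "weight_kg":   ("Weight_g", lambda v: float(v) / 1000.0),
    "lifecycle":   ("State",    lambda v: {"REL": "Released",
                                           "WIP": "In Work"}.get(v, v)),
}

def bridge(source_record: dict, mapping: dict) -> dict:
    """Translate one record from the source schema to the target schema."""
    return {tgt: fn(source_record[src]) for tgt, (src, fn) in mapping.items()}

# Example: a record exported from the (hypothetical) System of Record
sor_record = {"ItemID": "P-1001 ", "ItemName": "Bracket",
              "Weight_g": "250", "State": "REL"}
print(bridge(sor_record, ITEM_MAPPING))
# -> {'part_number': 'P-1001', 'description': 'Bracket',
#     'weight_kg': 0.25, 'lifecycle': 'Released'}
```

The point of the sketch is that the mapping itself is data, not code: a well-defined architecture keeps these mappings explicit and governed, instead of buried in point-to-point scripts.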

Here, I want to use a famous quote from Yousef Hooshmand’s paper From a Monolithic PLM Landscape to a Federated Domain and Data Mesh. Click on the image to listen to the Share PLM podcast with Yousef.

From a Single Source of Truth towards a principle of the Nearest Source of Truth based on a Single Source of Change

  • If you agree with this quote, you have a future mindset of federated PLM.
  • If you still advocate the Single Source of Truth, you are still in the Monolithic PLM phase.
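To express the quote in code terms: no single system holds all the truth, but every attribute has exactly one system where it may be changed, and changes propagate as events to the systems that keep a nearest-source-of-truth copy. A minimal, purely illustrative sketch (all names hypothetical):

```python
# Sketch of "Single Source of Change": each attribute has one owning
# system; changes are published as events, and subscribed systems update
# their local (nearest) copy. All names are hypothetical.

OWNER = {"weight_kg": "PLM", "cost": "ERP"}    # who may change what
subscribers = {"weight_kg": [], "cost": []}    # callbacks per attribute

def subscribe(attribute, callback):
    subscribers[attribute].append(callback)

def publish_change(system, attribute, value):
    if OWNER[attribute] != system:
        raise PermissionError(f"{system} does not own '{attribute}'")
    for callback in subscribers[attribute]:
        callback(attribute, value)

# ERP keeps a local copy of the weight as its nearest source of truth.
erp_cache = {}
subscribe("weight_kg", lambda attr, val: erp_cache.update({attr: val}))

publish_change("PLM", "weight_kg", 0.25)   # allowed: PLM owns the weight
print(erp_cache)                           # {'weight_kg': 0.25}

try:
    publish_change("ERP", "weight_kg", 0.30)   # ERP may read, not change
except PermissionError as err:
    print(err)                                 # ERP does not own 'weight_kg'
```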

It’s not a problem if you are aware that the next step should be federated and you are not ready yet.

However, environmental regulations and sustainability initiatives in particular can only be addressed in data-driven, federated environments. Think about the European Green Deal with its upcoming Ecodesign for Sustainable Products Regulation (ESPR), which demands digital traceability of products, their environmental impact, and reuse/recycle options, expressed in the Digital Product Passport.
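The detailed data requirements of the ESPR are still being worked out, so purely as a thought experiment, a Digital Product Passport record could carry traceability data along these lines (all field names are hypothetical):

```python
# Hypothetical sketch of the kind of traceability data a Digital Product
# Passport might carry; the ESPR's actual data requirements are still
# being detailed, so these field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class MaterialShare:
    material: str
    mass_fraction: float      # 0.0 - 1.0 of the product's total mass
    recycled_fraction: float  # share of this material that is recycled

@dataclass
class DigitalProductPassport:
    product_id: str
    manufacturer: str
    materials: list[MaterialShare] = field(default_factory=list)
    carbon_footprint_kg_co2e: float = 0.0
    repair_instructions_url: str = ""
    end_of_life_options: list[str] = field(default_factory=list)

dpp = DigitalProductPassport(
    product_id="P-1001",
    manufacturer="Example GmbH",
    materials=[MaterialShare("aluminium", 0.7, 0.4),
               MaterialShare("ABS", 0.3, 0.0)],
    carbon_footprint_kg_co2e=12.5,
    end_of_life_options=["reuse", "recycle"],
)
print(dpp.product_id, dpp.carbon_footprint_kg_co2e)
```

None of these values live in one system: materials come from PLM, footprints from LCA tools, end-of-life options from service systems – which is exactly why a connected, federated infrastructure is needed.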

Greenhouse Gas reporting and ESG reporting are becoming more and more mandatory for companies, driven either by regulations or by their customers. Only a data-driven, connected infrastructure can deal with this efficiently. Sustaira, a company we interviewed with the PLM Green Global Alliance last year, delivers such a connected infrastructure.

Read the challenges they meet in their blog post:  Is inaccurate sustainability data holding you back?

Finally, to perform Life Cycle Assessments for design options or Life Cycle Analyses for operational products, you need connections to data sources in real time. The virtual design twin and the digital twin in operation do not run on documents.

 

Conclusion

Data migration and consolidation to modern systems are probably a painful and challenging process. However, the good news is that with the right mindset, and with a focus on data quality and governance, the next step towards an integrated (coordinated and connected) enterprise will not be that painful. It can be an evolutionary process, as the McKinsey article describes.

After two reposts, I finally have the ability to write at full speed, and my fingers were aching to write after reading some postings in the past four weeks. It started with Verdi Ogewell’s article on Engineering.com, Telecom Giant Ericsson Halts Its PLM Project with Dassault’s 3DEXPERIENCE, followed by an Aras blog post, Don’t Be a Dinosaur, from Mark Reisig, and, of course, Oleg Shilovitsky’s post: What to learn from Ericsson PLM failure?

Setting the scene

Verdi’s article is quite tendentious, based on outside observations and insinuations. I let you guess who sponsored this article. If I had to write an article about this situation, I would state: Ericsson and Dassault failed to migrate the old legacy landscape into a new environment – an end-to-end migration appeared to be impossible. The other topics mentioned are not relevant to the current situation.

Mark chimes in on Verdi’s narrative with points not relevant to data migration, suggesting PLM is chosen over dinner. Of course, decisions are not that simple. It is not clear from Mark’s statement who the dinosaurs are:

Finally, don’t bet your future on a buzzword. Before making a huge PLM investment, take the time to make sure your PLM vendor has an actual platform. Have them show you their spider chart.  And here’s the hard reality: they won’t do it, because they can’t.

Don’t be a dinosaur—be prepared for the unexpected with a truly resilient digital platform.

I would state, “Don’t bet your future on a spider chart” if you do not know what the real problem is.

 

Oleg’s post, finally, is more holistic, acknowledging that a full migration might not be the right target, and I like his conclusion:

Flexibility Vs. Out of the box products – which one do you prefer? Over-customize a new PLM to follow old processes? To use a new system as an opportunity to clean existing processes? To move 25,000 people from one database to another is not a simple job. It is time to think about no upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT together with the PLM vendor to run a migration path. The last one is a costly step. But… without this step, the current database oriented single-version of truth PLM paradigm is doomed.

The Migration Problem

I believe migration of data – and sometimes the impossibility of data migration – is the biggest elephant in the room when dealing with PLM projects. In 2015, during the PI PLM conference in Düsseldorf, I addressed this topic for the first time: The Challenge of PLM Upgrades. You can find the presentation on SlideShare here.

I shared an example similar to the Ericsson case from almost 10 years ago. At that time, one of the companies I was working with wanted to replace their mainframe application, which managed the configuration of certain airplanes. The application managed the aircraft configuration structures in tables, pointing where needed to specifications in a document repository. The two systems were not connected; integrity was guaranteed through manual verification procedures.

The application was considered the single version of the truth and had been treated like that for decades. The reason for the migration was that the knowledge of the application was disappearing: the tables were documented, but the logic was not. Besides this issue, the maintenance costs for the mainframe were high – vendor lock-in existed at that time, too.

The idea was to implement SmarTeam – a flexible data model, rapid deployment based on Windows technology – to catch two birds with one stone, i.e., the latest Microsoft technology and, at the same time, a direct link to the controlled documents. As they were using CATIA V5, the SmarTeam integration was a huge potential benefit. For the migration of data, the estimate was two months. What could go wrong?

Well, technically, almost nothing went wrong. The challenge was to map the relational tables to the objects in the SmarTeam data model. And as the relational tables contained a mix of document and item attributes, splitting these tables was not always easy. Sometimes the same properties appeared with different values in the original tables – which one was the truth? The migration took almost two years, also due to the limited availability of the last knowledgeable resource who could explain the logic.
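To give an impression of why this mapping was so hard, the sketch below (hypothetical column names, not the actual SmarTeam API) splits one mixed relational row into an item object and a document object and flags the conflicting values mentioned above:

```python
# Illustrative sketch (not the actual SmarTeam API): split rows of a mixed
# relational table into item and document objects, and flag conflicting
# attribute values for the same item. Column names are hypothetical.
ITEM_COLS = {"item_id", "description", "material"}
DOC_COLS = {"doc_id", "doc_title", "revision"}

rows = [
    {"item_id": "A100", "description": "Frame", "material": "steel",
     "doc_id": "D-1", "doc_title": "Frame drawing", "revision": "B"},
    {"item_id": "A100", "description": "Frame", "material": "alu",  # conflict!
     "doc_id": "D-2", "doc_title": "Frame spec", "revision": "A"},
]

items, documents, conflicts = {}, [], []
for row in rows:
    item = {c: row[c] for c in ITEM_COLS}
    documents.append({c: row[c] for c in DOC_COLS})
    previous = items.setdefault(row["item_id"], item)
    if previous != item:  # same item, different attribute values
        conflicts.append((row["item_id"], previous, item))

print(f"{len(items)} items, {len(documents)} documents, "
      f"{len(conflicts)} conflicts to resolve manually")
```

The splitting is mechanical; deciding which conflicting value is the truth is not, and that is where the knowledgeable resource was needed.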

After the conversion, the question remained whether the migrated data was accurate. Perhaps 99 %? But what if the remaining 1 % was critical? For this company, the data was significant but not mission-critical, unlike at Ericsson, where a lot of automation and rules are linked together across loads of systems.

So my point: Dassault has failed at Ericsson, and so will Siemens or Aras or any other PLM vendor, as the migration issue is not in the technology – we should stop thinking about these kinds of migrations.

Who are the dinosaurs?

Mark is, in a way, suggesting that when you use PLM software from the “old” PLM vendors, you are a dinosaur. Of course, this is a great marketing message, but the truth is that the PLM vendor is not to blame. Yes, some create more friction than others in some instances, but in my opinion, there is no ultimate single PLM vendor.

Have a look at the well-known Daimler case from some years ago, which made the news because Daimler decided to replace CATIA with NX. Not because NX was superior – it was about maintaining the PLM backbone Smaragd, which would be hard to replace. Even in 2010, there was already the notion that the existing data management infrastructure is hard to replace. See a more neutral article about this topic from Monica Schnitger if you want: Update: Daimler chooses NX for Smaragd. Here too, in the end, it became a complete Siemens account for compatibility reasons.

When you look at the significant wins Aras mentions in their customer base (GM, Schaeffler or Airbus), you will probably discover that Aras is more the connection layer between legacy systems, old PLM or PDM systems. They are not the new PLM replacing the old PLM. A connection layer creates a digital thread, connecting various data sources for traceability, but it does not provide digital continuity, as the data in the legacy systems is untouched. Still, it is an intermediate step towards a hybrid environment.

For me, the real dinosaurs are those large enterprises that implemented their proprietary PLM environments in the previous century and built a fully automated infrastructure based on custom data models with a lot of proprietary rules. This was the case at Ericsson, but most traditional automotive and aerospace companies share this problem, as they were the early PLM adopters. And they are not the only ones. Many industrial manufacturing companies suffer from the past, unlike their Asian competitors, who can start with less legacy.

What’s next?

It would be great if the PLM community focused more on the current incompatibility of data between current/past concepts and future digital needs, and discussed solution paths (for sure, standards will pop up).

Incompatibility means: do not talk about migration, but focus instead on a hybrid landscape with legacy data, managed in a coordinated manner, alongside modern, growing digital PLM processes based on a connected approach.

This is the discussion I would like to see, instead of vendors claiming that their technology is the best. None of the vendors will talk about this topic, as the old “Rip-and-Replace” approach brings the most software revenue, combined with the simplification that there is only OnePLM. It is interesting to see how many companies have a kind of OnePLM or OneXXX statement.

The challenge, of course, is to implement a hybrid approach. To have the two different PLM concepts work together, there is a need to create a reliable overlap. The reliable overlap can come from an enterprise data governance approach, if possible based on a normalized PLM data model. So far, all PLM vendors that I know have proprietary data models; only ShareAspace from Eurostep is based on the PLCS standard, but their solutions are most of the time part of a larger PLM infrastructure (the future!).

To conclude: I look forward to discussing this topic with other PLM peers who are really in the field, discovering and understanding the chasm between the past and the future. Contact me directly or join us at PLM Roadmap and PDT Europe, 13-14 November in Paris. Let’s remain fact-based! (As a matter of fact, you can still contribute – the call for papers is still open.)

 

 

 
