
In the past months, I have had several discussions related to migrating PLM data, either from one system to another or consolidating a collection of applications into a single environment. Does this sound familiar?

Let me share some experiences and lessons learned to avoid the Migration Migraine.

It is not a technical guide but a collection of experiences and thoughts that you might overlook when focusing only on the technical solution.

Halfway through, I realized I was too ambitious; therefore, another post will follow this introduction. Here, I will focus on the business side and the digital transformation journey.

 

Garbage Out – Garbage In

The Garbage Out – Garbage In statement is, in a way, the paradigm we are used to in our day-to-day lives. When you buy a new computer, you use backup and restore. Even easier, nowadays the majority of the data is already in the cloud.

This simple scenario assumes that all professional systems should be easily upgradeable. We become unaware of the amount of data we store and its relevance.

This phenomenon already has a name: “Dark Data.” Dark Data consumes storage and energy in the cloud while no longer being visible. Please read all about it here: Dark Data.

TIP 1: Every migration is a moment to clean up your data. By dragging everything with you, the burden of migrating becomes bigger. Even in easy migrations, do a clean-up – it prevents more extensive issues in the future.

Never follow the Garbage Out – Garbage In principle, even if it is easy!
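To make TIP 1 concrete, here is a minimal sketch of a pre-migration clean-up pass. The threshold, statuses and field names are assumptions for illustration – align them with your own retention policy.

```python
# Hypothetical sketch of a pre-migration clean-up pass: flag items that are
# candidates for archiving instead of migrating. Thresholds and statuses are
# assumptions - align them with your own retention policy.
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=5 * 365)

def cleanup_candidates(items, today=None):
    """Return items not touched for years and no longer released or in work."""
    today = today or datetime.now()
    return [i for i in items
            if today - i["last_modified"] > STALE_AFTER
            and i["status"] in ("obsolete", "superseded")]

legacy = [
    {"id": "DOC-001", "status": "obsolete",
     "last_modified": datetime(2008, 3, 1)},
    {"id": "DOC-002", "status": "released",
     "last_modified": datetime(2023, 6, 1)},
]
print(cleanup_candidates(legacy))  # only DOC-001 is flagged
```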

 

Migrations in the PLM domain are different – setting the scene.

Before discussing the various scenarios, let’s examine what companies are doing. For early PLM adopters in the Automotive, Aerospace, and Defense Industries, migrations from mainframes to modern infrastructures have become impossible. The real problem is not only the changing hardware but also the changing data and data models.

For these companies, the solution is often to build an entirely new PLM infrastructure on top of the existing infrastructure, where manageable data pieces are migrated to new environments using data lakes, dashboards, and custom apps to support modern users.

Migration in this case is a journey that lasts as long as the data lives – and we can learn from these companies!

 

Follow the money

From a business perspective, migrations are considered a negative distraction. Talking about them raises awareness of their complexity, which might jeopardize enthusiasm.

For the initiator, the PLM software vendor or implementer, it might endanger the sales deal.

Traditional IT organizations strive for simplification – one CAD, one PLM, or one ERP system to manage. Although this argument makes sense, the benefits should always be weighed against the (migration) costs and risks of reaching the ideal situation.

In those discussions, migrations are often downplayed.

Without naming companies, I have observed the downplaying several times, even at some prominent enterprises. So, if you recognize your company in this process, you are not alone.

TIP 2: Migrations are never simple. Make migration a serious topic of your PLM project – as important as the software. This means analyzing the potential migration risks and defining their mitigation upfront.

Please read about the Xylem story in my recent post: The week after the PDSFORUM 2024

The Big Bang has the highest risk and might again lead to garbage out—garbage in.

 

You are responsible for your garbage.

It may sound disparaging, but it is not. Most companies are aware that people, tools and policies have changed over the years. In the coordinated approach to working, disciplines did not need to care about downstream or upstream usage of the data they initially created – Excel files and PDFs were the bridges between disciplines.

All the actual knowledge and context are stored in the heads of experienced employees who have gotten used to dealing with inconsistencies. And they will retire, so there is an urgent need for real data quality and governance. Read more about the journey from Coordinated to Connected in these articles.

Even if you are not yet thinking about migrations, the digital transformation in the PLM domain is coming, and we should learn to work in a connected mode.

TIP 3: Create a team in your organization that assesses the current data quality and defines the potential future enterprise (data) architecture. Then, start improving the quality of the currently generated data. Like the ISO 900x standards for quality management, the ISO 8000 standard already exists for data quality.

The future is data-driven; prepare yourself for the future.

 

Migration scenarios and their best practices

Here are some migration scenarios – two in this post and more in an upcoming post.

 

From Relational to Object-oriented

One of my earlier projects, starting in 2010 with SmarTeam, was migrating a mainframe-based application for airplane certification to a modern Microsoft infrastructure.

The goal was to create a new environment that could be used both as a replacement for the mainframe application and as the design and validation environment to implement changes to the current airplanes during a maintenance or upgrade activity.

The need was high because detailed documentation about the logic of the current application did not exist, and only one person who understood the logic was partly available.

So, internally, the relational database was a black box. The tables in the database contained a mix of item data, document data, change status and versions. The documents were stored in directories with meaningful file names but disconnected from the application.

The initial estimate was that the project would take two to three months, so a fixed price for two months was agreed upon. However, it became almost a two-year project, and in the end, the result seemed to be reliable (there was never mathematical proof).

The disadvantage was that SmarTeam ended up being so highly customized that automatic upgrades would not work for this version anymore—a new legacy was created with modern technology.

The same story, combined with the example of Ericsson’s migration attempt, is described in the 2016 post, The PLM Migration Dilemma. For me, the lesson learned from these examples leads to the following recommendation.

TIP 4: When there is a paradigm change in the data model, don’t migrate but establish a new (data-driven) infrastructure and connect to your legacy as much as possible in read-only mode.
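As a minimal sketch of TIP 4 – connecting to legacy data in read-only mode – the new environment could expose old records through a facade that allows lookups but refuses writes. The class and record structure below are illustrative assumptions, not a product API.

```python
# Hypothetical read-only facade over legacy data (TIP 4): the new environment
# can look up old records, but creating or changing data happens only in the
# new system. The record structure is an assumption for illustration.
class ReadOnlyLegacy:
    def __init__(self, records: dict):
        self._records = dict(records)   # e.g. extracted once from the old database

    def get(self, part_number: str) -> dict:
        """Look up a legacy record; raises KeyError if it never existed."""
        return self._records[part_number]

    def __setitem__(self, key, value):
        raise PermissionError("Legacy is read-only - author new data in the new system")

legacy = ReadOnlyLegacy({"P-1001": {"description": "Bracket", "revision": "C"}})
print(legacy.get("P-1001"))             # reading works
# legacy["P-1002"] = {...}              # would raise PermissionError
```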

The automotive and aerospace industries’ story is one of paradigm change.

Listen to the SharePLM podcast Revolutionizing PLM: Insights from Yousef Hooshmand, where Yousef also discusses how to address this transition process.

 

CAD/PDM to PLM

Another migration step happens when companies decide to implement a traditional PLM infrastructure as a System of Record, merging PDM data (mainly CAD) and ERP data (the BOM).

Some of these companies have been working file-based and have stored their final CAD files in folders; others might have a local PDM system native to the 3D CAD. The EBOM usually existed digitally in ERP, and most of the time it was not a “pure” EBOM but more of a hybrid EBOM/MBOM.

The image above shows that this type of migration can be very challenging, as in the source systems there is not necessarily a consistent 3D CAD definition matching the BOM items. As the systems have been disconnected in the past, people have potentially added missing information or fixed information on the BOM side. And as in most companies the manufacturing definition is based on drawings, consistency with the 3D CAD definition is not guaranteed.

To address this challenge, companies need to assess the usability of the CAD and BOM data. Is it possible to populate the CAD files with properties that are necessary for an import? For example, does the file path contain helpful information?
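As an illustration: assuming a hypothetical folder convention like Projects/&lt;project&gt;/&lt;assembly&gt;/P123456_A.CATPart, a small script can harvest candidate properties from the file paths before the import – anything that does not match the pattern is flagged for manual review.

```python
# Hypothetical sketch: harvest candidate import properties from CAD file paths.
# Assumes a folder convention like  Projects/<project>/<assembly>/P123456_A.CATPart
# - adapt the pattern and depth to your own (often undocumented) conventions.
import re
from pathlib import Path

PART_PATTERN = re.compile(r"(?P<number>P\d{6})_(?P<revision>[A-Z])")

def harvest(cad_root: str):
    """Yield candidate properties per CAD file; None values need manual review."""
    for f in Path(cad_root).rglob("*.CATPart"):
        m = PART_PATTERN.match(f.stem)
        yield {
            "file": str(f),
            "project": f.parts[-3] if len(f.parts) >= 3 else None,
            "part_number": m["number"] if m else None,
            "revision": m["revision"] if m else None,
        }

for candidate in harvest("Projects"):
    print(candidate)
```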

I have experienced a situation where a company had poorly defined 3D parts and no properties, as all the focus was on using the 3D model to generate the 2D drawing.

The relevant details for manufacturing were then added to the drawing and no longer to the parts or models – traceability was almost impossible.

In this situation, importing the 3D CAD structures into the new PLM system has limited value. An alternative is to describe and test procedures for handling legacy data when it is needed, either to implement a design change or a new order. Leave the legacy accessible, but do not migrate.

The BOM side is, in theory, stable for manufactured products, as the data should have gone through a release process. However, the company needs to revisit its part definition process for new designs and products.

Some points to consider:

  1. Meaningful identifiers are not desired in a PLM system as they create a legacy. Therefore, the import of parts with smart identifiers should map the encoded meaning to relevant part properties besides the ID (see the sketch after this list). Splitting the ID into properties will enable broader usage in the future. Read more in Smart Part Numbers – do we need them?
  2. In addition, companies should try to avoid having logistic information, such as supplier-specific part numbers, coming from the CAD system. Supplier parts in your CAD environment create inefficiencies when a supplier part becomes obsolete. Concepts such as the EBOM and MBOM, and potentially the SBOM, should be well understood during this migration.
  3. Concepts of EBOM and MBOM should also be introduced when moving from an ETO to a CTO approach or when modularity is a future business strategy.
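A minimal sketch of point 1, assuming a hypothetical smart scheme &lt;family(2)&gt;&lt;material(2)&gt;&lt;sequence(5)&gt; – e.g., “BRST00042” for a steel bracket. During import, the ID itself is kept unchanged, while its encoded meaning is copied into searchable properties:

```python
# Hypothetical sketch: split a legacy smart part number into plain properties
# during import, so the meaning survives even when new parts get neutral IDs.
# The scheme <family(2)><material(2)><sequence(5)> and the lookup tables are
# assumptions - replace them with your company's real numbering rules.
FAMILY = {"BR": "Bracket", "HS": "Housing"}
MATERIAL = {"ST": "Steel", "AL": "Aluminium"}

def split_smart_number(smart_id: str) -> dict:
    """Keep the original ID untouched and derive searchable properties from it."""
    return {
        "part_number": smart_id,                         # imported unchanged
        "family": FAMILY.get(smart_id[0:2], "Unknown"),
        "material": MATERIAL.get(smart_id[2:4], "Unknown"),
        "legacy_sequence": smart_id[4:],
    }

print(split_smart_number("BRST00042"))
# {'part_number': 'BRST00042', 'family': 'Bracket', 'material': 'Steel',
#  'legacy_sequence': '00042'}
```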

 

 

Conclusion

As every company is on its own PLM journey and technology is evolving, there will always be a migration discussion. Understanding and working towards the future should be the most critical driver for migration. Migrations in the PLM domain are often more than a data migration – new ways of working should be introduced in parallel. And for that reason, the “big bang” is often too costly and demotivating for the future.

 

 

Two weeks ago, this post from Ilan Madjar drew my attention. He pointed to a demo movie explaining how to support Smart Part Numbering on the 3DEXPERIENCE platform. You can watch the recording here.

I was surprised that Smart Part Numbering is still used, and if you read through the comments on the post, you see the various arguments that exist.

  • “Many mid-market customers are still using it”
    me: I think it is not only the mid-market – however, the argument is no reason to keep it alive.
  • “The problem remains in the customer’s desire (or need or capability) for change.”
    me: This is partly choosing the path of least resistance.
  • “User resistance to change. Training and management sponsorship has proven to be not enough.”
    me: Probably because discussions are feature-oriented instead of starting from the business benefits.
  • “Cost and effort – rolling this change through downstream systems. The cost and effort of changing PN in PLM, ERP, MES, etc., are high. Trying to phase it out across systems is a recipe for a disaster.”
    me: The hidden costs of maintaining Smart Numbers inside an organization are high and invisible, reducing the company’s competitiveness.
  • “Existing users often complain that it takes seconds to minutes more for unintelligent PN vs. using intelligent PN.”
    me: If we talk about a disconnected user without access to information, it could be true if the number of Smart Numbers to comprehend is low.

There were many other arguments for why you should not change. It reminded me of the image below:

Smart Numbers related to the Coordinated approach

Smart Part Numbers are a characteristic of best practices from the past. When people were working in different systems, moving information from one system to another was done manually.

For example, re-entering the Bill of Materials from the PDM system into the ERP system, or attaching drawings to materials/parts in the ERP system. In the latter case, the filename often reflects the material or part number.

The problems with the coordinated, smart numbering approach are:

  • New people in the organization need to learn the meaning of the numbering scheme. This learning process reduces the flexibility of an organization and increases the risk of making errors.
  • Typos go unnoticed when transferring numbers from one system to another and only get noticed late, when the cost of fixing the error might be 10 to 100-fold. (Identifier standards mitigate this risk with a check digit – see the sketch after this list.)
  • The argument that people will understand the meaning of a part is only partly valid. A person can make a good guess about the part based on the smart part number; however, the details can be different unless you work every day with the same small range of parts.
  • Smart Numbers create a legacy. After Mergers and Acquisitions, there will be multiple part number schemes. Do you want to renumber old parts, meaning non-value-added, risky activities? Or do you want to continue with various numbering schemes, meaning people need to learn more than one numbering schema – a higher entry barrier and risk of errors?
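As an aside on the typo bullet above: identifier standards such as GS1 (which returns later in this post for the Digital Product Passport) address transfer errors with a check digit instead of with meaning. A minimal sketch of the standard GS1 mod-10 calculation:

```python
# Standard GS1 mod-10 check digit: from the rightmost digit moving left,
# weights alternate 3, 1; the check digit completes the sum to a multiple of 10.
def gs1_check_digit(digits: str) -> int:
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(digits)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """Validate a full GTIN (payload + check digit)."""
    return gs1_check_digit(gtin[:-1]) == int(gtin[-1])

print(is_valid_gtin("4006381333931"))  # True  - a known valid GTIN-13
print(is_valid_gtin("4003681333931"))  # False - the '63' -> '36' typo is caught
```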

There were and still are many advanced smart numbering systems.

In one of my first PDM implementations in the Netherlands, I learned about the 12NC code system from Philips – introduced at Philips in 1963 and used to identify complete products, documentation, and bare components, up to the finest detail. At this moment, many companies in the Philips family (suppliers or offspring) still use this numbering system, illustrating that it is not only the small & medium enterprises that are reluctant to change.

The costs of working with Smart Part Numbers are often unnoticed as they are considered a given.

 

From Coordinated to Connected

Digital transformation in the PLM domain means moving from coordinated practices toward practices that benefit from connected technology. In many of my blog posts, you can read why organizations need to learn to work in a connected manner – both for their business sustainability and for their ability to deal with sustainability-related regulations in the short term.

GHG reporting, ESG reporting, material compliance, and the DPP are all examples of the outside world pushing companies to work connected. Besides the regulations, if you are in a competitive business, you must be more efficient, innovative and faster than your competitors.

In a connected environment, relations between artifacts (datasets) are maintained in an IT infrastructure without requiring manual data transformations or people to re-process the data. In a connected enterprise, this non-value-added work will be reduced.

How to move away from Smart Numbering systems?

Several comments related to the Smart Numbering discussion mentioned that changing the numbering system is too costly and risky to implement and that no business case exists to support it. This statement only makes sense if you want your business to become obsolete slowly. Modern best practices based on digitization should be introduced as fast as possible, allowing companies to learn and adapt. There is no need for a big bang.

  • Start with mapping and prioritizing the value streams in your company. Where do we see the most significant business benefits related to cost of handling, speed, and quality?

Note: It is not necessary to start with engineering, even though they might be the creators of the data – start, for example, with the xBOM flow, where the xBOM can be a concept BOM, the engineering BOM, the manufacturing BOM, and more. Building this connected data flow is an investment for every department; do not start from the systems.

  • Next point: Do not rename or rework legacy data. These activities do not add value; they can only create problems. Instead, build new process definitions that do not depend on the smartness of the number.

Make sure these objects have, besides the part number, the right properties, the right status, and the right connections. In other words, create a connected digital thread – first internally in your company and next with your ecosystem (OEMs, suppliers, vendors).

  • Next point: Give newly created artifacts a guaranteed unique ID, independent of other artifacts (see the sketch after this list). Each artifact has its own status, properties and context. In this step, it is time to break any 1:1 relation between a physical part and a CAD part or drawing. If a document gets revised, it gets a new version, but the version change should not always lead to a part number change. You can find many discussions on why to decouple parts and documents and the flexibility it provides.
  • Next point: Newly generated IDs are not necessarily generated in a single system. The idea of a single source of truth is outdated. Build your infrastructure upon existing standards if possible. For example, the UID of the Digital Product Passport will be based on the ISO/IEC 15459 standard, similar to the UID for retail products managed by the GS1 standard. Or, probably closer to home, look into your computer’s registry, and you will discover a lot of software components with a unique ID that specific programs or applications can use in a shared manner.
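A minimal sketch of these last two points: a globally unique ID (here a UUID, one possible choice) can be generated by any system without central coordination, while meaning, status and relations live in properties. The property names below are illustrative assumptions, not a standard schema.

```python
# Minimal sketch: a neutral, globally unique artifact ID that any system can
# generate independently (no central number generator, no encoded meaning).
# The property names are illustrative assumptions, not a standard schema.
import uuid

def new_artifact(**properties) -> dict:
    return {
        "id": str(uuid.uuid4()),   # collision-safe without coordination
        "status": "in_work",
        **properties,              # meaning lives in properties, not in the ID
    }

part = new_artifact(name="Bracket", material="Steel", classification="Fastening")
drawing = new_artifact(name="Bracket drawing", describes=part["id"])  # a relation, not a 1:1 lock
print(part["id"], drawing["id"])
```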

When will it happen?

In January 2016, I wrote about “the impact of non-intelligent part numbers,” and surprisingly, almost 8 years later, we are still in the same situation.

I just read Oleg Shilovitsky’s post The Data Dilemma: Why Engineers and Manufacturing Companies Struggle to Find Time for Data Management, where he mentions Legacy Systems and Processes, Overwhelming Workloads, Lack of (Data) Expertise, Short-Term Focus and Resource Constraints as inhibitors.

You probably all know the above cartoon. How can companies get out of this armor of habits? Will they be forced by the competition or by regulations? What do you think?

 

Conclusion

Despite proven business benefits and insights, it remains challenging for companies to move toward modern, data-driven practices where Smart Number generators are no longer needed. When talking one-on-one with individuals, they are convinced a change is necessary, yet they point to the “others”.

I wish you all a prosperous 2024 and the power to involve the “others”.


 

 

 

 

 
