Someone notified me that not everyone subscribed to my blog will necessarily read my posts on LinkedIn. Therefore, in the upcoming weeks I will repost some of my more business-oriented LinkedIn posts here too. This post is from July 3rd and is an introduction to the methodology posts I am currently publishing.
The importance of a (PLM) data model
What makes it so hard to implement PLM in a correct manner, and why is this so often a mission impossible? I have been asking myself this question again and again for the past ten years. A lot of it certainly has to do with the culture and legacy every organization has. But imagine a company could start from scratch with PLM: how would it implement PLM nowadays?
My conclusion for both situations is that it all comes down to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, reflecting the behavior of the information objects and the way they mature through their lifecycle. If you make compromises here, it affects your implementation: the way processes are supported out-of-the-box by a PLM system, and the way information can be shared with other enterprise systems, in particular ERP. PLM is written in parentheses because I believe that in the future we will no longer talk about PLM or ERP separately – we will talk business.
Let me illustrate this academic statement.
A mid-market example
When I worked with SmarTeam in the nineties, the system was designed more as a PDM system than a PLM system. The principal objects were Projects, Documents, and Items. Documents were sub-grouped into Office documents and CAD documents, and the system had a single, very basic lifecycle designed for documents. Thanks to the flexibility of the system you could quickly implement a satisfactory environment for the engineering department. Problems (and customizations) came when you wanted to connect the data to other departments in the company.
The sales and marketing department defines and sells products. Products were not part of the initial data model, so people misused the Project object for that purpose. To connect to manufacturing, a BOM (Bill of Material) was needed. As the connected 3D CAD system generated a structure while saving the assemblies, people started to consider this structure the EBOM. This might work if your projects are mechanical only.
However, a Document is not the same as a Part. A Document behaves completely differently from a Part: Documents go through continuous iterations, with a check-in/check-out mechanism, whereas the Part definition remains unchanged and meanwhile reaches a higher maturity.
The correct approach is to have an EBOM Part structure, where Parts connect to Documents. And yes, Documents can also have a structure, but it is not a BOM. SmarTeam implemented this around 2004. By then, a lot of companies had already implemented their own customized EBOM solution that did not match this approach. This created a first level of legacy.
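To make this distinction concrete, here is a minimal sketch in Python. It is purely illustrative and not how SmarTeam (or any PLM system) implements it; all class and attribute names are my own. A Document iterates through revisions via check-in/check-out, while a Part keeps its definition and only advances in maturity; the EBOM is then a structure of Parts, each linking to its defining Documents.

```python
class Document:
    """A document iterates: each check-in creates a new revision."""
    def __init__(self, number, title):
        self.number = number
        self.title = title
        self.revision = 0
        self.checked_out = False

    def check_out(self):
        self.checked_out = True

    def check_in(self):
        self.revision += 1          # continuous iterations
        self.checked_out = False


class Part:
    """A part's definition stays stable; it only gains maturity."""
    MATURITY = ["In Work", "Released", "Obsolete"]   # example lifecycle

    def __init__(self, number, description):
        self.number = number
        self.description = description
        self.maturity = "In Work"
        self.documents = []         # defining documents (CAD, specs, ...)
        self.children = []          # EBOM structure: (child part, quantity)

    def attach(self, document):
        self.documents.append(document)

    def add_child(self, part, quantity=1):
        self.children.append((part, quantity))

    def promote(self):
        i = self.MATURITY.index(self.maturity)
        self.maturity = self.MATURITY[min(i + 1, len(self.MATURITY) - 1)]


# The EBOM is a Part structure; documents hang off the parts.
pump = Part("P-100", "Pump assembly")
housing = Part("P-110", "Housing")
pump.add_child(housing, quantity=1)
housing.attach(Document("D-110-CAD", "Housing 3D model"))
```

The point of the sketch is simply that revising a drawing (a new Document revision) does not have to touch the Part or the EBOM structure at all, which is exactly the behavior a document-only data model cannot express.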
When SmarTeam implemented Part behavior, it became possible to create a multidisciplinary EBOM, and the next logical step was, of course, to connect the data to the ERP system. At that time, most implementations pushed the EBOM to the ERP system and let it live its further life there. ERP was the enterprise tool, SmarTeam the engineering tool. The information became disconnected in an IT manner. Applying changes and defining the manufacturing BOM was done manually in the ERP system and could only be done by (experienced) people who do not make mistakes.
The next challenge comes when you want to automate the connection to ERP. In that case, it becomes apparent that the EBOM and MBOM should reside in the same system (see the old and still relevant post with comments here: Where is the MBOM): one system to manage changes and to implement those changes quickly without too much human intervention. And as the EBOM is usually created in the PLM system, the (commercial/emotional) PLM-ERP battle started. “Who owns the part definition?” and “Who owns the MBOM definition?” became the topic of many PLM implementations. The real questions should be: “Who is responsible for which attributes of the Part?” and “Who is responsible for which part of the MBOM definition?” as data should be shared, not owned.
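As a simple illustration of “shared, not owned,” responsibility can be modeled per attribute instead of per object. The sketch below is hypothetical; the particular split of attributes between PLM and ERP is only an example, not a recommendation for any specific system.

```python
# Hypothetical attribute-level responsibility for a shared Part record:
# neither PLM nor ERP "owns" the part, each is responsible for a subset.
PART_ATTRIBUTE_RESPONSIBILITY = {
    "number":        "PLM",
    "description":   "PLM",
    "material":      "PLM",
    "standard_cost": "ERP",
    "supplier":      "ERP",
    "lead_time":     "ERP",
}

def may_update(system, attribute):
    """Only the responsible system may change the attribute; all may read."""
    return PART_ATTRIBUTE_RESPONSIBILITY.get(attribute) == system

assert may_update("ERP", "standard_cost")
assert not may_update("ERP", "description")
```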
The SmarTeam evolution shows how a changing scope and an incomplete or incorrect data model lead to costly rework when aligning with the mainstream. And this is happening with many implementations and other PLM systems, in particular when the path is to grow from PDM to PLM. An important question remains: what is going to be mainstream in the future? More on that in my conclusion.
A complex enterprise example
In recent years, I have been involved in several PLM discussions with large enterprises. These enterprises suffer from their legacy. Often the original data management was not defined in an object-oriented manner, and the implementation has kept expanding with connected and disconnected systems, like a big spaghetti bowl.
The main message most of the time is:
“Don’t touch the systems, as they work for us.”
The underlying message is:
“We would love to change to a modern approach, but we understand it will be a painful exercise, and how will it impact the profitability and execution of our company?”
The challenge these companies have is that it is extremely hard to imagine the potential to-be situation and how it is affected by the legacy. In a project I participated in several years ago, the company was migrating from a mainframe database towards a standard object-oriented (PLM) data model. The biggest pain was in mapping the data to the object-oriented data model. As the original mainframe database had all kinds of tables with flags and mixed Part & Document data, it was almost impossible to make a 100 % conversion. The other challenge was that knowledge of the old system had evaporated. The end result was a customized PLM data model, closer to the current reality, but still containing legacy “tricks” to assure compatibility.
All these enterprises have to go through such a painful exercise at some point. When is the best moment? When business is booming, nobody wants to slow down. When business is in a lower gear, costs and investments are minimized to keep the old engine running efficiently. I believe the latter would be the best moment to invest in making the transition, if you believe your business will still exist 10 years from now.
Back to the data model.
Businesses should today have a high-level object-oriented data model, describing the main information objects and their behavior in the organization. The term Master Data Management is related to this. How many companies have the time and skills to implement a future-oriented data model? And the data model must stay flexible for the future.
Compare it to your brain, which also stores information by its behavior; through learning, the brain understands what is logically related. The internal data model gets enriched while we learn.
Once you have a business data model, you are able to implement processes on top of it. Processes can change over time; therefore, avoid hard-coding specific processes into your enterprise systems. Like the brain, we can change our behavior (apply new processes) while it remains based on the data model stored inside the brain.
Conclusion:
A lot of enterprise PLM implementations are in a challenging situation due to legacy or to an incomplete understanding and availability of an enterprise data model. Therefore, cross-department implementations and connections to other systems are treated as a battle between systems and their proprietary capabilities.
The future will be based on business platforms, and realizing this will take years – imagine openness and the usage of data standards. An interesting conference to attend in the near future for this purpose is the PDT2015 conference in Stockholm.
Meanwhile, I also learned that a one-day Master Data Management workshop will be held before the PDT2015 conference starts, on the 12th of October. A good opportunity for a three-day deep-dive!