Someone notified me that not everyone subscribed to my blog will necessarily read my posts on LinkedIn. Therefore, in the upcoming weeks I will repost some of my more business-oriented posts from LinkedIn here too. This post is from July 3rd and serves as an introduction to all the methodology posts I am currently publishing.
The importance of a (PLM) data model
What makes it so hard to implement PLM in a correct manner, and why is this often a mission impossible? I have been asking myself this question again and again for the past ten years. For sure, a lot has to do with the culture and legacy every organization has. Imagine a company could start from scratch with PLM. How would they implement PLM nowadays?
My conclusion for both situations is that it all leads back to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you make compromises here, it affects your implementation, the way processes are supported out-of-the-box by a PLM system, and how information can be shared with other enterprise systems, in particular ERP. PLM is written between parentheses as I believe that in the future we will no longer talk about PLM or ERP separately – we will talk business.
Let me illustrate this academic statement.
A mid-market example
When I worked with SmarTeam in the nineties, the system was designed more as a PDM system than a PLM system. The principal objects were Projects, Documents, and Items. The Documents had a sub-grouping in Office documents and CAD documents. And the system had a single lifecycle which was very basic and designed for documents. Thanks to the flexibility of the system you could quickly implement a satisfactory environment for the engineering department. Problems (and customizations) came when you wanted to connect the data to the other departments in the company.
The sales and marketing department defines and sells products. Products were not part of the initial data model, so people misused the Project object for that. To connect to manufacturing, a BOM (Bill of Materials) was needed. As the connected 3D CAD system generated a structure while saving the assemblies, people started to consider this structure as the EBOM. This might work if your projects are mechanical only.
However, a Document is not the same as a Part. A Document has a completely different behavior from a Part. Documents have continuous iterations, with a check-in/check-out mechanism, whereas the Part definition remains unchanged and meanwhile reaches a higher maturity.
The correct approach is to have an EBOM Part structure, where Parts connect to the Documents. And yes, Documents can also have a structure, but it is not a BOM. SmarTeam implemented this around 2004. Meanwhile, a lot of companies had implemented their own custom EBOM solution through customization, not matching this approach. This created a first level of legacy.
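To make the difference in behavior concrete, here is a minimal sketch (not taken from any actual SmarTeam implementation – the class names, attributes and lifecycle states are purely illustrative) of an object-oriented data model in which Documents iterate through check-in/check-out while Parts mature through lifecycle states, and the EBOM is a structure of Parts with Documents attached to them rather than a CAD document structure:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative lifecycle states - every PLM system defines its own.
DOCUMENT_STATES = ["In Work", "Checked In", "Released"]
PART_STATES = ["Preliminary", "Released", "Obsolete"]

@dataclass
class Document:
    """A Document iterates: check-out/check-in creates new iterations."""
    number: str
    iteration: int = 1
    state: str = "In Work"

    def check_in(self):
        self.iteration += 1
        self.state = "Checked In"

@dataclass
class Part:
    """A Part does not iterate; its definition matures through states,
    and it references the Documents that define it."""
    number: str
    state: str = "Preliminary"
    documents: List[Document] = field(default_factory=list)
    children: List["Part"] = field(default_factory=list)  # the EBOM structure

# The EBOM is a structure of Parts, not of CAD documents:
assembly = Part("P-1000", documents=[Document("D-1000-ASM")])
assembly.children.append(Part("P-1001", documents=[Document("D-1001-DRW")]))
```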
When SmarTeam implemented Part behavior, it became possible to create a multidisciplinary EBOM, and the next logical step was, of course, to connect the data to the ERP system. At that time, most implementations pushed the EBOM to the ERP system and let it live its further life there. ERP was the enterprise tool, SmarTeam the engineering tool. The information became disconnected from an IT point of view. Applying changes and defining a manufacturing BOM was done manually in the ERP system and could be done by (experienced) people who do not make mistakes.
The next challenge comes when you want to automate the connection to ERP. In that case, it becomes apparent that the EBOM and MBOM should reside in the same system (see an old and still current post with comments here: Where is the MBOM) – in one system to manage changes and to be able to implement these changes quickly without too much human intervention. And as the EBOM is usually created in the PLM system, the (commercial/emotional) PLM-ERP battle started. “Who owns the part definition?” and “Who owns the MBOM definition?” became the topic of many PLM implementations. The real questions should be: “Who is responsible for which attributes of the Part?” and “Who is responsible for which part of the MBOM definition?” as data should be shared, not owned.
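The shift from “who owns the Part” to “who is responsible for which attributes” can be illustrated with a simple responsibility mapping. This is only a hypothetical sketch – the attribute names and the PLM/ERP split below are examples, not a prescribed division:

```python
# Hypothetical mapping: for each Part attribute, which system is
# responsible for maintaining the value. The data itself is shared.
PART_ATTRIBUTE_RESPONSIBILITY = {
    "part_number":      "PLM",   # created and released in the PLM system
    "description":      "PLM",
    "material":         "PLM",
    "make_or_buy":      "ERP",   # a sourcing / manufacturing decision
    "standard_cost":    "ERP",
    "preferred_vendor": "ERP",
}

def responsible_system(attribute: str) -> str:
    """Return the system responsible for a given Part attribute."""
    return PART_ATTRIBUTE_RESPONSIBILITY.get(attribute, "undefined")

print(responsible_system("material"))       # -> PLM
print(responsible_system("standard_cost"))  # -> ERP
```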
The SmarTeam evolution shows how a changing scope and an incomplete/incorrect data model lead to costly rework when aligning to the mainstream. And this is happening with many implementations of other PLM systems too, in particular when the path is to grow from PDM to PLM. An important question remains: what is going to be mainstream in the future? More on that in my conclusion.
A complex enterprise example
In recent years, I have been involved in several PLM discussions with large enterprises. These enterprises suffer from their legacy. Often the original data management was not defined in an object-oriented manner, and the implementation has been expanding with connected and disconnected systems like a big spaghetti bowl.
The main message most of the time is:
“Don’t touch the system as it works for us.”
The underlying message is:
“We would love to change to a modern approach, but we understand it will be a painful exercise, and we do not know how it will impact the profitability and execution of our company.”
The challenge these companies have is that it is extremely hard to imagine the potential to-be situation and how it is affected by the legacy. In a project I participated in several years ago, the company was migrating from a mainframe database towards a standard object-oriented (PLM) data model. The biggest pain was in mapping data towards the object-oriented data model. As the original mainframe database had all kinds of tables with flags and mixed Part & Document data, it was almost impossible to make a 100% conversion. The other challenge was that knowledge of the old system had vaporized. The result in the end was a customized PLM data model, closer to current reality, still containing legacy “tricks” to assure compatibility.
All these enterprises at a particular time have to go through such a painful exercise. When is the best moment? When business is booming, nobody wants to slow down. When business is in a lower gear, costs and investments are minimized to keep the old engine running efficiently. I believe the latter would be the best moment to invest in making the transition, if you believe your business will still exist 10 years from now.
Back to the data model.
Today, businesses should have a high-level object-oriented data model, describing the main information objects and their behavior in the organization. The term Master Data Management is related to this. How many companies have the time and skills to implement a future-oriented data model? And the data model must stay flexible for the future.
Compare it to your brain, which also stores information by its behavior; by learning, the brain understands what is logically related. The internal data model gets enriched while we learn.
Once you have a business data model, you are able to implement processes on top of it. Processes can change over time; therefore, avoid hard-coding specific processes in your enterprise systems. Like the brain, we can change our behavior (applying new processes), but it will still be based on the data model stored inside our brain.
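As an illustration of not hard-coding processes, a release process can be expressed as configuration on top of the data model, so the process can change without touching the business objects themselves. This is a hypothetical sketch, not the design of any particular system; states and approver roles are invented:

```python
# The process is data (configuration), not code: changing the process
# means changing this definition, not the underlying business objects.
release_process = {
    "object_type": "Part",
    "states": ["Preliminary", "In Review", "Released"],
    "transitions": {
        ("Preliminary", "In Review"): {"approvers": ["Engineering"]},
        ("In Review", "Released"):    {"approvers": ["Engineering", "Quality"]},
    },
}

def allowed_next_states(process: dict, current_state: str) -> list:
    """List the states reachable from the current state in the configured process."""
    return [to for (frm, to) in process["transitions"] if frm == current_state]

print(allowed_next_states(release_process, "Preliminary"))  # -> ['In Review']
```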
Conclusion:
A lot of enterprise PLM implementations are in a challenging situation due to legacy or an incomplete understanding and availability of an enterprise data model. Therefore cross-department implementations and connections to other systems are considered a battle between systems and their proprietary capabilities.
The future will be based on business platforms, and realizing this takes years – imagine openness and usage of data standards. An interesting conference to attend in the near future for this purpose is the PDT2015 conference in Stockholm.
Meanwhile I also learned that a one-day Master Data Management workshop will be held before the PDT2015 conference starts on the 12th of October. A good opportunity to deep-dive for three days!
This time I would like to receive some feedback from my readers, as I believe the topic I am discussing here might be similar to a PLM / ERP discussion – a discussion between religions. For the past two years I have preached a more data-centric approach for PLM, instead of file management. Related to this data-centric approach, the concept of a PLM platform (Business Platform – CIMdata / Innovation Platform – Gartner) becomes clear.
What’s the issue?
As I wrote in my earlier post (random PLM future thoughts), I realized that talking about platforms is not that straightforward when meeting companies with their history and terminology. Some claim they are already using a business platform; others have no clue what makes a platform different from their current PLM implementation. Therefore I will summarize the different approaches I have seen in my network and give a non-academic opinion as a base for discussion. Looking forward to your opinion.
The platform approach
My definition of a PLM platform:
- A central repository of data based on a core data model. Information is stored as data in a unique way
- On top of this repository, applications can run, using a subset of the overall data elements, providing dedicated functionality and a user interface to a particular user / role (see the sketch after this list)
- Access to the platform is provided through web-technology. Storage could be on the cloud.
- External applications and data can be connected through an open (standardized?) API, embedded or federated
- The PLM platform can be a collection of services and functionality coming from various vendors / suppliers – the app store concept
- The platform approach is THE DREAM for business, being flexible to combine and edit data in any desired context in dedicated apps / environments
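A minimal sketch of the platform idea, with made-up class and app names: one core repository and data model, and lightweight apps that each use only the subset of data elements they need through a common service layer. This is an assumption-level illustration, not how any vendor's platform is actually built:

```python
# Hypothetical platform core: one central repository, one core data model.
class PlatformCore:
    def __init__(self):
        self._objects = {}                  # the central repository

    def store(self, obj_id: str, data: dict):
        self._objects[obj_id] = data

    def query(self, fields: list) -> list:
        """Apps request only the subset of data elements they need."""
        return [{f: obj.get(f) for f in fields} for obj in self._objects.values()]

# Two "apps" on top of the same core, each with its own view / subset:
def engineering_app(core: PlatformCore):
    return core.query(["number", "revision", "cad_reference"])

def service_app(core: PlatformCore):
    return core.query(["number", "serviceable", "spare_part_price"])

core = PlatformCore()
core.store("P-1000", {"number": "P-1000", "revision": "B",
                      "cad_reference": "D-1000", "serviceable": True,
                      "spare_part_price": 120.0})
print(engineering_app(core))
print(service_app(core))
```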
In the PLM world, Dassault Systèmes with their 3DExperience approach is following this trend, although here you might argue about the ease of adding external apps to this platform – is it open? Aras and Autodesk might also claim they have a PLM platform, where you might question the same and whether the depth of the data model and the solutions provided on top of it are mature enough. Finally, SAP can also be considered a platform, but I would not name it a PLM platform at this moment in time. An important question for me would be: how can we achieve openness of a PLM platform?
Your thoughts?
The PLM backbone approach
My definition of a PLM backbone:
- The core PLM functionality is provided by a single, proprietary PLM system
- Additional functionality that is not part of the core development (acquisitions) is connected to the backbone through proprietary interfaces
- External authoring tools are linked to the backbone through integrations or interfaces which could be developed by third parties
- External systems can interface with the PLM backbone through open interfaces
- The PLM backbone is THE DREAM for engineering, as historically this was the domain where PLM started to be implemented
I would consider Siemens and PTC (see picture) the best examples of a PLM backbone approach with their PLM portfolios. Teamcenter and Windchill are both rich PLM systems, further connected to several systems covering the product lifecycle. I am not expert enough to state that the same conclusion is valid for Oracle’s Agile, where I believe the backbone is bigger than the PLM system. What do you think? Will these PLM vendors also move to a platform approach? And what will be the platform?
The Service Bus approach
My understanding of the Service Bus (I am not an IT-expert):
- The Service Bus has a standardized interface to request data or to post data that needs to be stored in other systems (see the sketch after this list)
- The Service Bus approach reduces the amount of (custom) interfaces between systems by requiring standardized inputs and outputs per system
- To provide a user with information that is not entirely available in a single system, the service bus needs to acquire the data from other systems, which might not give the performance expected by business people
- The Service Bus is the IT DREAM as it simplifies the complexity for IT to manage point-to-point solutions between systems and makes an upgrade strategy easier to support.
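A rough sketch of the standardized request/post idea behind a service bus, as I understand it. The message types, payloads and system roles below are invented for illustration only and do not reflect any specific ESB product:

```python
# Hypothetical service bus: systems register handlers for standardized
# message types instead of building point-to-point interfaces.
class ServiceBus:
    def __init__(self):
        self._providers = {}    # message_type -> handler of the owning system

    def register(self, message_type: str, handler):
        self._providers[message_type] = handler

    def request(self, message_type: str, payload: dict) -> dict:
        """All consumers use the same request format; the bus routes the call."""
        return self._providers[message_type](payload)

# Example: the PLM system answers "get_part" requests, ERP answers "get_cost".
bus = ServiceBus()
bus.register("get_part", lambda p: {"number": p["number"], "revision": "B"})
bus.register("get_cost", lambda p: {"number": p["number"], "standard_cost": 42.0})

# A consumer composes a view from several systems through one interface:
part = bus.request("get_part", {"number": "P-1000"})
cost = bus.request("get_cost", {"number": "P-1000"})
print({**part, **cost})
```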
From a very high-level view, the service bus approach has some similarities to a platform. The service bus concept allows the business to select the systems they like the most (provided they connect to the service bus).
The main difference would be the persistence of information, where is the real data stored? I came across the service bus approach more often in the past, where the target was most of the time to integrate the PDM functionality (PLM as an enterprise solution was never in scope here).
For the Service Bus approach, I am curious to learn its relevance for future PLM implementations as the challenge would be to provide any user in the company with the relevant information in context. Is the service bus going to be replaced by the platform? Who would be the major players here?
The Business Intelligence approach
I discovered this method in project-centric companies (Oil & Gas companies, EPCs, construction companies), but strangely enough also at some manufacturing companies, where I would assume integration of systems would bring large benefits.
- Each type of information is managed only in one single system avoiding interfaces or duplication of data.
- Only where needed, data will be pushed from one system to other systems
- Business Intelligence applications extract information from the relevant systems and present it in context to the user, giving him/her a better understanding (see the sketch after this list)
- Business users will have to work in multiple systems to complete their tasks
- The BI approach is the ULTIMATE IT DREAM as it simplifies their work dramatically and shuts down business demands.
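The BI approach can be sketched as read-only extraction: each system remains the single place where its type of information is managed, and a reporting layer pulls and combines data for the user. Again, the system names and fields below are hypothetical:

```python
# Hypothetical read-only extracts from systems that each remain the
# single place where their type of information is managed.
def extract_from_plm(part_number: str) -> dict:
    return {"number": part_number, "revision": "B", "state": "Released"}

def extract_from_erp(part_number: str) -> dict:
    return {"number": part_number, "stock": 14, "standard_cost": 42.0}

def bi_report(part_number: str) -> dict:
    """Combine extracts into one context view; the user still has to work
    in the source systems for any change."""
    return {**extract_from_plm(part_number), **extract_from_erp(part_number)}

print(bi_report("P-1000"))
```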
I have seen an example where IT dictated that for document management we use product ABC (a well-known Content Management system). Next, for internal documents we use SharePoint. For CAD, we use product PQR as much as possible (heavily adapted) or AutoCAD 2D (to support the minimum). For ERP, the standard system is XYZ (a famous ERP system – you do not lose your job by selecting them), and of course everyone uses Excel as a common interface for information between people.
It was impossible in this company to have a business view on the solution landscape. As you can imagine, this company’s margins are not (yet) under pressure as their industry is very conservative.
What do you think?
Is the future for PLM in platforms? If Yes, what about openness? Who are the candidates to offer such a platform? Or will lack of industry standards and openness block wider adoption? If No, will there be a massive PLM system in the future, connected to other enterprise systems (ERP/CRM)? Or will PLM be implemented as a collection of smaller systems communicating through an enterprise service bus?
I am looking forward to discussing the topic here and soon during the upcoming Product Innovation conference in Düsseldorf.