In the past two years, I have been heavily involved in PLM Proofs of Concept, sitting on both sides of the table: supporting companies in their PLM selection, supporting vendors in explaining their value to the customer, and supporting implementers with industry knowledge, all in the context of a PLM selection process.
The Proof of Concept is crucial in a PLM selection process as it is the moment where the first glimpse of reality comes to the table.
Companies of different sizes and different consultants all have a different view on the importance of the Proof of Concept. Let me share my thoughts after a quick recap of the PLM selection process.
The PLM selection process
1. Build a vision
It is important that a company understands what it wants to achieve in the next five to ten years before starting a PLM selection process. Implementing PLM means a business transformation, even if you are a small company. If management does not understand that a vision is required, there is a real risk ahead, as PLM without a change in the way people work will not deliver the expected results.
2. Issue an RFI to potential candidates
Once you have a PLM vision, it is time to get in touch with potential suppliers. The RFI (Request for Information) phase is where you can educate yourself by challenging the suppliers to work with you on future solutions.
3. Discuss with selected candidates
From the RFI responses you understand which companies are attractive because they match your vision, your budget or your industry. Have a first interaction with the selected companies and let them demo their standard environment targeted at your vision.
In the Proof of Concept stage, you check with the preferred companies their ability to deliver and your ability to work together. The POC phase should give you an understanding of the scope of the upcoming PLM project and help you understand by whom and how the project can be executed. More details about this step below.
Although some companies start with an RFP before the POC, for me it makes most sense to verify the details after you have a proper understanding of the To-Be solution. The RFP is often the base for the contractual scope and therefore should be as accurate as possible.
In the past, I wrote in more detail about the PLM selection process in two posts: PLM selection: Don't do this and PLM selection: Do this. Have a read if you want to understand this part in more depth. Now let's focus on the POC.
- As described before, the target of the Proof of Concept should be to get a better understanding of the potential To-Be processes and obtain an impression of the capabilities of the implementer and the preferred PLM software.
The result should be that you have more realistic expectations of what can be achieved and the challenges your company will face.
- From there, you can evaluate the risks, address them and build an achievable roadmap to implement. It is important that the focus is not just on the cost of the implementation.
- To sell PLM inside your company, you need to realign with the vision and explain, to all people involved, the value of "Why PLM".
Explaining the value is complex, as not everyone needs the same message. Management will focus on business benefits, whereas users will focus on how it impacts their daily life. If you forget to explain the value, the PLM project is again considered just another software purchase.
Make sure the Proof of Concept is driven by validating future business scenarios, focusing on the To-Be solution. The high-level scenarios should be demonstrated and explained to the business people. In this stage, it is important people realize the benefits and the value of the new processes.
The POC is also an internal sales event. The goal should be to get more enthusiastic and supportive business people in your company for the upcoming PLM project. Identify the champions you will need to lean on during the implementation.
Test the implementer. In my opinion, the success of a PLM implementation depends critically on the implementation team, not on the software. Therefore, the POC phase is the best moment to learn if you can work with the implementer. Do they know your business? Do they have experience with your business? The more you are aligned, the higher the chance you will be successful as a team.
Show commitment to engage. I have often seen POC engagements where the company demanded a free Proof of Concept from the implementer or vendor. This creates an unbalanced situation during the Proof of Concept, as the vendor or implementer cannot invest time and resources in the process as expected without any commitment from the company. By paying a certain fee for the POC, a company demonstrates to the implementer/vendor that the POC is valuable, and it can request the same commitment from them.
The Proof of Concept is not a detailed function/feature check to identify each mouse-click or option in the system. During the implementation, these details might come up. It is important in a Proof of Concept to understand the big picture and not to get lost in the details. As human beings we tend to focus on what does not work, not realizing that probably eighty to ninety percent works according to the needs.
Do not expect the ultimate To-Be scenario to be demonstrated during the Proof of Concept. The Proof of Concept is a learning stage for both the company and the implementer to imagine the best possible scenario. PLM systems are generic, and out of the box they are unlikely to provide a configuration and functionality matching your environment. At this stage, validate whether the primary capabilities are there and where the gaps are.
Do not run a POC with a vendor (only). This might be one of the most critical points for a POC. A PLM software vendor's target is to sell their software, and for that reason they often have dedicated presales teams that will show you everything in a smooth manner, overwhelming you with all the beauty of the software. However, after the POC this team is gone, and you will have to align yourself again with the implementation partner, matching your business needs with their understanding.
Realize – you get what you are asking for. This is more a Do-and-Don't message packed together. A Proof of Concept phase is a point where companies get to know each other. If you are not focused, do not expect the implementer/vendor to be committed. A PLM implementation is not a product. It is a business transformation supported by products and services. Do not treat PLM implementers and vendors the same way your customers treat you (in case you deliver products).
There are still many more thoughts about the Proof of Concept. Ideally, you run two POCs in parallel, either with two implementers of the preferred software (if possible) or with two different implementers representing different software.
Ideally, that is; I know it is a challenge, especially for small and medium-sized businesses, where people are running to keep the business going.
Still, remember: PLM is a business transformation, aiming to improve your business over the upcoming five to ten years and to avoid running out of business.
Your thoughts?
As a bonus, a short anecdote that I posted in 2010 and that is still relevant:
Some time ago a Christian PLM sales professional died (let's call him Jack) and, according to his belief, he faced Saint Peter at the gates of Heaven and Hell.
Saint Peter greeted Jack and said: “Jack, with the PLM Sales you have done good and bad things to the world. For that reason, I cannot decide if you should go to Heaven or to Hell. Therefore, I allow you to make the choice yourself”.
Jack replied: "But Saint Peter, how can I make such an important decision for the rest of my eternal life? It is too difficult!"
Saint Peter replied: “No problem Jack, take a look at Heaven and Hell, take your time and then tell me your decision.”
Jack entered Heaven and was surprised by the quietness and green atmosphere there. Angels were singing, people were eating from golden plates with the best food ever, people were reading poetry and everything was as peaceful as you could imagine. In the distance, he could see God surrounded by some prophets talking about the long-term future. After some time, Jack had seen enough and went to Hell to have a look there.
And when he opened the gates of Hell, he was astonished. Everywhere he looked there were people partying, having fun. It reminded him of the sales kick-offs he had attended in the past: exotic places with lots of fun. In the distance, he could see the Devil as DJ playing the latest dance music – or was it DJ Tiësto?
Jack did not hesitate and ran back to Saint Peter, no time to lose. "Saint Peter," he said, "I want to go to Hell, no doubt. And a pity I did not know it before."
“So be it, ” said Saint Peter “go for it.”
And then, once Jack entered Hell, it was suddenly all fire around him, people were screaming in pain and suffering, and Jack too felt the first flames.
"Devil!!" he screamed, "what happened to what I saw before?"
With a sarcastic voice, the devil replied: “That? That was a proof of concept.”
Shaping the PLM platform of the Future
It was the first time I attended this event. I was positively surprised by the audience and content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) looked in more detail at the future of PLM. Themes like PLM platforms, the Circular Economy, Open Standards and longevity of data were presented and discussed here.
The emergence of the PLM platform
1. The product lifecycle will become more and more circular due to changing business models, and in parallel the different usage/availability of materials will have an impact on how we design and deliver products.
Can current processes and tools support today's complexity? And what about tomorrow? According to a CIMdata survey there is a clear difference in profit and performance between leaders and followers, and the gap is increasing faster. "Can you afford to be a follower?" is a question companies should ask themselves.
Rethinking the PLM platform does not bring a 2-3 % efficiency benefit, but it can bring benefits of 20 % and more.
Peter Bilello sees a federated platform as a must for companies to survive. I in particular liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Steven Vetterman from ProSTEP talked about PLM in the automotive industry. Steven started describing the change in the automotive industry, by quoting Heraclitus Τα πάντα ρεί – the only constant is change. Steven described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change of the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050 more than 50 % of the world population (an estimated almost 10 billion people by then) will be living in Asia and 25 % in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical designs. Neither are they open for a true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementation. There is a need for open PLM, not driven from a single PLM system, but based on a federated environment of information.
Yves Baudier spoke on behalf of the aerospace industry about the standardization effort at their Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE systems and more. If you look at the ASD Radar, you might get a feeling for the complexity of standards that exist and are relevant for the Airbus group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. The conclusion from Yves, like that of all the previous speakers, was that: The PLM Platform of the Future will be federative, and standards will enable PLM Interoperability.
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of the current trends in their business and the role of PLM in Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive costs down and above all improve business agility, as the future is in flexibility. Shefali gave an overview of their PLM roadmap covering the aspects of PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space). The PLM backbone provides connectivity of data between all lifecycle stages and external partners (customers/suppliers) based on the PLCS standard. Again, another session demonstrating that the future of PLM is in an open and federated environment.
The future PLM platform is a federated platform which adheres to standards, provides openness of interfaces that permit the platform to be reliable over multiple upgrade cycles, and is able to integrate third parties (Peter Bilello)
In the afternoon I followed the Systems Engineering track. Peter Bilello gave an overview of Model-Based Systems Engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. And indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is confused with the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that Systems Engineering should be a part of PLM, and he gave a very decent, academic overview of how it all is related. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual, system view and a physical product view:
More Industry voices
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, of every 100 pieces of information/exchanges to be managed during the full life cycle, only 5 are managed during the initial design phase (BIM), 20 during the construction phase (BAM) and 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes from the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrated with the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Amir Rashid described the need for PLM in the consumer product markets, stating the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the product, and everything is scrapped to landfill afterward. Interesting quote from Amir: Sustainability's goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of its product development since 1984. The diagram below demonstrates how the circular economy can impact all businesses today when well-orchestrated.
Marc Halpern closed the tracks with his presentation on Product Innovation Platforms, describing how product design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understanding and analyzing Big Data), adaptability (flexibility to integrate and maintain through an open service-oriented architecture), reuse (identifying similarity based on metadata and geometry), discovery (the integration of search, analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner's vision, there is still a lot of work for PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent implementing this vision:
- inadequate openness (pushing back open collaboration)
- incomplete standards (blocking implementation of openness)
- uncertain cloud performance (the future is in cloud services)
- the steep learning curve (it is a big mind shift for companies)
- Cyber-terrorism (where is your data safe?)
After Marc's session there was an interesting panel discussion with some of the speakers from that day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: What about change management?
A topic that could fill the rest of the week but the PDT dinner was waiting – a good place to network and digest the day.
Day 2 started with two interesting topics. The first presentation was a joint presentation by Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones). The topic was the obsolescence of information systems: hardware and PLM applications. In the aerospace industry, some data needs to be available for 75 years. You can imagine that during 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a Proof of Concept done with virtualization of different platforms supporting CATIA V4/V5 using Linux, Windows XP, W7 and W8, which covers just a small part of all the data.
The conclusion from this session was:
To benefit from PLM of the future, the PLM of the past has to be managed. Migration is not the only answer. Look for solutions that exist to mitigate risks and reduce costs of PLM Obsolescence. Usage and compliance to Standards is crucial.
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: Interoperability is a right, not a privilege.
In the systems engineering track, Kent Freeland talked about Nuclear Knowledge Management and CM in Systems Engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management is not equal to storing information in a central place. It is about building and providing data in context so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for Systems Engineering based on ISO 15926-11. His simplified approach, compared to ISO 15288, led to several discussions between supporters and opponents during lunchtime.
Master Data Management
Based on the type of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements I also see more and more that the digital transformation leads to MDM questions. Can we replace Excel files with mastered data in a database?
Almost at the end of the day I spoke about the PLM platform of the future, targeted at the people of the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to own and capture as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning information is not crucial for them, perhaps because the world is moving fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in that. PLM is no longer about cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately, I could not attend all sessions, as there were several parallel sessions, nor have I written about all the sessions I attended. The PDT Europe conference, a conference for people who care about the details of future PLM concepts and the usage of standards, is a must for future strategists.
Business is changing and becoming digital, as you might have noticed. If you haven't noticed it, you might be disconnected from the world or work in a stable silo. A little bit simplified and provocative, otherwise you would not read further.
The change towards digital also has its effect on how PLM is evolving. Initially considered an extension of PDM, managing engineering data, PLM is slowly evolving into an infrastructure supporting the whole product lifecycle.
The benefits of a real PLM infrastructure are extremely high, as it allows people to work smarter, identify issues earlier and change from being reactive to proactive. In some industries, this change in working is the only way to stay in business. Others, with still enough margin, will not act.
Note: I am talking about a PLM infrastructure as I do not believe in a single PLM system anymore. For me, PLM is supported through a collection of services across the whole product lifecycle, potentially many of them in one system or platform.
Changing from an engineering-centric system towards an infrastructure across the departmental silos is the biggest challenge for PLM. PLM vendors and ERP vendors with a PLM offering are trying to provide this infrastructure and mainly fight against Excel, as an Excel file can easily pass the border from one department to the other. No vision is needed for Excel.
A PLM infrastructure, however, requires a vision. A company has to look at its core business processes and decide which information flows through the organization, or even better, through its whole value chain.
Building this vision, understanding this vision and then being able to explain the vision is a challenge for all companies. Sometimes even management says:
"Why do we need to have a vision, just fix the problem"
and people working in departments are not looking forward to changing their daily routines because they need to share information. Here you hear statements like:
"Why do people feel the need to look at the big picture? I just want to get my work done."
So if current businesses do not change by themselves, will there be a change?
Here I see the digital world combined with search-based applications coming up. Search-based applications allow companies to index their silos and external sources and get an understanding of the amount of data that exists, and from these results learn that there is a lot of duplicated data or invalid information in different places.
This awareness might create the understanding that, instead of having hundreds of thousands of Excel files in the organization, it would be better to have the data inside a database, uniquely stored and connected to other relevant information.
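To make this tangible, below is a minimal sketch of what such an indexing exercise could look like, assuming the silos are simply folders of exported Excel files; the folder names, the use of pandas and the hashing approach are illustrative assumptions, not a description of any particular search-based application.

```python
# Hypothetical sketch: index spreadsheet exports from different silos and flag
# rows that occur in more than one place. Paths and columns are made up.
import hashlib
from collections import defaultdict
from pathlib import Path

import pandas as pd  # assumed available for reading the Excel exports


def row_fingerprint(row) -> str:
    """Normalize a row and hash it, so identical content matches across files."""
    normalized = "|".join(str(value).strip().lower() for value in row)
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()


def find_duplicates(silo_folders):
    """Return fingerprints that appear in more than one file, with their locations."""
    seen = defaultdict(set)
    for folder in silo_folders:
        for xlsx in Path(folder).glob("*.xlsx"):
            for _, row in pd.read_excel(xlsx).iterrows():
                seen[row_fingerprint(row)].add(str(xlsx))
    return {fp: files for fp, files in seen.items() if len(files) > 1}


if __name__ == "__main__":
    duplicates = find_duplicates(["./engineering", "./purchasing", "./service"])
    print(f"{len(duplicates)} rows exist in more than one silo")
```

Even such a naive scan usually makes the amount of duplicated Excel content visible, which is exactly the awareness described above.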
Next, if you want to understand this in a more down-to-earth manner, it is important to listen to and talk with your peers from other companies and other industries. This is currently happening all around the world, and I invite you to participate.
Here is a list of events that I am attending, or had planned to attend but that are too far away:
Here I will participate as a panel member in the discussion around the concept of zero files. We want to explain to and discuss with the audience what a data-centric approach means for an organization. Also, customers will share their experiences. This conference is focusing on the ENOVIA community – you can still register here.
Here I will speak about the PLM future (based on data) and what PLM should deliver for future generations. This conference is much broader and addresses all PLM-related topics from a wider perspective.
Relatively new in the Nordics, Infuseit, a PLM consultancy company, is able to attract an audience that wants to work on understanding the PLM future. Instead of listening to presenters, here you are challenged to discuss and contribute to building a common opinion. I will be there too.
Conclusion: It is time to prepare yourself for the change – it is happening, and getting educated is an investment that will be rewarding for your company.
What do you think – is data-centric a dream?
This is, for the moment, the last post about the difference between a file-oriented and a data-oriented approach. This time I will focus on the need for open exchange standards and the relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits and pointed to background information for those who want to learn more detail. In my second post, I gave the example of dealing with specifications.
It demonstrated that the real value of a data-centric approach comes at the moment the information changes over time. For a specification that is right the first time and never changes, there is less value to gain with a data-centric approach. Moreover, aren't we still dreaming that we do everything right the first time?
The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits are valid for diagrams, schematics (2D information) and CAD models (3D information).
The challenge for a data-oriented approach is that information needs to be stored in data elements in a database, independent of an individual file format. For text, this might be easy to comprehend; text elements are relatively simple to understand. Still, the OpenDocument standard for Office documents is based, in the background, on a lot of technical know-how and experience to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.
CAD vendors have various reasons not to store their information in a neutral format.
- First of all, and most important for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, therefore reducing the potential market capture. You could say that, in a certain manner, the Autodesk 2D format DXF (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF data format. So far DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements.
- This brings us to the second reason why using neutral data formats is not that evident for CAD vendors. It reduces their flexibility to change the format and optimize it for maximal performance. Commercially, the significant, immediate disadvantage of working in neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any "intelligent" manipulations on the data are hard to achieve.
The same reasoning can be applied to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to identify a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo does, and again differently compared to NX, SolidWorks, Solid Edge and Inventor, even though some of them might use the same CAD kernel.
However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing the objects. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.
PLM, ERP, systems and single source of truth
This brings us to the world of data management, in my world mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is available as metadata, as are all the scheduling and interactions with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.
PLM systems have gradually become more and more data-centric, as their origin was around engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM and ERP side make it hard to exchange information from one system to the other. It is for that reason that there are many discussions around PLM and ERP integration and the BOM.
In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide, actually an enterprise-wide or industry-wide, data definition of all information that is relevant for the business processes. This leads into Master Data Management, the new required skill for enterprise solution architects.
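As a simple illustration of what such an enterprise-wide definition could look like, here is a sketch of a master item that carries one agreed definition plus the identifiers used by the individual applications; the class and field names are my own assumptions, not any vendor's data model.

```python
# Illustrative sketch of the master-data idea: one enterprise-wide item definition,
# with the PLM and ERP identifiers mapped onto it instead of each system
# redefining the item. All names and identifiers are invented for this example.
from dataclasses import dataclass, field


@dataclass
class MasterItem:
    master_id: str                      # enterprise-wide identifier
    description: str
    uom: str                            # unit of measure agreed company-wide
    system_ids: dict = field(default_factory=dict)  # e.g. {"PLM": "...", "ERP": "..."}

    def id_in(self, system: str) -> str:
        """Translate the master identifier into a specific application's identifier."""
        return self.system_ids[system]


# One shared definition, referenced by both applications
pump = MasterItem(
    master_id="ITM-000123",
    description="Centrifugal pump 5 kW",
    uom="piece",
    system_ids={"PLM": "P-5512", "ERP": "400-778812"},
)

print(pump.id_in("ERP"))   # the ERP item number for the same master item
```

The point is that PLM and ERP keep their own identifiers and views, but both refer to one mastered definition instead of redefining the item.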
The data-centric approach creates the impression that you can achieve a single source of truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion this is more of a black hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without the intention of ever being found again is the big challenge here.
Other PLM and ERP vendors have different approaches. Some choose a service bus architecture, where applications in the background link and synchronize common data elements from each application; there is some redundancy, but everything is connected. More and more PLM vendors focus on building a platform of connected data elements, on top of which applications run, like the 3DExperience platform from Dassault Systèmes.
As users we are more and more used to platforms, as Google and Apple already provide these platforms in the cloud for common use on our smartphones. The large number of apps runs on shared data elements (contacts, locations, …) and stores additional proprietary data.
Platforms, Networks and standards
And here we enter an interesting area of discussion. I think it is a given that a single database concept is a utopia. Therefore, it will be all about how systems and platforms communicate with each other to provide, in the end, the right information to the user. The systems and platforms need to be data-centric, as we learned from the discussion around the document-centric (file-centric) versus data-centric approach.
In this domain, several companies have already been active for years. Datamation from Dr. Kais Al-Timimi in the UK is such a company. Kais is a veteran in the PLM and data modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:
“……. the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.
It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.
This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”
Another company in the same domain is Eurostep, who are also focusing on business collaboration in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233. Eurostep has developed their Share-A-space platform to enable data-centric collaboration.
This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process industry and construction industry are currently also focusing on discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure it will not come from the software vendors, as I discussed before.
If you have reached this line, it means the topic has been of in-depth interest for you. In the past three posts, starting from the future trend, then an example and the data modeling background, I have tried to describe what is happening in a simplified manner.
If you really want to dive into the PLM for the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15. Here experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.
Two weeks ago I attended the Nobletek PLM forum in Belgium, where a group of experts, managers and users discussed topics related to my favorite theme: "Is PLM changing?"
Dick Terleth (ADSE) led a discussion with the title "PLM and Configuration Management as a proper profession" or "How can the little man grow?". The context of the discussion was the question: "How is it possible that the benefits of PLM (and Configuration Management) are not understood at C-level?", or in other words: "Why is the value of Configuration Management and PLM not obvious?"
In my previous post, PLM is doomed unless …., I quoted Ed Lopategui (www.eng-eng.com), who commented that being a PLM champion (or a Configuration Management expert as Dick Terleth would add) is bad for your career. Dick Terleth asked the same question, showing pictures of the self-assured accountant and the Configuration Management or PLM professional. (Thanks Dick for the pictures). Which job would you prefer?
The PLM ROI discussion
A first attempt to understand the difference could be related to the ROI discussion, which seems to apply only to PLM. Apparently, ERP and financial management systems are a must for companies; no ROI discussion there. Persons who can control/report the numbers seem to have the company under control. For the CEO and CFO the value of PLM is often unclear. And to make it worse, PLM vendors and implementers are fighting for their unique definition of PLM, so we cannot blame companies for being confused. This makes it clear that if you haven't invested significant time in understanding PLM, it will be hard to see the big picture. And at C-level, people do not invest significant time in understanding the topic. It is the C-level person's education, background or work experience that makes him/her decide.
So if the C-level is not educated on PLM, somebody has to sell the value to them. Oleg Shilovitsky wrote about it recently in his post Why is it hard to sell PLM ROI, and another respected blogger, Joe Barkai, sees the sun coming up from behind the clouds in his latest post PLM Service Providers Ready To Deliver Greater Value. If you follow the posts of independent PLM bloggers (although who is 100 % independent?), you will see a common understanding that implementing PLM currently requires a business transformation, as old processes were not designed for a modern infrastructure and digital capabilities.
PLM is about (changing) business processes
Back to the Nobletek PLM forum. Douglas Noordhoorn, the moderator of the forum, challenged the audience, stating that PLM has always been there (or not there – if you haven't discovered it). It is all about managing the product development processes in a secure way. Not talking about "Best Practices" but "Good Practices." Those who had a proper education in the aerospace industry learned that good processes are crucial to deliver planes that can fly and are reliable.
Of course, the aerospace industry is not the same as other industries. However, more and more other industries in my network, like nuclear new build, the construction industry or other Engineering, Procurement and Construction companies, want to learn from aerospace and automotive good practices. They realize they are losing market share because the cost of failure, combined with relatively high labor costs, makes them too expensive. But where do they get their proper good-practices education?
The PLM professional?
And this was an interesting point coming up from the Nobletek forum. There is no proper, product-agnostic education for PLM (anymore). If you study logistics, you will learn a lot about various processes and how they can be optimized for a certain scenario. When you study engineering, there is a lot of focus on engineering disciplines and methods. But there is no time to educate engineers in-depth to understand the whole product development process and how to control it. Sometimes I give a guest lecture to engineering classes; it is never an important part of the education.
To become a PLM professional
For those who never had any education in standard engineering processes, there is Frank Watts' engineering control book, which would probably be a good base. But it is not only the PLM professional who should be aware of the good practices. All companies manufacturing products, plants or buildings should learn these basics. As a side step, it would also make a discussion around BIM clearer. At this time, manufacturing companies keep discovering their good practices the hard way.
And when this education exists, companies will be aware that it is not only about the tools; it is about the way information flows through the organization. There is even a chance that somewhere at C-level someone has been educated and understands the value. For ERP everyone agrees. For PLM, it remains a labyrinth of processes currently designed by companies learning on the job, with vendors and implementers pushing what they have learned. Engineering is often considered a hard-to-manage discipline. As a SAP country manager once said to me: "Engineers are actually resources that do not want to be managed, but we will get them ….."
And then the future ……
I support the demand for better education in engineering processes, especially for industries outside aerospace or automotive. I doubt it will have a significant impact, although it might create the visibility and understanding for PLM at C-level. No need anymore for the lone ranger who fights for PLM. Companies will have better-educated people who understand the need for the good practices that exist. These good practices will be the base for companies when discussing with PLM vendors and implementers. Instead of vendors and implementers pushing their vision, you can articulate and follow your own vision.
However, we need a new standard book too. We are currently in the middle of a big change. Thanks to modern technology and connectivity the world is changing. I wrote and spoke about it in: Did you notice PLM is changing?
This awareness needs to become visible at C-level.
Who will educate them?
Now back to soccer – four years ago Spain–The Netherlands was the last match, the final. Now it is their first match – will the Dutch change the game?
The past month I had several discussions related to the complexity of PLM. Why is PLM perceived as complex? Why is it hard to sell PLM internally in an organization? Or to phrase it differently: "What makes PLM so difficult for normal human beings, as conceptually it is not so complex?"
So what makes it complex? What's behind PLM?
The main concept behind PLM is that people share data. It can be around a project, a product or a plant, through the whole lifecycle. In particular during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature. You could decide to wait until everything is mature before sharing it with others (the classical sequential manner); however, the chance of doing it right the first time is low. Several iterations between disciplines will be required before the data is approved. The more a company works sequentially, the higher the cost of changes and the longer the time to market. Due to the rigidity of this sequential approach, it becomes difficult to respond rapidly to customer or market demands. Therefore, in theory (and it is not a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on data that is not yet approved.
PLM goes further; it is about sharing data across the lifecycle, and as it originally started in the early phases of the lifecycle, the concept of PLM was often considered something related to engineering. And to be fair, most of the PLM (CAD-related) vendors have a high focus on the early stages of the lifecycle and strengthen this idea. However, sharing can go much further, e.g. early involvement of suppliers (still engineering) or support for after-sales/services (the new acronym SLM). In my recent blog posts I discussed the concepts of SLM and the required data model for that.
The complexity lies in the word "sharing". What does sharing mean for an organization where historically every person was rewarded for the knowledge he/she owned, instead of being rewarded for the knowledge he/she made available and shared? Many so-called PLM implementations have failed to reach the sharing target, as the implementation focus was on storing data per discipline and not necessarily on storing data so that it becomes shareable and usable by others. This is a huge difference.
Some famous (ERP) vendors claim that if you store everything in their system, you have a "single version of the truth". Sounds attractive. My garbage bin at home is also a single place where everything ends up, but a garbage bin has not been designed for sharing, as another person has neither the clue nor the time to analyze what's inside. Even data in the same system can be hidden from others if the way to find the data has not been anticipated.
Data sharing instead of document deliverables
The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose, but that, with some extra effort, makes the data usable and searchable for others. A typical example is drawings and document management, where the whole process for a person is focused on delivering a specific document. Fine for that purpose, but this document on its own becomes a legacy for the long term, as you need to know (or remember) what is inside the document.
A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, therefore providing specific views for different roles in the organization. Sharing becomes possible, and it can be online. Nobody needs to consolidate and extract data from documents (Excel files?) anymore.
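A minimal sketch of this idea, with invented element types and attribute names: the same shared data elements are filtered into different views for different roles, so nobody has to re-collect the data into a new document.

```python
# Illustrative sketch: shared data elements queried into role-specific views
# instead of each role maintaining its own document. All names are invented.
from dataclasses import dataclass


@dataclass
class DataElement:
    element_id: str
    element_type: str        # e.g. "requirement", "3d_model", "physical_property"
    owner: str
    attributes: dict


elements = [
    DataElement("REQ-001", "requirement", "systems", {"text": "Max noise 60 dB"}),
    DataElement("PRT-100", "physical_property", "engineering", {"mass_kg": 12.4}),
    DataElement("LOG-100", "logistical_property", "logistics", {"lead_time_days": 30}),
]


def view_for(role: str, data: list) -> list:
    """Build a role-specific view on the shared elements; nobody re-enters the data."""
    wanted = {
        "engineer": {"requirement", "physical_property", "3d_model"},
        "planner": {"logistical_property", "physical_property"},
    }[role]
    return [e for e in data if e.element_type in wanted]


print([e.element_id for e in view_for("planner", elements)])  # ['PRT-100', 'LOG-100']
```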
This does not fit older generations and departmentally managed business units that are rewarded only on their individual efficiency. Have a look at this LinkedIn discussion where the two extremes are visible.
“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”
And at the other side Kais, stating:
“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”
Readers of my blog will understand that I am very much aligned with Kais, and that PLM guys have a hard time convincing Joe of the benefits of PLM (I did not try).
Making the change happen
Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas about what would be possible. However, the outside world is hard to convince, and here it is about change management.
I read an interesting article in IndustryWeek from John Dyer with the title: What Motivates Blockers to Resist Change?
John describes the various types of blockers and when reading the article combined with my PLM twisted brain, I understood again that this is one of the reasons PLM is perceived as complex – you need to change and there are blockers:
Blocker (noun) – Someone who purposefully opposes any change (improvement) to a process for personal reasons
“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.
The problem with blockers
The combination of business change and the existence of blockers is one of the biggest risks for companies going through a business transformation. By the way, this is not only related to PLM; it is related to any required change in business.
A company I have been working with was eager to study their path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers of change. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.
A second example is Nokia. Nokia was famous for the way they were able to transform their business in the past. How come they did not see the smartphone and touch screens coming? Apparently, based on several articles published recently, it was Nokia's internal culture and the superior feeling that they were dominating the market that made it impossible to switch. The technology was known, the concepts were there, however the (middle) management was full of blockers.
Two examples where blockers had a huge impact on the company.
Staying in business and remaining competitive is crucial for companies. In particular, the changes that are currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.
In my previous post, I wrote about the different ways you could look at Service Lifecycle Management (SLM), which, I believe, should be part of the full PLM vision. The fact that this does not happen is probably because companies buy applications to solve issues instead of implementing a consistent company-wide vision (when and where to start is the challenge). Oleg Shilovitsky just referred one more time to this phenomenon – Why PLM is stuck in PDM.
I believe PLM is the enterprise information backbone for product information. Here I will discuss the logical flow of data that might be required in a PLM data model to support SLM. Of course, all should be interpreted in the context of the kind of business your company is in.
This post is probably not the easiest to digest as it assumes you are somehow aware and familiar with the issues relevant for the ETO (Engineering To Order) /EPC (Engineering Procurement Construction) /BTO (Build To Order) business
A collection of systems or a single device
The first significant differentiation I want to make is between managing an installation and managing a single device; I will focus only on installations.
An installation can be a collection of systems, subsystems, equipment and/or components, typically implemented by companies that deliver end-to-end solutions to their customers. A system can be an oil rig, a processing production line (food, packages, …), a plant (processing chemicals, nuclear materials), where maintenance and service can be performed on individual components providing full traceability.
Most of the time a customer-specific solution is delivered to a customer, either directly or through installation/construction partners. This is the domain I will focus on.
I will not focus on the other option: a single device (or system) with a unique serial number that needs to be maintained and serviced as a single entity, for example a car or a computer device – usually a product for mass consumption, not to be traced individually.
In order to support SLM at the end of the PLM lifecycle, we will see that a particular data model is required, one which has dependencies on the early design phases.
Let´s go through the lifecycle stages and identify the different data types.
The concept / sales phase
In the concept/sales phase the company needs to have a template structure to collect and process all the information shared and managed during their customer interaction.
In the implementations that I guided, this was often a kind of folder structure grouping information into a system view (what do we need), a delivery view (how and when can we deliver), a services view (who does what) and a contractual view (cost, budget, time constraints). Initially, most of these folders had relations to documents. However, the system view was often already based on typical system objects representing the major systems, subsystems and components with metadata.
In the diagram, the colors represent various data types available as standard in a rich PLM data model. Although it can be simplified by going back to the old folder/document approach shared on a server, you will recognize the functional grouping of the information and its related documents, which can be further detailed into individual requirements if needed and affordable. In addition, a first conceptual system structure can already exist, with links to potential solutions (generic EBOMs) that have been developed before. A PLM system provides the ideal infrastructure to store and manage all data in the context of each other. A simplified sketch of such a structure follows below.
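The sketch below illustrates, in a simplified way, what such a concept/sales template structure could look like: a container per opportunity with the four views, where the system view already holds conceptual system objects that can link to previously developed generic solutions (generic EBOMs). All class and field names are illustrative assumptions, not a specific PLM system's data model.

```python
# Hypothetical concept/sales template structure: four views per opportunity, with
# conceptual system elements that may reference existing generic EBOMs.
from dataclasses import dataclass, field


@dataclass
class ConceptSystemElement:
    name: str
    metadata: dict = field(default_factory=dict)      # e.g. required capacity, power
    generic_ebom_ref: str = ""                        # link to an existing generic solution
    children: list = field(default_factory=list)      # nested subsystems/components


@dataclass
class OpportunityStructure:
    customer: str
    system_view: list = field(default_factory=list)       # conceptual system objects
    delivery_view: list = field(default_factory=list)     # documents: schedules, logistics
    services_view: list = field(default_factory=list)     # documents: who does what
    contractual_view: list = field(default_factory=list)  # documents: cost, budget, terms


offer = OpportunityStructure(customer="ACME Foods")
offer.system_view.append(
    ConceptSystemElement("Filling line", {"capacity_per_h": 12000}, generic_ebom_ref="EBOM-FL-STD")
)
```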
The Design phase
Before the design phase starts, there is an agreement on the solution to be delivered. In that situation, an as-sold system structure will be leading for the project delivery, and later this evolved structure will be the reference structure for the as-maintained and as-serviced environments.
A typical environment at this stage will support a work breakdown structure (WBS), a system breakdown structure (SBS) and a product breakdown structure (PBS). In cases where the location of the systems and subsystems is relevant for the solution, a geographical breakdown structure (GBS) can be used. This last method is often used in shipbuilding (sections/compartments) and plant design (areas/buildings/levels) and is relevant for any company that needs to combine systems and equipment in shared locations.
The benefit of having the system breakdown structure is that it manages the relations between all systems and subsystems. When a subsystem is delivered by a supplier, this environment supports the relationship to the supplier and the tracking of that delivery in relation to the full system/project.
Note: the system breakdown structure typically uses a hierarchical tag numbering system as the primary id for system elements. In a PLM environment, the system breakdown elements should be data objects, providing the metadata describing the performance of the element, including the mandatory attributes that are required for exchange with MRO (Maintenance Repair Overhaul) systems.
Working with a system breakdown structure is common for plant design or an asset maintenance project, and this approach will be very beneficial for companies delivering process lines, infrastructure projects and other solutions that need to be delivered as a collection of systems and equipment.
The delivery phase
During the delivery phase, the system breakdown structure supports the delivery of each component in detail. In the example below you can see the relation between the tag number, the generic part number and the serial number of a component.
The example below demonstrates the situation where two motors (same item – same datasheet) are installed at two positions in a subsystem, each with a different tag number, a unique serial number and unique test certificates per motor.
The benefit of a system breakdown structure here is that it supports the delivery of unique information per component that needs to be delivered and verified on-site. Each system element becomes traceable.
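To make these relations explicit, here is a small sketch of the delivery-phase data model as described: one generic item (the datasheet) referenced from two positions in the breakdown structure, each with its own tag number, serial number and test certificate. Identifiers and class names are illustrative only.

```python
# Hypothetical delivery-phase data model: tag number (position), generic item
# (datasheet) and serial number (physical unit) kept as separate, related objects.
from dataclasses import dataclass, field


@dataclass
class Item:                      # generic definition, e.g. the motor datasheet
    item_number: str
    description: str


@dataclass
class PhysicalUnit:              # the individual piece of hardware delivered on site
    serial_number: str
    test_certificate: str


@dataclass
class BreakdownElement:          # position in the system breakdown structure
    tag_number: str
    item: Item
    installed_unit: PhysicalUnit = None
    children: list = field(default_factory=list)


motor = Item("ITM-MTR-055", "Electric motor 5.5 kW")
subsystem = BreakdownElement("S1.2", Item("ITM-SUB-012", "Pumping subsystem"), children=[
    BreakdownElement("S1.2-M1", motor, PhysicalUnit("SN-78001", "CERT-78001.pdf")),
    BreakdownElement("S1.2-M2", motor, PhysicalUnit("SN-78002", "CERT-78002.pdf")),
])
```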
The maintenance phase
For the maintenance phase, the system breakdown structure (or a geographical breakdown structure) could be the placeholder to follow the evolution of an installation at a customer site.
Imagine that, in the previous example, the motor with tag number S1.2-M2 appears to be under-dimensioned and needs to be replaced by a more powerful one. The situation after implementing this change would look like the following picture:
Through the relationships with the BOM items (not all are shown in the diagram), there is the possibility to perform a where-used query and identify other customers with a similar motor at that system position. Perhaps a case for preventive maintenance?
Note: the diagram also demonstrates that the system breakdown structure elements should have their own lifecycle in order to support changes through time (and provide traceability).
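Continuing the BreakdownElement sketch above, the where-used question could look like the following: given the item installed at the replaced position, find every customer installation and tag number where the same item is in use. The customer names and the helper function are assumptions for illustration.

```python
# Hypothetical where-used query over the system breakdown structures of several
# customer installations, reusing the BreakdownElement/Item classes sketched above.
def where_used(item_number: str, installations: dict) -> list:
    """Return (customer, tag_number) pairs for every element referencing the item."""
    hits = []

    def walk(customer, element):
        if element.item.item_number == item_number:
            hits.append((customer, element.tag_number))
        for child in element.children:
            walk(customer, child)

    for customer, root in installations.items():
        walk(customer, root)
    return hits


# e.g. where_used("ITM-MTR-055", {"ACME Foods": subsystem, "Beta Dairy": other_root})
# would list every position where this motor type is installed - candidates for
# preventive maintenance when one of them turns out to be under-dimensioned.
```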
From my experience, this is a significant differentiator that PLM systems can bring in relation to an MRO system. MRO and ERP (Enterprise Resource Planning) systems are designed to work with the latest and actual data only. Bringing in versioning of assets and traceability towards the initial design intent is almost impossible to achieve for these systems (unless you invest in a heavily customized system).
In this post and my previous post, I tried to explain the value of having at least a system breakdown structure as part of the overall PLM data model. This structure supports the early concept phase and connects data from the delivery phase to the maintenance phase.
Where my mission in the past 8 years was teaching non-classical PLM industries the benefits of PLM technology and best practices, in this situation you might say it is the classical BTO companies that can learn from best practices from the process and oil & gas industries.
Note: Oleg just published a new blog post, PLM Best Practices and Henry Ford Mass Production System, where he claims that PLM vendors, service partners and consultants like to sell Best Practices and still, during implementation, discover that mass customization is needed to become customer-specific; therefore, the age of Best Practices is over.
I agree with that conclusion, as I do not believe in an Out-Of-The-Box approach to lead a business change.
Still, Best Practices are needed to explain to a company what could be done and, in that context, avoid starting from a blank sheet.
Therefore I have been sharing this Best Practice (for free).
Some weeks ago a vivid discussion started in a PLM LinkedIn group around the need for SLM (Service Lifecycle Management) besides PLM. Of course, the discussion was already simmering in the background in other LinkedIn groups and forums, triggered by PTC's announcement to focus on SLM and their "observation" that they were probably the only PLM vendor to observe that need. The Internet of Things is in one pen stroke connected with SLM. (Someone still using a pen?)
Of course it is not that simple, and I will try to bring some logic into the thought process, the potential hype and the various approaches you could take related to SLM.
First SLM as a TLA (Three Letter Acronym). If you would Google what is the meaning of SLM the most common meaning is Hello, often said on IRC, this is short for “salaam”, or hello.
In the context of PLM it is a relative new acronym and the discussion on LinkedIn was also about the fact if we needed a new TLA. In general. What we try to achieve with SLM is: the ability to trace and follow existing products at customers and to provide advanced or integrated services to them. In a basic matter this could be providing documentation and service information (spare parts information). In an advanced manner, this could be thinking about the Internet of Things, be products that connect to the home base and provide information for preventive maintenance, performance monitoring and enhancements, etc.
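To illustrate the advanced scenario, here is a minimal, hypothetical sketch (the serial number, thresholds and field names are invented for illustration, not taken from any vendor's SLM offering) of how telemetry coming back from a connected product could be screened for preventive maintenance:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    serial_number: str      # the individual product reporting home
    running_hours: float
    vibration_mm_s: float   # a simple condition indicator

# Hypothetical service thresholds – in reality these would come from the datasheet
MAX_HOURS_BETWEEN_SERVICE = 8000
VIBRATION_ALERT_LEVEL = 7.1     # mm/s, illustrative value only

def needs_preventive_maintenance(t: Telemetry) -> bool:
    """Flag a unit for a service visit before it actually fails."""
    return (t.running_hours >= MAX_HOURS_BETWEEN_SERVICE
            or t.vibration_mm_s >= VIBRATION_ALERT_LEVEL)

# One message from the field
reading = Telemetry(serial_number="SN-4711", running_hours=8250, vibration_mm_s=3.2)
if needs_preventive_maintenance(reading):
    print(f"Schedule preventive service for {reading.serial_number}")
```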
The topic is not new for companies around the world that have a "what can we do beyond PDM" vision; already in 2001 I was involved in discussions with a large Swiss company providing solutions for the food processing industry. They wanted to leverage their internal customer-centric delivery process and extend it to customer support, using a web interface for relevant content: spare parts lists and documentation.
I am sure one or two readers of this blog post will remember "the spindle case" (the only part in the demo concept that had real data behind it at that time).
For many industries and businesses, customer services (and the margin on spare parts) are the main areas where they make a sustainable profit to secure the company's future. Most of the time, the initial sale and/or delivery of their products is done at a relatively low margin due to the competitive sales situation they are in while selling. And of course, the sale itself is surrounded by uncertainty, which vendors have to accept.
If they asked for more certainty, it would require more detailed research, which is costly for them or is considered a disadvantage by their potential customer. As other competing vendors do not insist on further research, your company might be considered not "skilled" enough to estimate a product properly.
The above paragraph implicitly clarifies that we are mainly talking about companies, where their primary process is Engineering to Order or Build to Order. For companies where the product is delivered through a Configure to Order or an Off-the-Shelf approach, there is no need to work in a similar manner. Buying a computer or a car has no sales engineering involved anymore. There is a clear understanding of the target price and of course resellers will still focus on differentiating themselves by providing adjacent services.
So, for simplicity, I will focus on companies with a BTO or ETO primary business process.
SLM and ETO
In a real Engineering to Order process, traditionally the company that delivers the solution will not really be involved in the follow-up of the lifecycle of the products delivered. The delivered product (small machinery, large machinery or even an installation or plant) is handed over to the customer, and with the commissioning and handover a lot of information is transferred, based on the customer's requirements.
Usually during this handover, a lot of the intelligence in the information is lost, as the customer does not have the same engineering environment and therefore requires the information in "neutral" formats: paper (less and less), PDFs (the majority) and (stripped) CAD data combined with Excel files.
The information battle here between the ETO-delivery company and the customer is that the ETO-delivery company does not want to provide so much information that the customer becomes fully independent, as the service and spare parts business is the area where they can make their margin. The customer, however, often wants ownership of the majority of the data, but is also aware that if they ask for too much, they will pay for it (as the engineering company will consider this extra work). So finding the right balance is the point.
However, the balance is changing, and this is where SLM comes in.
More and more we see that companies that in the past purchased an Engineering to Order product (or even a plant) are changing their business model towards using the product or running the plant, and ask the Engineering to Order company to provide the solution as a service. A kind of operational lease, including resources. This means solutions are no longer sold as a collection of products, but as an operational model (40,000 chickens/day, 1 million liters/day, 100,000 tons/year, etc.).
The operator of the equipment is no longer the owner, but pays for the service to perform the business. Very similar to SaaS (Software as a Service) solutions: you no longer own the software; you pay for using it, no matter what hardware/software architecture sits behind the offering.
In that case, the Engineering to Order company can provide much more advanced services by extending its delivery process with capabilities for the operational phase of the product, as a more integrated approach eliminates the need for the disruptive handover process. Data does not need to be made "stupid" again; it becomes a continuous flow of information.
How this can be done, I will describe in an upcoming, more technical, blog post. This approach brings value to both the Engineering to Order company and the owner/operator of the product / plant.
As it is a continuous flow of information, I would like to conclude this topic by stating that, for Engineering to Order companies, there is no need to think about an extra SLM solution. You could label the last part of the PLM process the SLM domain.
As the customer data is already unique, it is just a normal continuation of the PLM process.
Two closing notes here:
- I have already seen Engineering to Order companies that provide the complete maintenance and service of the delivered product/plant to their customer, integrated in their data environment (so it is happening!).
- Engineering to Order companies are still discovering the advantages of PLM in gaining a cross-project, cross-discipline understanding and working methodology for their delivery process. Historically they thought in isolated projects, where the brains of experienced engineers were the connection between different projects. Now PLM practices are becoming the foundation for sharing and capitalizing on knowledge.
And with this last remark on capitalizing on knowledge, we move from the Engineering to Order industry to Build to Order.
SLM and BTO
In the Build to Order industry, the company that delivers a solution to its customer has tried, in a way, to standardize certain parts of its total solution. These parts can be standardized/configurable machinery or equipment, or, a level higher, standardized systems and subsystems.
More configurable/modular standardization is what most companies are aiming for, as the more you modularize your solution components, the clearer it becomes that there are two different main processes inside the same organization:
- One process, the main process for the company, fulfilling the customer need. This process is about combining existing solution components and engineering them together into a customer-specific solution. This could be a PLM delivery model like ETO.
- One process to enhance, maintain and develop new solution components, which is a typical R&D process. Here I would state PLM is indisputably needed to bring new technology and solutions to the main business process.
So within a company, there might be a need for two different PLM solution processes. From my observations in the past 10 years, companies invest in PDM for their R&D process and try to do a little PLM on top of this PDM implementation for their delivery process. This basic PLM process usually focuses again on the core of the engineering process of delivery, starting somewhere at the specifications and running until the delivery of the solution.
So "full" PLM is very rare to find. The front end of the delivery process, systems engineering, is often considered complex, and often the customer does not want to engage fully in the front-end definition of the solution.
“You are the experts, you know best what we want” is often heard.
Ironically, in an analogous situation, this is often what puts PLM implementations at risk. Here the company expects the PLM implementer to know what they want, without being explicit or understanding what is needed.
To extend the discussion for PLM and SLM, I would like to change the question to a different dimension first:
Do we need two PLM implementations within one company ?
One for R&D and one for the delivery process ?
Reasons to say No are:
- Simplicity – it is easier to have one system instead of two systems
- The amount of R&D activity is so low compared to the delivery process that the main PLM system can support it.
Reasons to say Yes are:
- The R&D process is extremely important as is the delivery process
- The R&D process is extremely important and we have a large customer base to serve
Reading these two sets of reasons brings some clarity.
If the R&D process is a significant differentiator and you are aiming to serve many customers, it makes sense to have two PLM implementations.
Still, the two PLM implementations could be based on the same PLM infrastructure, and I would challenge readers of this post to explain why it should be a single instance of that PLM infrastructure.
Why two PLM systems ?
- I believe that, given the potentially huge amount of data, a single instance would create a data monster, whereas connected systems (using big data) are the future.
- In other concepts there is an enterprise PLM and local PDM systems, exactly because there is no single system that can do it all in an efficient manner.
Still, I haven't talked about SLM, which could be part of the delivery process, where you manage customer-specific data. For that, there are some data model constraints for the PLM system; more detail in my next blog post.
I would state that you can only use a separate SLM system if you are not interested in data from the early phases of the delivery process. In the early phase, you use conceptual structures to define the product/installation/plant. These conceptual structures are, in my opinion, the connection between the concept phase and the service phase. Usually tag numbers describe the functional position of a product or system, and they are the ones referenced by service engineers when starting a service operation.
Only when this view or need does not exist can I imagine a separate SLM system, where, potentially based on serial numbers, services are tracked and monitored and the results are fed back to the R&D environment. The R&D environment would then publish product data into the SLM system.
You might be confused at this time, as I did not bring the various information structures into this post to clarify the data flow for the delivery process. This I will do in my upcoming post.
Why not CTO and SLM ?
I haven't discussed Configure to Order (CTO) here, as I consider CTO a logistical process, which is logically addressed by the ERP system. The definitions of the configurations and their related content will probably be delivered through a PDM/PLM system, so the R&D type of PLM system will exist in the company.
In this situation, SLM would most logically be performed by the ERP system, as there is no PLM delivery layer. Having said this, a new religious discussion might come up: is SLM a separate discipline, or is it part of the ERP system?
This topic is no discussion for the big ERP vendors – they do it all 🙂 – but it is up to your company to decide whether a Swiss army knife is the right tool for your organization.
For the moment I would like to conclude:
- PLM and SLM –> No (only Yes in isolated cases)
- PLM and PLM –> Yes (as SLM requires the front end of PLM too)
Do we need SLM? Perhaps yes, as a way to describe a functional domain; no when we are talking about yet another silo system. I believe the future is in the connectivity of data, and in the long term PLM, ERP and SLM will be functional domains describing how connected data serves particular needs.
Looking forward to your thoughts
The Product Innovation conference in February has become one of my favorite events, mainly for networking. PLM vendors may try to give you the impression that we are in a fast-moving world; in reality, most companies are moving at a much slower pace than these vendors dream of. For an outsider, last year might have looked similar to this year. In this post, I will describe the subtle differences that I noticed.
The event was in the same location as last year, with approximately 400 participants, including 60 speakers. The conference had three main streams: keynotes, PLM and design. The PLM and design sessions ran in parallel most of the time. Great if you are interested in one domain only, a little more challenging for people who enjoy being in both domains. However, the good news is that all participants will have access to the recorded sessions in a week or two. And from last year's experience I can say the recordings are good, so I am looking forward to an additional virtual conference two weeks from now.
Some remarks about the sessions that I was able to attend
Going to Mars ?
Bas Lansdorp told us about the Mars One mission and the drive and challenges behind establishing a permanent human settlement on Mars. It was an inspiring opening session that makes you think out of the box. Several interesting topics came up.
1. First of all, most of the mission's materials need to be basic, proven technology instead of modern, innovative concepts. As maintenance and the risk of issues need to be minimized, it is better to stick with proven technology.
2. The crew selection is a long process – the first crew will fly ten years from now. Who are the individuals willing to take up the challenge of staying forever with three others, with a few more people arriving every few years, knowing there is no way back? Amazing!
3. Part of the funding can come from media rights. Bas explained that the revenues related to, for example, the Olympic Games are already stunning. Imagine having "Live on Mars" as a reality soap available all around the world. Programs like Big Brother demonstrate that it is in our nature to watch ordinary people and see how they behave. Will they fight? Will they have sex? Public voyeurism and eternal fame.
Although the keynote had no relation to PLM, I felt energized by the entrepreneurial thinking of Bas, following his passion and wanting to realize it. As Mars does not need entrepreneurs in its first centuries, it was clear Bas is not part of the first crew.
Managing complexity and volume
Next, Peter Smith from VF International presented the huge challenge his group of companies faces in managing the complexity of their various products and seasonal deliveries, up to 12 collections per year. The group, with famous brands like The North Face, Lee, Wrangler, JanSport, Kipling and Timberland, has to deliver 500 million units per year, which means 16 units per second! For sure an execution engine. So where does PLM fit?
For Peter, PLM is part of the infrastructure, a glue for the innovation process, but not the driver of the innovation process. They try to standardize on a single PLM system, but some of the brands have such specific characteristics and history that this was not possible to realize. As the business must go on, a new PLM should not be disruptive to the business.
The two main challenges Peter sees for current PLM are:
- The software models available to them as consumers; changes here go too slowly.
- The organizational change implications. How to change when change is hard?
It was clear from Peter's experience that many of his points came from an IT perspective. During the networking break, when I spoke with others, some of them mentioned that the business value of PLM was missing in Peter's analysis – too much tool/infrastructure.
The digital value chain
An interesting session from Michael Bitzer (Accenture) and Sebastien Handschuh (Daimler). After an introduction to the German initiative Digital Industry 4.0, the remaining part of the session was about Daimler´s approach to use JT as a neutral, application-independent format for their 3D data. At this time, Daimler already has over 6 million JT files, and the format has proven to fulfill their process needs.
Where possible, Daimler aims to collaborate with suppliers in JT format for 3D. In this manner, their suppliers are not forced to use exclusively CATIA or NX. And answering a question from the audience on whether Daimler was supporting the Siemens-flavored JT or the truly neutral JT format, it was clear that Daimler was aiming for the neutral format. I believe this is an interesting move towards a more generic data approach, in this case for 3D CAD data instead of original file formats. Hopefully more standardization will follow.
PLM selection: Do´s and Don’ts
I was moderating a discussion session for companies that were in the process of selecting a PLM system or that wanted to share their experience. Unfortunately, the session was overpopulated, with many people who were not actually in a selection process. Due to the large audience, there was not really an opportunity for an in-depth discussion. Still, it was amazing to see that there are still companies where the value of PLM is not clear at the management level and therefore the focus is on quick ROI.
In a one-to-one discussion afterwards, I learned about a company where the shareholders/investors forced the PLM project to fail by pushing unrealistic deadlines and not understanding the human and business change required: unrealistic ROI expectations combined with a lack of understanding of where PLM really brings a competitive advantage. In the worst case, due to their short-term focus, the company will slowly go out of business as competitiveness and margins decline. For this type of situation, there is the excellent Dilbert cartoon below.
Secure data sharing in the extended enterprise
An interesting session was organized by Håkan Kårdén (Eurostep) and Kristofer Thoresson (Siemens Industrial Turbomachinery). Siemens had chosen to use the Eurostep Share-A-space environment between their internal data (their PDM system and other data sources) and the external data from suppliers, customers and field services. A pragmatic concept, and interesting to see Share-A-space Found-Its-place. PLM vendors would probably claim that their system could provide this secure and remote access without the need for a system in between. But the fact that a Siemens company decided to use Share-A-space demonstrates there is still a gap between a potentially safe, single-PLM-based implementation and a pragmatic separation approach.
PLM is changing
In my session that afternoon, I focused on the visible change in PLM: from an IT infrastructure for file collaboration towards a more data-centric, business-driven approach. From there, looking into the future, I anticipated that moving towards a data-centric approach is crucial to be ready for advanced computing power and brain-matching algorithms. These will be the game changers of the upcoming decade, I believe, in line with the Industry 4.0 ideas. My past two posts have been pointing in this direction.
A Circular economy
Peter Bilello from CIMdata gave a good presentation on the change in business we see and must make. No longer can we afford an economy where we waste raw materials. The circular economy is about supporting the product lifecycle from cradle-to-cradle instead of the classical cradle-to-grave. This matches the trend that companies will more and more deliver services to their customers instead of selling them products. Instead of buying a fridge, you pay for cooling capacity, and your supplier replaces the current model with a new one after three years. The service or experience economy fits very nicely with the new generations, who seem to prefer living and sharing in the moment instead of owning property.
Your digital shadow
The closing keynote from Stephanie Hankey was like the opening keynote: no relation to PLM, but interesting in the context of the effects of digitalization and mobility. She provided insights into the data that is already collected about each individual (or device) and how it can all be combined into profiles – your digital shadow. And of course, your shadow might give the wrong impression. You can imagine that with the growing trend of smart devices and the Internet of Things, it will be hard to stay out of it. Companies will sell and buy data sets about their potential customers (victims). Scary, as it all happens in the background and you are not fully aware of it.
(At the point I was writing this paragraph, my computer crashed with a blue screen – coincidence?)
Cultured beef ?
After a good burger and discussion in the evening, the opening keynote on day two was from Mark Post with the title Cultured Beef – changing the way we eat and think about food forever. Another interesting keynote, in which Mark explained how we can feed the growing world population in a more sustainable way by creating animal products through cell culture and bio-fabrication instead of farming. The process is still in the early days of discovery, but by using cell culture you can ensure you get the right meat, even without fat, and it is real meat. Currently still expensive: Mark estimates that with current technology and upscaling of the process, a price of $65 per kilo can be reached. Too expensive for consumers at this time, but a promising number for the future. Another (Dutch) keynote speaker who made us think differently for the rest of the day.
Next, Bjarne Nørgaard from MAN Diesel & Turbo gave the audience a good lecture on what it takes to design and build a ship: you build the engine and wrap the ship around it. The challenge for MAN is to follow, service and maintain the engine through its 30-year lifecycle and possibly longer. Bjarne then went into the details of their information architecture, and it was surprising to learn that their PDM system was from Siemens and that they used Aras on top of that for connecting data to the rest of the enterprise and the lifecycle of the engine. You would assume two PLM systems in-house for one company is overkill. Bjarne explained that they initially tried to achieve these goals with Teamcenter but failed due to a lack of flexibility. Great marketing for Aras, bad for Siemens, although I am sure the cultural aspect played a role. No one likes their first PLM or ERP system, as the first implementation in this domain is the moment you have the biggest internal culture shock.
Using search and semantic technology
The presentation from Moises Martines-Ablanado (Configuration Management, Airbus Group) and Thomas Kamps (Conweaver) was interesting, as they demonstrated one of the upcoming concepts I foresee having a great future. Conweaver connects to existing enterprise systems (PLM, ERP, CRM and legacy) and creates a semantic mapping and linking of the data indexed from these systems. Through this network of data, it provides apps with a particular purpose: for example, directly identifying changes in the current EBOM and MBOM and, potentially, from there updating the MBOM based on EBOM changes. A concept I have seen with Exalead too, illustrating that once you are in a data-centric environment, combining data sources for particular purposes can be achieved fast. No need for the classical approach of a single database that stores it all.
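As a simplified thought experiment (this is my own illustration of the principle, not how Conweaver or Exalead work internally; all part numbers and revisions are invented), linking indexed data from two systems around a shared key and spotting EBOM/MBOM mismatches could look like this:

```python
# Indexed snapshots from two source systems, keyed by part number
ebom_index = {"P-100": "B", "P-200": "A", "P-300": "C"}   # e.g. from the PLM system
mbom_index = {"P-100": "A", "P-200": "A"}                  # e.g. from the ERP system

def find_mismatches(ebom: dict, mbom: dict) -> list:
    """Return parts whose EBOM revision differs from, or is missing in, the MBOM."""
    issues = []
    for part, e_rev in ebom.items():
        m_rev = mbom.get(part)
        if m_rev is None:
            issues.append((part, e_rev, "missing in MBOM"))
        elif m_rev != e_rev:
            issues.append((part, e_rev, f"MBOM still at rev {m_rev}"))
    return issues

for part, rev, issue in find_mismatches(ebom_index, mbom_index):
    print(f"{part} (EBOM rev {rev}): {issue}")
```

The point is not the code itself but the pattern: the apps work on indexed and linked data, leaving the source systems untouched.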
A new TLA ? CLM
Joy Batchelor gave a clear presentation on why, besides PLM and ERP, Jaguar Land Rover (JLR) needs a third system supporting the connectivity of product configurations and sales configurations. They are able to manage 58,000,000,000 combinations for 170 different markets, which means every person on this planet could have their own unique Jaguar Land Rover. Joy introduced CLM (Configuration Lifecycle Management) as the third domain needed to support these configurations. The system they are using is ConfigIT, and I assume all automotive vendors have their own toolsets to manage the product and marketing configurations. I hope to learn more in this area. Will CLM be a separate domain, or will it be absorbed by PLM or ERP vendors in the future? Time will tell.
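To give a feeling for how such a number arises (the option groups below are hypothetical and not JLR's actual configuration structure), combinations multiply across independent choices, so a modest set of option groups quickly exceeds 58 billion:

```python
from math import prod

# Hypothetical option groups: model, engine, gearbox, colour, trim, wheels, market
groups = [6, 8, 3, 20, 12, 10, 170]
optional_packs = 10                    # ten independent yes/no option packs on top

total = prod(groups) * 2 ** optional_packs
print(f"{total:,} combinations")       # 60,162,048,000 – already beyond 58 billion
```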
A game changer ?
Henk Jan Pels from the Eindhoven University of Technology took us back in time and explained how ERP became visible on the CFO's agenda, eliminating the discussion on ROI. Where ERP handles material flows, developing and delivering products also requires knowledge flows between the requirements, the functional definition and the physical definition of a product. Expanding these flows into a framework that covers the technology, the building blocks, the families and the individual products would be the ideal interaction Henk Jan is proposing, and a PLM system would be the environment in which to implement this concept. Henk Jan announced this as a game changer. I agree: if the management of companies spends time to understand the benefits, it will be a game changer. Somehow it remained an academic concept, and I believe we are all eager to learn whether companies will adopt this idea, knowing that changing to something that is not common or traditional is a cultural risk.
The German future ?
The final presentation I could attend was from Martin Eigner, who first explained in some detail what the Industry 4.0 approach is about. From there, he took us into the world of model-based systems engineering; you could say an integration of PLM with more virtual system modeling and analysis as the front end of the development process. Somewhat similar to last year's presentation, but understandable, as the world of PLM does not evolve so fast.
This is somehow also my conclusion from this year's event. I was hoping to see some new sparks. For sure, the keynotes were inspiring, although less related to PLM. The case from Airbus and Conweaver was inspiring, as I believe search- and semantic-based applications are a logical extension for the challenges companies want to address with PLM. JLR's presentation explaining the need for Configuration Lifecycle Management strengthened my thought that in the future PLM and ERP will disappear; it is about a business platform with combined services, which might fall into one of the classical categories. I believe the German Industry 4.0 initiative should be studied and replicated by many, as it acknowledges exactly the future trend needed to remain competitive.
It was a pity for the public that Siemens PLM, Dassault Systèmes and Autodesk were not there. As the two largest PLM vendors and one of the largest PLM challengers, you would expect them to be there and allow prospects and PLM consultants to compare where each of the PLM companies is different. Still, it was a good conference. Well organized and, as mentioned in the introduction, all presentations were recorded, giving everyone the opportunity to digest and review the content again.
I am looking forward to the next Product Innovation conference with perhaps some more PLM related keynotes and big data practices.
I will be attending the annual Product Innovation conference again in Berlin next week. I am looking forward to this event, as it is one of the places where you have the chance to network and listen to presentations from people who are PLM-minded. A kind of relaxation, as, strangely enough, most of the companies I visit still consider PLM something difficult, something related to engineering, not so much connected to the future of their business.
I believe one of the reasons is that people have formed their opinion based on the past: an expensive implementation horror story, an engineering-focused implementation or other stories that have framed PLM in a certain manner.
However, PLM has changed and its significance has grown!
During the Product Innovation conference, I will present this topic, the change of PLM, in more depth, with more examples and a surprising projection into the future. Later, when time permits, I will share the more in-depth observations in my blog, hopefully extended based on discussions during the conference. And if you attend the conference, don't miss my session.
The term PLM (Product Lifecycle Management) was introduced as a logical extension to cPDM (collaborative Product Data Management). Where the initial focus was on global file sharing of mechanical CAD data, PLM extended the scope with multidisciplinary support, connecting manufacturing preparation and providing an infrastructure for change management.
In the nineties, product data management was in transition.
In the early 90s, UNIX dominated, and installing a PDM system was the work of IT-experts. Large enterprises, already operating globally, were pushing for standardization and control of data to connect their engineers in a more efficient manner. Connectivity was achieved through expensive leased lines; people like me had to connect to the internet through dial-up modems, and its usage was limited, providing static web pages with minimal graphics.
It was obvious that cPDM and the first PLM projects were extremely expensive. There was no experience; it was learning on the job. The costs were high and visible at the management level, giving management the impression that PLM was potentially the same challenge as ERP, but with a less clear scope. And the projects were executed by IT-experts; end-users were not really in the game.
At the end of the 90s, a small revolution started to take place. The power of the PC combined with Microsoft technology provided a much cheaper and flexible alternative for a complex UNIX based implementation.
Affordable 3D CAD emerged in the mid-market, leading to the need for Windows-based PDM systems and with Windows came Excel, the PDM/PLM killer application.
A person with some rudimentary Visual Basic skills could do magic with Excel and, although not an IT-expert, would become the champion of the engineering department.
At that time, PLM conferences provided a platform on which industry could discuss and share tips and tricks on how to best implement a system. The focus was mainly on the IT-side and large enterprises. The scope was engineering-centric: connecting the various disciplines, including mechanical, electrical and simulation, in a database and connecting files and versions.
Most large enterprises had by then already started to implement a PLM system. The term PLM became an accepted acronym, associated with something that is needed for big companies and is complex and expensive – a logical statement based on the experiences of the early adopters.
PLM was the infrastructure that could connect product information between disciplines and departments working from different locations. The NPI (New Product Introduction) process became a topic pushed by all enterprise PLM vendors and was a practice that demonstrated the value of providing visibility of information across a large, dispersed company, leading to better decision-making.
As this process was more data-centric instead of CAD-centric, these capabilities promoted the recognition and introduction of PLM in non-traditional manufacturing industries like Consumer Packaged Goods, Pharmaceuticals and Apparel, where planning and coordination of information lead, instead of a Bill of Material.
In large enterprises, PLM still lay with the IT-architects as they were the ones deciding the standards and software to be used. PLM and ERP connectivity was an expensive topic.
For the mid-market, many PLM vendors were working on offers to standardize a PLM implementation; this usually involved a stripped-down or limited version of the full PLM system, a preconfigured system with templates or something connected to SharePoint. Connectivity was much easier than 15 years before, thanks to a better internet infrastructure and the deployment of VPNs.
For me, at that time, selling PLM to the mid-market was challenging: how do you explain the value and minimize the risk while current business was still running well? What was so wrong with the existing practices based on Excel? In summary, with good margins and a growing business, wasn't everything under control without the need for PLM? This was the time I started to share my experiences in my blog: A Virtual Dutchman's introduction.
Mid-market PLM projects focused on departmental needs, with IT providing implementation support and guidance. As the number of IT-staff is usually limited in these companies and often organized around ERP and what they learned from its implementation, it was hard to find business experts for PLM in the implementation teams.
The financial crisis had started, and globalization had started to become real through world-wide connectivity – better infrastructure and Web 2.0. The world became an open space for consumers and competitors; the traditional offshore countries became consumers themselves and began to invest in developing products and services for their domestic markets, but also targeted the rest of the world. Large enterprises were still expanding their huge PLM implementations, though some were challenged because of a change of ownership. Capital investors no longer came from the US or Europe but from the BRIC (Brazil, Russia, India, China) countries, forcing some established companies to restructure and refocus.
In response to the crisis, mid-market companies started to reduce costs and focus on efficiency. Lots of discussions related to PLM began as it appeared to be THE strategy needed to survive, though a significant proportion of the investment in PLM was cancelled or postponed by management due to uncertainty and impact on the organization.
PLM conferences showed that almost all of the big enterprises and the mid-market companies were still using PLM to connect departments, without fundamentally integrating them in one complete PLM concept. It is easier to streamline the sequential process (thinking lean) than to make it a concurrent process with a focus on market needs. PLM conferences were being attended by a greater mix of IT and business representatives from different businesses, learning from each other.
Now everyone in the world is connected and, consequently, the amount of data is piling up. It is now more about data than about managing documents. The introduction of smart devices has had an impact on how people want to work; instead of sharing files and documents, we start sharing and producing huge amounts of data. In addition, the upcoming "Internet of Things" demonstrates we are moving to a world where connectivity through data becomes crucial.
Sharing data is the ideal strategy for modern PLM. PLM vendors and other leading companies in enterprise software are discovering that the classical method of storing all information in one database does not work anymore and will not work in the future.
In the future, a new generation of PLM systems will come, either as an evolution of existing systems or as a disruption of the current market. No longer will the target be to store all information in one system; the goal will be to connect and interpret data and make the right decisions based on it. This is similar to what the new generation of workers is used to, and they will replace the (my) older generation in the upcoming decade.
Combined with more and more cloud-based solutions and platforms, the role of IT will diminish, and the importance of business people driving PLM will become ever more crucial.
PLM has become a business-driven strategy and requires people that are strong enough to develop, justify and implement this approach in their companies. New champions are needed !
The value of communities, blogs and conferences
These bring together global brainpower in social environments. Complemented with presentations, opinions and discussions from all different industries and domains, they are the ideal environment to grow new ideas. Here you can associate the information, question its relevance for your business and network with others – the perfect base for innovating and securing your future business.
Therefore, do not use communities or conferences to stick to your opinion but be open and learn.
One of my favorite quotes