In the past two years, I have been heavily involved in PLM Proof of Concepts, sitting on both sides of the table: supporting companies in their PLM selection, supporting vendors in explaining their value to the customer, and assisting implementers with industry knowledge, all in the context of a PLM selection process.
The Proof of Concept is crucial in a PLM selection process as it is the moment where the first glimpse of reality comes to the table.
Companies of different sizes and different consultants all have their own view on the importance of the Proof of Concept. Let me share my thoughts after a quick recap of the PLM selection process.
The PLM selection process
1. Build a vision
It is important that a company understands what it wants to achieve in the next five to ten years before starting a PLM selection process. Implementing PLM means a business transformation, even if you are a small company. If management does not understand that a vision is required, a risk is looming: PLM without a change in the way people work will not deliver the expected results.
2. Issue an RFI to potential candidates
Once you have a PLM vision, it is time to get in touch with potential suppliers. The RFI (Request for Information) phase is where you can educate yourself by challenging the suppliers to work with you on future solutions.
3. Discuss with selected candidates
From the RFI responses you understand which companies are attractive because they match your vision, your budget or industry. Have a first interaction with the selected companies and let them demo their standard environment targeted to your vision.
In this stage, you check with the preferred companies their ability to deliver and your ability to work together. The POC phase should give you an understanding of the scope of the upcoming PLM project and help you understand by whom and how the project can be executed. More details about this step below.
Although some companies start with an RFP before the POC, to me it makes the most sense to verify the details after you have a proper understanding of the To-Be solution. The RFP is often the base for the contractual scope and therefore should be as accurate as possible.
In the past, I wrote in more detail about the PLM selection process in two posts: PLM selection: Don’t do this and PLM selection: Do this. Have a read if you want to understand this part in more depth. Now let’s focus on the POC.
- As described before, the target of the Proof of Concept should be to get a better understanding of the potential To-Be processes and obtain an impression of the capabilities of the implementer and the preferred PLM software.
The result should be that you have more realistic expectations of what can be achieved and the challenges your company will face.
- From there, you can evaluate the risks, address them and build an achievable roadmap to implement. It is important that the focus is not just on the cost of the implementation.
- To sell PLM inside your company, you need to realign with the vision and explain, to all people involved, the value of “Why PLM”.
Explaining the value is complex, as not everyone needs the same message. Management will focus on business benefits, while users will focus on how it impacts their daily life. If you forget to explain the value, the PLM project will again be considered just another software purchase.
Make sure the Proof of Concept is driven by validating future business scenarios, focusing on the To-Be solution. The high-level scenarios should be demonstrated and explained to the business people. In this stage, it is important people realize the benefits and the value of the new processes.
The POC is also an internal sales event. The goal should be to get more enthusiastic and supportive business people in your company for the upcoming PLM project. Identify the champions you will need to lean on during the implementation.
Test the implementer. In my opinion, the critical success of a PLM implementation depends on the implementation team, not on the software. Therefore, the POC phase is the best moment to learn whether you can work with the implementer. Do they know your business? Do they have experience in your industry? The more you are aligned, the higher the chance you will be successful as a team.
Show commitment to engage. I have often seen POC engagements where the company demanded a free Proof of Concept from the implementer or vendor. This creates an unbalanced situation, as without any commitment from the company the vendor or implementer cannot invest the expected time and resources in the process. By paying a certain fee for the POC, a company demonstrates to the implementer/vendor that the POC is valuable, and it can request the same commitment from them.
The Proof of Concept is not a detailed function/feature check to identify each mouse click or option in the system. During the implementation, these details might come up. It is important in a Proof of Concept to understand the big picture and not get lost in the details. As human beings, we tend to focus on what does not work, not realizing that probably eighty to ninety percent works according to the needs.
Do not expect the ultimate To-Be scenario to be demonstrated during the Proof of Concept. The Proof of Concept is a learning stage for both the company and the implementer to imagine the best possible scenario. PLM systems are generic, and they will likely not provide a configuration and functionality matching your environment out of the box. At this stage, validate whether the primary capabilities are there and where the gaps are.
Do not run a POC with a vendor (only). This might be one of the most critical points for a POC. A PLM software vendor’s target is to sell their software, and for that reason they often have dedicated presales teams that will show you everything in a smooth manner, overwhelming you with all the beauty of the software. However, after the POC this team is gone, and you will have to align yourself again with the implementation partner, matching your business needs with their understanding.
Realize – you get what you ask for. This is more a Do-and-Don’t message packed together. A Proof of Concept phase is a point where companies get to know each other. If you are not focused, do not expect the implementer/vendor to be committed. A PLM implementation is not a product; it is a business transformation supported by products and services. Do not treat PLM implementers and vendors the way your customers treat you (in case you deliver products).
There are still many more thoughts about the Proof of Concept. Ideally, you run two POCs in parallel, either with two implementers of the preferred software (if possible) or with two different implementers representing different software.
Ideally, because I know it is a challenge, especially for small and medium-sized businesses, where people are busy keeping the business running.
Still, remember: PLM is a business transformation, targeting to improve your business in the upcoming five to ten years and to avoid running out of business.
Your thoughts?
As a bonus, a short anecdote that I posted in 2010, still relevant:
Some time ago, a Christian PLM sales professional died (let’s call him Jack), and according to his belief he faced Saint Peter at the gates of Heaven and Hell.
Saint Peter greeted Jack and said: “Jack, with the PLM Sales you have done good and bad things to the world. For that reason, I cannot decide if you should go to Heaven or to Hell. Therefore, I allow you to make the choice yourself”.
Jack replied: “But Saint Peter, how can I make such an important decision for the rest of my eternal life. It is too difficult!”
Saint Peter replied: “No problem Jack, take a look at Heaven and Hell, take your time and then tell me your decision.”
Jack entered Heaven and he was surprised about the quietness and green atmosphere there. Angels were singing, people were eating from golden plates with the best food ever, people were reading poetry and everything was as peaceful as you could imagine. In the distance, he could see God surrounded by some prophets talking about the long-term future. After some time, Jack had seen it and went to Hell to have a view there.
And when he opened the gates of Hell, he was astonished. Everywhere he looked, people were partying and having fun. It reminded him of those sales kick-offs he had attended in the past: exotic places with lots of fun. In the distance, he could see the Devil as a DJ playing the latest dance music – or was it DJ Tiësto?
Jack did not hesitate and ran back to Saint Peter, no time to lose. “Saint Peter,” he said, “I want to go to Hell, no doubt. A pity I did not know it before.”
“So be it, ” said Saint Peter “go for it.”
And then, once Jack entered Hell, there was suddenly fire all around him, people were screaming in pain and suffering, and Jack too felt the first flames.
“Devil!!” he screamed, “what happened to what I saw before?”
With a sarcastic voice, the devil replied: “That? That was a proof of concept.”
Shaping the PLM platform of the Future
It was the first time I attended this event, and I was positively surprised by the audience and the content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) looked in more detail at the future of PLM. Themes like PLM platforms, the Circular Economy, Open Standards and longevity of data were presented and discussed.
The emergence of the PLM platform
1. The product lifecycle will become more and more circular due to changing business models, and in parallel the changing usage and availability of materials will have an impact on how we design and deliver products.
Can current processes and tools support today’s complexity? And what about tomorrow? According to a CIMdata survey, there is a clear difference in profit and performance between leaders and followers, and the gap is widening. “Can you afford to be a follower?” is a question companies should ask themselves.
Rethinking the PLM platform does not bring a 2-3 % efficiency benefit; it can bring benefits of 20 % and more.
Peter sees a federated platform as a must for companies to survive. I particularly liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Steven Vetterman from ProSTEP talked about PLM in the automotive industry. Steven started describing the change in the automotive industry, by quoting Heraclitus Τα πάντα ρεί – the only constant is change. Steven described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change of the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050, more than 50 % of the world population (an estimated almost 10 billion people by then) will live in Asia and 25 % in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical designs. Neither are they open for a true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementation. There is a need for open PLM, not driven from a single PLM system, but based on a federated environment of information.
Yves Baudier spoke on behalf of the aerospace industry about the standardization effort at their Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE systems and more. If you look at the ASD Radar, you might get a feeling for the complexity of standards that exist and are relevant for the Airbus group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. The conclusion from Yves, like that of all the previous speakers, is that: The PLM Platform of the Future will be federative, and standards will enable PLM Interoperability.
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of the current trends in their business and the role of PLM at Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive costs down and, above all, improve business agility, as the future is in flexibility. Shefali gave an overview of their PLM roadmap covering PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space). The PLM backbone provides connectivity of data between all lifecycle stages and external partners (customers/suppliers) based on the PLCS standard. Again, another session demonstrating that the future of PLM is in an open and federated environment.
The future PLM platform is a federated platform which adheres to standards, provides openness of interfaces that permit the platform to be reliable over multiple upgrade cycles, and is able to integrate third parties (Peter Bilello)
In the afternoon, I followed the Systems Engineering track. Peter Bilello gave an overview of model-based systems engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. Indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is muddled by the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that systems engineering should be a part of PLM, and he gave a very decent, academic overview of how everything is related. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual, system view and a physical product view:
More Industry voices
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, out of every 100 pieces of information/exchange managed during the full life cycle, only 5 are managed during the initial design phase (BIM), 20 during the construction phase (BAM) and 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes of the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrated with the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Amir Rashid described the need for PLM in the consumer product markets, naming the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the product, and everything is scrapped to landfill afterward. An interesting quote from Amir: Sustainability’s goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of its product development since 1984. The diagram below demonstrates how the circular economy, when well-orchestrated, can impact all business today.
Marc Halpern closed the tracks with his presentation around Product Innovation Platforms, describing how Product Design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understand and analyze Big Data), Adaptability (flexible to integrate and maintain through an open service oriented architecture), promoting reuse (identifying similarity based on metadata and geometry), discovery (the integration of search analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner’s vision, there is still a lot of work for PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent implementing this vision:
- inadequate openness (pushing back open collaboration)
- incomplete standards (blocking implementation of openness)
- uncertain cloud performance (the future is in cloud services)
- the steep learning curve (it is a big mind shift for companies)
- Cyber-terrorism (where is your data safe?)
After Marc´s session, there was an interesting panel discussion with some of the speakers from that day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: what about change management?
A topic that could fill the rest of the week but the PDT dinner was waiting – a good place to network and digest the day.
Day 2 started with two interesting topics. The first was a joint presentation by Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones) about the obsolescence of information systems: hardware and PLM applications. In the aerospace industry, some data needs to be available for 75 years, and you can imagine that over 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a Proof of Concept done with virtualization of different platforms supporting CATIA V4/V5 using Linux, Windows XP, W7 and W8 – and this covers just a small part of all the data.
The conclusion from this session was:
To benefit from PLM of the future, the PLM of the past has to be managed. Migration is not the only answer. Look for solutions that exist to mitigate risks and reduce costs of PLM Obsolescence. Usage and compliance to Standards is crucial.
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: Interoperability is a right, not a privilege.
In the systems engineering track, Kent Freeland talked about nuclear knowledge management and CM in systems engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management is not the same as storing information in a central place; it is about building and providing data in context so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for systems engineering based on ISO 15926-11. His simplified approach, compared to ISO 15288, led to several discussions between supporters and opponents during lunchtime.
Master Data Management
Based on the type of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements, I also see more and more that the digital transformation leads to MDM questions: can we replace Excel files with mastered data in a database?
Almost at the end of the day, I spoke about the PLM platform of the future, targeted at the people of the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to capture and own as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning information is not crucial for them – perhaps because the world is moving fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in that. PLM is no longer cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately, I could not attend all sessions, as several ran in parallel, nor have I written about all sessions I attended. The PDT Europe conference, a conference for people who care about the details of future PLM concepts and the usage of standards, is a must for future strategists.
Business is changing and becoming digital, as you might have noticed. If you haven’t noticed it, you might be disconnected from the world or working in a stable silo. A little simplified and provocative, otherwise you would not read further.
The change towards digital also has its effect on how PLM is evolving. Initially considered an extension of PDM, managing engineering data, PLM is slowly evolving into an infrastructure supporting the whole product lifecycle.
The benefits of a real PLM infrastructure are extremely high, as it allows people to work smarter, identify issues earlier and change from being reactive to proactive. In some industries, this change in working is the only way to stay in business. Others, with still enough margin, will not act.
Note: I am talking about a PLM infrastructure as I do not believe in a single PLM system anymore. For me PLM is supported through a collection of services across the whole product lifecycle, many potentially in one system or platform.
Changing from an engineering-centric system towards an infrastructure across the departmental silos is the biggest challenge for PLM. PLM vendors, and ERP vendors with a PLM offering, are trying to provide this infrastructure and mainly fight against Excel, as an Excel file can easily pass the border from one department to another. No vision is needed for Excel.
A PLM infrastructure however requires a vision. A company has to look at its core business processes and decide on which information flows through the organization or even better their whole value chain.
Building this vision, understanding it and then being able to explain it is a challenge for all companies. Sometimes even management says
“Why do we need to have a vision, just fix the problem”
also people working in departments are not looking forward to changing their daily routines just because they need to share information. Here you hear statements like
“Why do people feel the need to look at the big picture? I want to get my work done.”
So if current businesses do not change, will there be a change?
Here I see the digital world combined with search-based applications coming up. Search-based applications allow companies to index their silos and external sources and get an understanding of the amount of data that exists, and from these results learn that there is a lot of duplicated data or invalid information in different places.
This awareness might create the understanding that, instead of having hundreds of thousands of Excel files in the organization, it would be better to have the data inside a database, uniquely stored and connected to other relevant information.
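To make this concrete, the first step such a search-based application performs, finding identical copies across silos, can be sketched in a few lines. This is a minimal illustration under my own assumptions, not any vendor's indexing engine; `find_duplicates` is a hypothetical helper that groups files by a content hash:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group Excel files under `root` by the SHA-256 hash of their content."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*.xlsx"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path)
    # Hashes that occur more than once point to byte-identical copies.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Each resulting group is a candidate for consolidation: one uniquely stored record in a database instead of several scattered copies.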
Next, if you want to understand this in a more down-to-earth manner, it is important to listen to and talk with your peers from other companies and other industries. This is currently happening all around the world, and I invite you to participate.
Here is a list of events that I am attending or planned to attend but too far away:
Here I will participate as a panel member in the discussion around the concept of zero files, where we want to explain and discuss with the audience what a data-centric approach means for an organization. Also, customers will share their experiences. This conference focuses on the ENOVIA community – you can still register here
Here I will speak about the PLM future (based on data) and what PLM should deliver for the future generations. This conference is much broader and addresses all PLM related topics in a broader perspective
Infuseit, a PLM consultancy company relatively new in the Nordics, is able to attract an audience that wants to work on understanding the PLM future. Instead of listening to presenters, here you are challenged to discuss and contribute to building a common opinion. I will be there too.
Conclusion: It is time to prepare yourself for the change – it is happening, and being educated is an investment that will be rewarding for your company.
What do you think – is data-centric a dream?
This is, for the moment, the last post about the difference between a file-based and a data-oriented approach. This time I will focus on the need for open exchange standards and the relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits and pointed to background information for those who want to learn more. In my second post, I gave the example of dealing with specifications.
It demonstrated that the real value of a data-centric approach comes at the moment the information changes over time. For a specification that is right the first time and never changes, there is less value to win with a data-centric approach. But then, aren’t we still dreaming that we do everything right the first time?
The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits are valid for diagrams, schematics (2D information) and CAD models (3D information).
The challenge for a data-oriented approach is that information needs to be stored as data elements in a database, independent of an individual file format. For text, this might be easy to comprehend, as text elements are relatively simple to understand. Still, the OpenDocument standard for office documents builds on a lot of technical know-how and experience to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.
CAD vendors have various reasons not to store their information in a neutral format.
- First of all, and most important for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, reducing the potential market capture. You could say that, in a certain manner, the Autodesk 2D format DXF (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF format. So far, DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements.
- This brings us to the second reason why neutral data formats are not that evident for CAD vendors: they reduce the flexibility to change the format and optimize it for maximum performance. Commercially, the significant, immediate disadvantage of working in neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any “intelligent” manipulations of the data are hard to achieve.
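The closing thought of the first point, that DXF data could live inside a database as elements rather than inside a file, can be sketched with a toy example. DXF is a tagged text format of (group code, value) pairs; the parser and schema below are deliberately simplified assumptions of mine, not a faithful reading of the full DXF specification:

```python
import sqlite3

def dxf_pairs(text: str):
    """Yield (group_code, value) pairs from DXF tagged text."""
    lines = text.splitlines()
    for i in range(0, len(lines) - 1, 2):
        yield int(lines[i]), lines[i + 1].strip()

def load_entities(dxf_text: str, conn: sqlite3.Connection) -> int:
    """Store each entity of the ENTITIES section as a (kind, layer) row."""
    conn.execute("CREATE TABLE IF NOT EXISTS entity (kind TEXT, layer TEXT)")
    in_entities, count = False, 0
    for code, value in dxf_pairs(dxf_text):
        if code == 2 and value == "ENTITIES":
            in_entities = True            # section header found
        elif in_entities and code == 0:
            if value == "ENDSEC":
                break                     # end of the ENTITIES section
            conn.execute("INSERT INTO entity VALUES (?, ?)", (value, ""))
            count += 1
        elif in_entities and code == 8 and count:
            # group code 8 carries the layer name of the current entity
            conn.execute(
                "UPDATE entity SET layer = ? "
                "WHERE rowid = (SELECT MAX(rowid) FROM entity)", (value,))
    conn.commit()
    return count
```

Once the entities are rows, they can be queried, linked to other data and changed element by element, which is exactly what a monolithic file cannot offer.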
The same reasoning can be applied to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to identify a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo, which again differs from NX, SolidWorks, Solid Edge and Inventor, even though some of them might use the same CAD kernel.
However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing them. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.
PLM, ERP, systems and single source of truth
This brings us into the world of data management, in my world mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is available as metadata, as are all the scheduling and the interactions with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.
PLM systems have gradually become more and more data-centric, as their origin lies in engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM and ERP side make it hard to exchange information from one system to the other. It is for that reason that there are so many discussions around PLM and ERP integration and the BOM.
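A toy example of why that exchange is hard: the two sides define an item differently. The field names below are hypothetical; the sketch only assumes a common pattern where the PLM side keys an item on number plus revision, while the ERP side keys on the plain number and treats the revision as an attribute.

```python
def plm_to_erp(plm_item: dict) -> dict:
    """Translate a PLM item record into the shape an ERP item master expects."""
    # PLM identifies the item as "<number>-<revision>"; ERP wants them split.
    number, revision = plm_item["item_id"].rsplit("-", 1)
    return {
        "material": number,                    # ERP key: number without revision
        "revision": revision,                  # revision becomes an attribute
        "description": plm_item["name"][:40],  # ERP systems often limit text length
        "uom": plm_item.get("unit", "PCE"),    # default unit of measure if missing
    }
```

Every such mapping rule is a small, silent business decision, which is why PLM-ERP integration discussions rarely stay purely technical.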
In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide, actually an enterprise-wide or even industry-wide, data definition of all information that is relevant for the business processes. This leads into Master Data Management, the new required skill for enterprise solution architects.
The data-centric approach creates the impression that you can achieve a single source of truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion, this is more of a black-hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without the intention of being found is the big challenge here.
Other PLM and ERP vendors have different approaches. Some choose a service bus architecture, where applications in the background link and synchronize common data elements from each application. There is some redundancy; however, everything is connected. More and more PLM vendors focus on building a platform of connected data elements on top of which applications run, like the 3DExperience platform from Dassault Systèmes.
As users, we are more and more used to platforms, as Google and Apple already provide them in the cloud for common use on our smartphones. The large number of apps runs on shared data elements (contacts, locations, …) and stores additional proprietary data.
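The service bus option can be reduced to a few lines: every application keeps its own redundant copy of the common data elements, and the bus propagates a change from the source application to all the others. A deliberately minimal, in-memory sketch of my own; real buses add queues, transformations and error handling:

```python
class ServiceBus:
    """Keep redundant copies of common data elements in sync across applications."""

    def __init__(self) -> None:
        self.apps: dict[str, dict] = {}

    def register(self, name: str) -> None:
        """Give an application (e.g. "PLM" or "ERP") its own local store."""
        self.apps[name] = {}

    def publish(self, source: str, key: str, value) -> None:
        """Record a change in `source` and copy it to every other application."""
        self.apps[source][key] = value
        for name, store in self.apps.items():
            if name != source:
                store[key] = value  # redundancy, but everything stays connected
```

After a `publish` from one application, every other registered store carries the same element; the redundancy mentioned above is visible as the duplicated entry in each store.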
Platforms, Networks and standards
And here we enter an interesting area of discussion. I think it is a given that a single database concept is a utopia. Therefore, it will be all about how systems and platforms communicate with each other to provide in the end the right information to the user. The systems and platforms need to be data-centric as we learned from the discussion around the document (file centric) or data-centric approach.
In this domain, several companies have already been active for years. Datamation from Dr. Kais Al-Timimi in the UK is such a company. Kais is a veteran in the PLM and data modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:
“……. the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.
It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.
This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”
Another company in the same domain is Eurostep, which also focuses on business collaboration between companies in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233, and has developed its Share-A-space platform to enable data-centric collaboration.
This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process and construction industries are currently also discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure, it will not come from the software vendors, as I discussed before.
If you have reached this line, it means the topic interests you in depth. In the past three posts, starting from the future trend, then an example, and finally the data modeling background, I have tried to describe what is happening in a simplified manner.
If you really want to dive into the PLM for the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15. Here experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.
My holidays are over. After reading and cycling a lot, it is time to focus again on business and the future. Those of you who have followed my blog the past year must have noticed that I have been talking on a regular basis about business moving to a data-oriented approach instead of a document / file-based approach. I wrote an introduction to this topic at the beginning of this year: Did you notice PLM has been changing?
This year I have had many discussions around this topic with companies acting in various industries: manufacturing, construction, oil & gas, nuclear and general EPC-driven companies. There was some commonality in all these discussions:
- PLUS: Everyone believes it is a beautiful story and it makes sense
- MINUS: Almost nobody wants to act upon it as it is an enormous business change and to change the way a company works you need C-level understanding
- PLUS: Everyone thinks the concept is clear to them
- MINUS: Few understand what it means to work data-oriented and what the impact on their business would be
Therefore, what I will try to do in the upcoming blog posts (two, three, four??) is address the two negative observations and make them more precise.
What is data / information / knowledge?
Data for me is a collection of small artifacts (numbers, characters, lines, sound bits, …) which have no meaning at all. This could be bundled together as a book, a paper drawing or a letter, but also in a digital format like an eBook, a CAD file or an email; even the transmission bytes of a network / internet provider can be considered data.
Data becomes significant once provided in the context of other data. At that moment, we start calling it information. For that reason, a book or a drawing provides information, as the data has been structured in such a manner that it becomes meaningful. The data sent through the network cable only becomes information when it is filtered and stripped of the irrelevant parts.
Information is used to make decisions based on knowledge. Knowledge is the interpretation of information which, combined in a particular way, helps us to make decisions. And the more decisions we make, and the more information we have about the results of these decisions, either ours or others', the more our knowledge increases.
Data and big data
Now we have some feeling for data, information and knowledge. For academics, there is room to discuss and enhance these definitions. I will leave it at this simple version.
Big data is the term for all digital data that is too large to handle in a single data management system, but that is available and searchable through various technologies. Data can come from any source around the world, as through the internet an infrastructure exists to filter and search for particular data.
By analyzing and connecting the data coming from these various sources, you can generate information (placing the data in context) and build knowledge. As it is an IT-driven activity, this can be done in the background, giving almost real-time data to any person. This is a big difference from information handling in the old way, where people have to collect and connect the data manually.
The power of big data applies to many business areas. If you know how your customers are thinking and associating their needs with your products, you can make those products better and more targeted to your potential market. Or, if you know how your products behave in the field during operation (Internet of Things), you can provide additional services and instant feedback and be more proactive. Plus, the field data, once analyzed, provides current knowledge, helping you to make better products or offer more accurate services.
Wasn’t there big data before?
Yes, before the big data era there was also a lot of information available. This information could be stored in "analogue" formats (microfiche, paper, clay tablets, papyrus) or in digital formats, better known as files or collections of files (doc, pdf, CAD files, ZIP, …).
Note the difference: here I am speaking about information, as the data is contained in these formats.
You have to open, or be in front of, the information container first before seeing the data. In the digital world, this is often called document management or content management. The challenge of these information containers is that you need to create a new version of the whole container once you modify one single piece of data inside it. And each information container holds duplicated information from a data element. Therefore, it is hard to manage a "single version of the truth" approach.
And here comes the data-oriented approach
The future is about storing all these pieces of data inside connected data environments, instead of storing a lot of data inside a (versioned) information container (a file / a document).
Managing these data elements in the context of each other allows people to build information from any viewpoint – project-oriented, product-oriented, manufacturing-oriented, service-oriented, etc.
The data remains unique, therefore coming much closer to supporting the single version of the truth. Personally, I consider the single version of the truth a utopia; however, reducing the amount of duplicated data through a data-oriented approach will bring a lot more efficiency.
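The contrast between the versioned information container and the connected data elements can be made concrete with a toy sketch. This is a hypothetical model for illustration only: in the document-centric world, changing one value forces a new version of the whole container; in the data-centric world, only the single element is versioned, and every viewpoint references the unique element instead of holding its own copy.

```python
# Document-centric: the whole container is versioned as one unit.
# Changing one value (weight) creates a complete new copy of everything.
doc_v1 = {"version": 1, "weight_kg": 2.5, "material": "steel", "supplier": "X"}
doc_v2 = {**doc_v1, "version": 2, "weight_kg": 2.7}   # full duplicate, new version

# Data-centric: each element is unique and carries its own history.
elements = {
    "weight_kg": [2.5, 2.7],      # element history; last entry is current
    "material": ["steel"],        # unchanged elements keep a single entry
    "supplier": ["X"],
}

# Views (project, manufacturing, service, ...) reference the unique elements
# rather than holding duplicated copies inside their own containers.
manufacturing_view = {"weight_kg": elements["weight_kg"][-1]}
service_view = {"weight_kg": elements["weight_kg"][-1]}

# Both views see the same, single current value without any duplication.
assert manufacturing_view["weight_kg"] == service_view["weight_kg"] == 2.7
```

The point of the sketch is the asymmetry: the document approach duplicated three unchanged values to record one change, while the data approach recorded exactly one new value.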
In my next post, I will describe an example of a data-oriented approach and how it impacts business, both from the efficiency point of view and from the business transformation point of view. The data-oriented approach can have immense benefits. However, they do not come easy: you will have to work differently.
Some more details
An important point to discuss is that this data-oriented approach requires a dictionary, describing the primary data elements used in a certain industry. The example below demonstrates a high-level scheme for a plant engineering environment.
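Such a dictionary can itself be sketched as a small shared schema. The element names below are hypothetical illustrations for a plant engineering context, not taken from an actual standard: the point is that every primary data element gets a defined type, unit and description, so all applications sharing the data interpret it the same way.

```python
# A minimal sketch of an industry data dictionary (hypothetical element names).
PLANT_DICTIONARY = {
    "Tag":         {"type": str,   "unit": None,   "description": "Functional position in the plant"},
    "Equipment":   {"type": str,   "unit": None,   "description": "Physical asset serving a tag"},
    "DesignTemp":  {"type": float, "unit": "degC", "description": "Design temperature"},
    "DesignPress": {"type": float, "unit": "bar",  "description": "Design pressure"},
}

def validate(element_name, value):
    """Check a data element value against the shared dictionary definition."""
    spec = PLANT_DICTIONARY[element_name]
    return isinstance(value, spec["type"])

# Applications agree on meaning and type through the dictionary, not through
# their own private data models.
assert validate("DesignTemp", 120.0)      # a float in degC is acceptable
assert not validate("Tag", 42)            # a tag must be a string identifier
```

This is of course far simpler than real industry dictionaries (ISO 15926, PLCS), but it shows why a shared definition layer is the prerequisite for data-centric collaboration.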
Data standards exist, or are emerging, in almost every industry, and they are crucial for the longevity and usage of the data. I will touch on this briefly in one of the upcoming posts; however, for those interested in this topic in relation to PLM, I recommend attending the upcoming PDT Europe. If you look at the agenda, there is a lot to learn and discuss about the future of PLM.
I hope to see you there.
Last week I attended the PI Apparel conference in London. It was the second time this event was organized, and approximately 100 participants were there for two full days of presentations and arranged network meetings. Last year I was extremely excited about this event, as the audience, compared to classical PLM events, was different and much more business-focused.
Read my review from last year here: The weekend after PI Apparel 2013
This year I had the feeling that the audience was somewhat smaller, missing some of the US representatives, and perhaps there was a slightly more visible influence from the sponsoring vendors. Still, it was an enjoyable event, and hopefully next year, when it will be hosted in New York, it will be as active as last year.
Here are some of my observations.
Again the event had several tracks in parallel besides the keynotes, and I look forward in the upcoming month to watching the sessions I could not attend. Obviously, where possible, I followed the PLM-focused sessions.
The first keynote came from Micaela le Divelec Lemmi, Executive Vice President and Chief Corporate Operations Officer of Gucci. She talked us through the areas she supervises and gave some great insights. She talked about how Gucci addresses sustainability through risk and cost control: which raw materials to use, how to ensure the brand's reputation is not at risk, price volatility and the war on talent. As Gucci is a brand in the high-end price segment, image and reputation are critical, and they have the margins to assure these are managed. Micaela also spoke about the short-term financial goals that a company like Gucci has toward its investors. The topics she mentioned (I did not write them down, as I was tweeting when I heard them) were certainly worthwhile to consider and discuss in detail with a PLM consultant.
Micaela further described Gucci's corporate social responsibility program, with a focus on taking care of people, environment and culture. It was good to learn that humane working conditions and rights are a priority even for their supply chain, although it might be noted that 75 % of Gucci's supply chain is in Italy. Gucci is one of the few brands that still carries the "Made in Italy" label.
My conclusion was that Micaela did an excellent PR job for Gucci, which you would expect from a brand with such a reputation. Later during the conference, we discussed whether other brands with less exclusivity, operating more in the mass consumer domain, would be able to come even close to such programs.
The company is successful in manufacturing and selling licensed products from Pierre Cardin, Cacharel and US Polo Association mainly outside the US and Western Europe.
Their primary focus was to provide access to the most accurate and most up-to-date information from one source. In parallel, standardization of codes and tech packs was a driver. Through standardization, quality and (re)use could be improved, and people would better understand the details. Additional goals were typical PLM goals: following the product development stages along the timeline, notifying relevant users about changes in the design, working with libraries and reuse, and integrating with SAP.
Interestingly, Hakan mentioned that in their case SAP did not recommend using their system for the PLM-related part, due to a lack of knowledge of the apparel industry. A wise decision, which would deserve follow-up in other industries too.
In general, the PLM implementation described by Göktug and Hakan was well phased, with a top-down push to ensure there was no escape from making the change. As with most PLM implementations in apparel, they went live with their first phase rather fast, as the complex CAD integrations from classical PLM implementations were not needed here.
Next I attended the Infor session with the title Work the Way you Live: PLM built for the User. A smooth marketing session with a function/feature demo demonstrating the flexibility and configuration capabilities of the interface. Ease of use is crucial in the apparel industry, where Excel is still the biggest competitor. While Excel might satisfy the needs of the individual, it lacks the integration and collaboration capabilities a PLM system can offer.
More interesting was the next session I attended, from Marcel Oosthuis, who was responsible for the implementation as Process Re-Engineering Director (read: PLM leader). Marcel described how they had implemented PLM at Tommy Hilfiger, and it was an excellent story (perhaps too good to be true).
I believe larger companies with the right focus and investment in PLM resources can achieve these kinds of results. The target for Tommy Hilfiger's PLM implementation was beyond 1000 users – therefore, a serious implementation.
Upfront, the team first defined what they expected from the PLM system to be selected (excellent!!). As the fashion industry is fast, demanding and changing all the time, the PLM system needs to be swift, flexible and prepared for change. These are not classical PLM requirements.
In addition, they were looking for a highly configurable system, providing best practices, and a vendor with a roadmap they could influence. Here I got a little more worried, as highly configurable systems and best practices do not always match the prepared-for-change approach. A company might be tempted to automate the way they worked in the past (best practices from the past).
It was good to hear that Marcel did not have to go through the classical ROI approach for the system. His statement, which I fully endorse, was that it is about the capability to implement new and better processes. These are often not comparable with the past (and nobody measured the past).
Marcel described how the PLM team (eight people plus three external from the PLM vendor) made sure that the implementation was done with the involvement of the end users. End-user adoption was crucial, as was key-user involvement when building and configuring the system.
It was one of the few PLM stories where I heard how all levels of the organization were connected and involved.
Next, Sue Butler, director at Kurt Salmon, described how to maximize the ROI from your PLM investment. It is clear that many PLM consultants are aligned, and Sue brought up all the relevant points and angles you need to look at for a successful PLM implementation.
Main points: PLM is about changing the organization and processes, not about implementing a tool. She made the point that piloting the software is necessary as part of the learning and validation process. I agree with that, under the condition that it is an agile pilot that does not take months to define and perform. Otherwise, you might already be locked too much into the tool vision – focus on the new processes you want to achieve.
Moreover, because Sue was talking about maximizing the ROI from a PLM implementation, topics such as focusing on business areas that support evolving business processes and measuring (make sure you have performance metrics) came up.
The next session, Staying Ahead of the Curve through PLM Roadmap Reinvention, conducted by Austin Mallis, VP Operations at Fashion Avenue Sweater Knits, beautifully completed the previous sessions related to PLM.
Austin nicely talked about setting the right expectations for the future (there is no perfect solution / success does not mean stop / keep the PLM vision / no true end). In addition, he described the human side of the implementation: how to get everyone on board (if possible), while admitting you cannot get everyone on board for the new way of working.
Luckily, the speakers before me that day had already addressed many of the relevant topics, and I could focus on three main thoughts completing the story:
1. Who decides on PLM and Why?
I published the results from a small survey I did a month ago via my blog (A quick PLM survey). See the main results below.
It was interesting to observe that the management and the users in the field together form the majority demanding PLM. Consultants have some influence, and PLM vendors even less. The big challenge for a company is that the management and consultants often talk about PLM from a strategic point of view, whereas the PLM vendor and the users in the field are more focused on the tool(s).
From the expectations, you can see that the majority of PLM implementations are about improving collaboration, followed by time to market, increased quality, and centralizing and managing all related information.
2. Sharing data instead of owning data
You might have read about it several times in my blog: the trend that we are moving to platforms with connected data instead of file repositories. This should have an impact on your future PLM decisions.
3. Choosing the right people
The third and final thought was about choosing the right people and understanding the blockers. I elaborated on that topic before, in my recent blog post: PLM and Blockers
My conclusions for the day were:
A successful PLM implementation requires a connection in communication and explanation between all these levels. This is needed to get a company aligned and to have an anchored vision before even starting to implement a system (with the best partner).
The day was closed by the final keynote, from Lauren Bowker, heading T H E U N S E E N. She and her team are exploring combinations of chemistry and materials to create new fashion artifacts: clothes and materials that change color based on airflow, air pollution or brain patterns. New and inspiring directions for fashion lovers.
Have a look here: http://seetheunseen.co.uk/
The morning started with Suzanne Lee, heading BioCouture, who is working on various innovative methodologies to create materials for the apparel industry by using all kinds of living micro-organisms, like bacteria, fungi and algae, and materials like cellulose, chitin and protein fibers, which can all provide new possibilities for sustainability, comfort, design, etc. Suzanne's research is about exploring these directions, perhaps shaping some new trends five to ten years ahead. Have a look into the future here:
Renate Eder took us into the journey of visualization within Adidas, with her session: Utilizing Virtualization to Create and Sell Products in a Sustainable Manner.
It was interesting to learn that ten years ago she started the process of getting more 3D models into the sales catalogue. Where classical manufacturing companies nowadays start from a 3D design, at Adidas 3D starts at the end of the sales cycle. Logical, if you see the importance and value 3D can have for mass-market products.
Adidas was able to get 16,000 products into their 3D catalogue thanks to the work of 60 of their key suppliers, who were fully integrated into the catalogue process. The benefit of this 3D catalogue was that their customers, often the large stores, need fewer samples, and the savings are significant (plus a digital process instead of transferring goods).
An interesting discussion during the Q&A was that the virtual product might even look more perfect than the real product, demonstrating how lifelike virtual products can be.
And now Adidas is working further backwards, from production patterns (using 3D) toward, eventually, 3D design. Although a virtual 3D product cannot 100 % replace the fit and feel of the material, Renate believes that introducing 3D during design can also reduce the work done during pilots.
Finally, for those who stayed till the end, there was something entirely different: Di Mainstone elaborating on her project Merging Architecture & the Body in Transforming the Brooklyn Bridge into a Playable Harp. If you want something entirely different, watch here:
The apparel industry remains an exciting industry to follow. Some of the concepts – being data-centric, insanely flexible, continuously changing and with rapid time to market – are crucial here.
This might lead the development of PLM vendors for the future, including delivering PLM based on cloud technology.
On the other side, the PLM market in apparel is still very basic and learning; see this card that I picked up from one of the vendors. It focuses on features and functions, not (yet) touching the value.
Friends, this is the first evening that there is no soccer on television for two weeks. So I have time to write something.
Currently, I am preparing my session for PI Apparel 2014 in London on 15/16 July. Last year's PI Apparel was a discovery for me, as the audience was so different compared to classical PLM conferences. Of course, the products might not be as complex, but the time-to-market needs and, therefore, the need to work as fast and concurrently as possible are a huge differentiator. See my post from last year's conference here: The weekend after PI Apparel 2013
In a way, PLM for apparel companies is more data-centric than in some of the original industries where PLM was born. In those traditional industries, file management and document sharing were the initial drivers.
At this year's conference, I will talk about the change in the way people work that comes with a PLM implementation. I will share the full story plus my observations in my next post, at the end of July.
Before that, I have a request to all readers of this blog who work for a company that has implemented or is starting to implement PLM: please answer the two questions in the survey below. The answers will help me confirm my assumptions or change my mind.
So if you have some time between the soccer matches, please respond to the survey below if you qualify:
Thanks and enjoy the upcoming matches
Two weeks ago I attended the Nobletek PLM forum in Belgium, where a group of experts, managers and users discussed topics related to my favorite theme: “Is PLM changing? “
Dick Terleth (ADSE) led a discussion with the title "PLM and Configuration Management as a proper profession" or "How can the little man grow?". The context of the discussion was the question: "How is it possible that the benefits of PLM (and Configuration Management) are not understood at C-level?", or in other words: "Why is the value of Configuration Management and PLM not obvious?"
In my previous post, PLM is doomed unless …., I quoted Ed Lopategui (www.eng-eng.com), who commented that being a PLM champion (or a Configuration Management expert, as Dick Terleth would add) is bad for your career. Dick Terleth asked the same question, showing pictures of the self-assured accountant and the Configuration Management or PLM professional (thanks, Dick, for the pictures). Which job would you prefer?
The PLM ROI discussion
A first attempt to understand the difference could be related to the ROI discussion, which seems to apply only to PLM. Apparently, ERP and financial management systems are a must for companies; no ROI discussion there. Persons who can control and report the numbers seem to have the company under control. For the CEO and CFO, the value of PLM is often unclear. And to make it worse, PLM vendors and implementers are fighting for their unique definition of PLM, so we cannot blame companies for being confused. This makes it clear that if you haven't invested significant time in understanding PLM, it will be hard to see the big picture. And at C-level, people do not invest significant time in understanding the topic. It is the C-level executive's education, background or work experience that makes him or her decide.
So if the C-level is not educated on PLM, somebody has to sell the value to them. Oleg Shilovitsky wrote about it recently in his post Why is it hard to sell PLM ROI, and another respected blogger, Joe Barkai, sees the sun come up from behind the cloud in his latest post PLM Service Providers Ready To Deliver Greater Value. If you follow the posts of independent PLM bloggers (although, who is 100 % independent?), you will see a common understanding that implementing PLM currently requires a business transformation, as old processes were not designed for a modern infrastructure and digital capabilities.
PLM is about (changing) business processes
Back to the Nobletek PLM forum. Douglas Noordhoorn, the moderator of the forum, challenged the audience, stating that PLM has always been there (or not there, if you haven't discovered it). It is all about managing the product development processes in a secure way – not talking about "best practices" but "good practices." Those who had a proper education in the aerospace industry learned that good processes are crucial to delivering planes that can fly and are reliable.
Of course, the aerospace industry is not the same as other industries. However, more and more other industries in my network, like nuclear new build, the construction industry and other Engineering, Procurement and Construction companies, want to learn from aerospace and automotive good practices. They realize they are losing market share because the cost of failure combined with relatively high labor costs makes them too expensive. But where do they get their proper good-practices education?
The PLM professional?
And this was an interesting point coming up from the Nobletek forum: there is no proper, product-agnostic education for PLM (anymore). If you study logistics, you learn a lot about various processes and how they can be optimized for a certain scenario. When you study engineering, there is a lot of focus on engineering disciplines and methods, but there is no time to educate engineers in depth to understand the whole product development process and how to control it. Sometimes I give a guest lecture to engineering classes; it is never an important part of the education.
To become a PLM professional
For those who never had any education in standard engineering processes, there is Frank Watts' engineering control book, which would probably be a good base. But it is not only the PLM professional who should be aware of these good practices. All companies manufacturing products, plants or buildings should learn these basics. As a side effect, it would make the discussion around BIM clearer. At this time, manufacturing companies are discovering their good practices the hard way, over and over again.
And when this education exists, companies will be aware that it is not only about the tools; it is about the way information flows through the organization. There is even a chance that somewhere at C-level someone has been educated and understands the value. For ERP, everyone agrees. For PLM, it remains a labyrinth of processes designed by companies currently learning on the job, with vendors and implementers pushing what they have learned. Engineering is often considered a hard-to-manage discipline. As a SAP country manager once said to me: "Engineers are actually resources that do not want to be managed, but we will get them …"
And then the future ……
I support the demand for better education in engineering processes, especially for industries outside aerospace or automotive. I doubt it will have a significant impact, although it might create visibility and understanding for PLM at C-level. There would be no need anymore for the lone ranger who fights for PLM. Companies would have better-educated people who understand the need for the good practices that exist. These good practices will be the base for companies when discussing with PLM vendors and implementers. Instead of vendors and implementers pushing their vision, you can articulate and follow your own vision.
However, we need a new standard book too. We are currently in the middle of a big change: thanks to modern technology and connectivity, the world is changing. I wrote and spoke about it in Did you notice PLM is changing?
This awareness needs to become visible at C-level.
Who will educate them ??
Now back to soccer – four years ago, Spain vs. The Netherlands was the last match, the final. Now it is their first match – will the Dutch change the game?
Human beings are strange creatures. We think we make decisions based on logic, and we think we act based on logic. In reality, however, we do not like to change if it does not feel good, and we are lazy in changing our habits.
Disclaimer: this is a generalization which is valid for 99 % of the population. So if you feel offended by the previous statement, be happy, as you are one of the happy few.
Our inability to change can be seen in the economy (only the happy few share). We see it in relation to global climate change. We see it in territorial fights all around the world.
Owning instead of sharing?
The cartoon below gives an interesting insight into how personal interests are perceived as more important than the general interest.
It is our brain!
More and more I realize that the success of PLM is also related to human behavior: we like to own and find it difficult to share. PLM is primarily about sharing data through all stages of the lifecycle. A valid reason why sharing is rare is that current PLM systems and their infrastructures are still too complex to deliver shared information with ease. However, the potential benefits are clear when a company is able to transform its business into a sharing model and therefore react to and anticipate the outside world much faster.
But sharing is not in our genes, as:
- In current business knowledge is power. Companies fight for their IP; individuals fight for their job security by keeping some specific IP to themselves.
- As a biological organism, composed of a collection of cells, we are focused on survival of our genes. Own body/family first is our biological message.
Breaking these habits is difficult, and I will give some examples that I noticed in the past few weeks. Of course, this is not a complete surprise for readers of my blog, as a large number of my recent posts are related to the complexity of change. Some are related to human behavior:
Ed Lopategui, an interesting PLM blogger (see http://eng-eng.com), wrote a long comment on my PLM and Blockers post. The (long) quote below describes exactly what makes PLM difficult to implement within a company full of blockers:
“I also know that I was focused on doing the right thing – even if cost me my position; and there were many blockers who plotted exactly that. I wore that determination as a sort of self-imposed diplomatic immunity and would use it to protect my team and concentrate any wrath on just myself. My partner in that venture, the chief IT architect admitted on several occasions that we wouldn’t have been successful if I had actually cared what happened to my position – since I had to throw myself and the project in front of so many trains. I owe him for believing in me.
But there was a balance. I could not allow myself to reach a point of arrogance; I would reserve enough empathy for the blockers to listen at just the right moments, and win them over. I spent more time in the trenches than most would reasonably allow. It was a ridiculously hard thing and was not without an intellectual and emotional cost.
In that crucible, I realized that finding people with such perspective (putting the ideal above their own position) within each corporation is *exceptionally* rare. People naturally don’t like to jump in front of trains. It can be career-limiting. That’s kind of a problem, don’t you think? It’s a limiting factor without a doubt, and not one that can be fulfilled with consultants alone. You often need someone with internal street cred and long-earned reputation to push through the tough parts”
Ed concludes that it is exceptionally rare to find people putting the ideal above their own position – again referring to the opening statement that only a (happy) few are advocates for change.
Now let's look at some facts about why it is so exceptionally rare, so we can feel less guilty.
Although On Intelligence by Jeff Hawkins was not the easiest book to read during a holiday, it was well written considering the complexity of the topic. Jeff describes how the information architecture of the brain could work, based on the layering of the neocortex.
In his model, he describes how the brain processes information from our senses, first in a specific manner, but then more and more in an invariant approach. You have to read the book to get the full meaning of this model. The eye-opener for me was that Jeff describes the brain as a prediction engine. All the time, the brain anticipates what is going to happen, based on years of learning. That is why we need to learn and practice: to build and enrich this information model.
And the more and more specialized you are on a particular topic, it can be knowledge but it can also be motoric skill, the deeper in the neocortex this pattern is anchored. This makes is hard to change (bad) practices.
The book goes much further, and I was reading it more in the context of how artificial intelligence or brain-like intelligence could support the boring PLM activities. I got nice insights from it, However the main side observation was; it is hard to change our patterns. So if you are not aware of it, your subconscious will always find reasons to reject a change. Follow the predictions !
Thinking Fast and Slow
And this is exactly the connection with another book I read before: Thinking, Fast and Slow by Daniel Kahneman. Daniel explains that our brain runs its activities on two systems:
System 1: makes fast and automatic decisions based on stereotypes and emotions. System 1 is what we use most of the time, often running in subconscious mode. It does not cost us much energy to run in this mode.
System 2: takes more energy and time; therefore, it is slow and pushes us to be conscious and alert. Still, system 2 can be influenced by various external, subconscious factors.
Thinking, Fast and Slow nicely complements On Intelligence: system 1, as described by Daniel Kahneman, is similar to what Jeff Hawkins describes as the prediction engine. It runs in a subconscious mode with optimal energy consumption, allowing us to survive most of the time.
Fast thinking leads to boiling frogs
And this links again to the boiling frog syndrome. If you are not familiar with the term, follow the link. In general, it means that people (and businesses) do not react to (life-threatening) outside change when it happens slowly, but would react immediately if confronted with the end result (no more business, no more competitive position).
Conclusion: our brain by default wants to keep running in predictive mode, so implementing a business change is challenging, as all changes are painful and go against our subconscious system.
So is PLM doomed, unless we change our brain's behavior?
The fact that we no longer live in caves illustrates that there have always been those happy few who took a risk and a next step into the future by questioning and changing comfortable habits. Daniel Kahneman's system 2 and Jeff Hawkins both talk about the energy it takes to change habits and to learn new predictive mechanisms. But it can be done.
I see two major trends that will force the classical PLM to change:
- The amount of connected data has become so huge that it no longer makes sense to store and structure all the information in a single system. The time required to structure data does not deliver enough ROI in a fast-moving society. The old "single system that stores all" concept is dying.
- The newer generations (generation Y and beyond) grew up with the notion that it is impossible to learn, capture and own all specific information. They developed different skills to interpret data available from various sources, not necessarily owning and managing it all.
These two trends lead to the point where it becomes clear that single-system thinking is becoming obsolete. The future will be about connectivity and interpretation of connected data, used by apps running on a platform. The openness of one platform towards other platforms is crucial and will be the weakest link.
The PLM vision is not doomed, and with a new generation of knowledge workers the "brain change" has already started. The challenge is to implement the vision across systems and silos in an organization. For that, we need to be aware that it can be done and find the "happy few" in our company to enable it.
What do you think?
The past month I had several discussions related to the complexity of PLM. Why is PLM perceived as complex? Why is it hard to sell PLM internally within an organization? Or, to phrase it differently: "What makes PLM so difficult for normal human beings, when conceptually it is not so complex?"
So what makes it complex? What's behind PLM?
The main concept behind PLM is that people share data. It can be around a project, a product or a plant, through the whole lifecycle. In particular during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature. You could decide to wait until everything is mature before sharing it with others (the classical sequential manner); however, the chance of doing it right the first time is low. Several iterations between disciplines will be required before the data is approved. The more sequentially a company works, the higher the cost of changes and the longer the time to market. Due to the rigidness of this sequential approach, it becomes difficult to respond rapidly to customer or market demands. Therefore, in theory (and it is not only a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on not-yet-approved data.
PLM goes further; it is also about sharing data beyond engineering. Because PLM originally started in the early phases of the lifecycle, the concept was often considered something related to engineering. And to be fair, most of the (CAD-related) PLM vendors focus heavily on the early stages of the lifecycle, which strengthens this idea. However, sharing can go much further, e.g. early involvement of suppliers (still engineering) or support for after-sales/service (the new acronym: SLM). In my recent blog posts I discussed the concepts of SLM and the required data model for it.
The complexity lies in the word "sharing". What does sharing mean for an organization where, historically, every person was rewarded for the knowledge he/she owned, instead of being rewarded for the knowledge this person made available and shared? Many so-called PLM implementations have failed to reach the sharing target because the implementation focus was on storing data per discipline, not necessarily on storing data so it becomes shareable and usable by others. This is a huge difference.
Some famous (ERP) vendors claim that if you store everything in their system, you have a "single version of the truth". Sounds attractive. My garbage bin at home is also a place where everything ends up, but a garbage bin has not been designed for sharing, as another person has neither the clue nor the time to analyze what is inside. Even data in the same system can remain hidden from others if the way to find the data was never anticipated.
Data sharing instead of document deliverables
The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose, but that, with some extra effort, makes it usable and searchable for others. A typical example is drawing and document management, where the whole process for a person is focused on delivering a specific document. Fine for that purpose, but this document on its own becomes a legacy for the long term, as you need to know (or remember) what is inside it.
A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, thereby providing specific views for different roles in the organization. Sharing becomes possible, and it can be online. Nobody needs to consolidate and extract data from documents (Excel files?) anymore.
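The idea of sharing data elements instead of documents can be illustrated with a minimal sketch. The item name, attribute types and values below are hypothetical, not taken from any particular PLM system; the point is only that role-specific "views" become simple queries over the same shared elements:

```python
# Product information stored as individual data elements rather than
# locked inside monolithic documents. Each element carries a type so
# different roles can query the shared data for their own view.
product = [
    {"item": "PUMP-100", "type": "requirement", "value": "Max flow 120 l/min"},
    {"item": "PUMP-100", "type": "physical",    "value": "Mass 4.2 kg"},
    {"item": "PUMP-100", "type": "logistical",  "value": "Supplier ACME, lead time 6 weeks"},
]

def view(elements, *types):
    """Build a role-specific report by selecting only the relevant element types."""
    return [e["value"] for e in elements if e["type"] in types]

# Engineering and purchasing share the same data but see different views;
# nobody has to extract and consolidate the data from a document first.
engineering_view = view(product, "requirement", "physical")
purchasing_view = view(product, "logistical")
```

In a document-centric world, each of these views would be a separate file to be written, distributed and kept in sync by hand; here they are derived on demand from the single set of shared elements.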
This does not fit older generations or departmentally managed business units that are rewarded only on their individual efficiency. Have a look at this LinkedIn discussion, where the two extremes are visible.
“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”
And at the other side, Kais states:
“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”
Readers of my blog will understand that I am very much aligned with Kais, and that PLM guys have a hard time convincing Joe of the benefits of PLM (I did not try).
Making the change happen
Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas about what would be possible. However, the outside world is hard to convince, and here it becomes a matter of change management.
I read an interesting article in IndustryWeek by John Dyer with the title: What Motivates Blockers to Resist Change?
John describes the various types of blockers, and reading the article with my PLM-twisted brain, I understood again why PLM is perceived as complex: you need to change, and there are blockers:
Blocker (noun) – Someone who purposefully opposes any change (improvement) to a process for personal reasons
“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.
The problem with blockers
The combination of business change and the existence of blockers is one of the biggest risks for companies going through a business transformation. By the way, this is not only related to PLM; it applies to any required change in business.
A company I have been working with was eager to study its path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase, they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.
A second example is Nokia. Nokia was famous for the way it was able to transform its business in the past. How come they did not see smartphones and touch screens coming? Apparently, based on several articles published recently, it was Nokia's internal culture and superior feeling of dominating the market that made it impossible to switch. The technology was known, the concepts were there; however, the (middle) management was full of blockers.
Two examples where blockers had a huge impact on the company.
Staying in business and remaining competitive is crucial for companies. In particular, the changes currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.