Shaping the PLM platform of the Future
It was the first time I attended this event, and I was positively surprised by the audience and content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) was looking in more detail at the future of PLM. Themes like PLM platforms, the Circular Economy, Open Standards and longevity of data were presented and discussed.
The emergence of the PLM platform
1. The product lifecycle will become more and more circular due to changing business models, and in parallel the different usage/availability of materials will have an impact on how we design and deliver products
Can current processes and tools support today's complexity? And what about tomorrow? According to a CIMdata survey there is a clear difference in profit and performance between leaders and followers, and the gap is widening ever faster. "Can you afford to be a follower?" is a question companies should ask themselves.
Rethinking the PLM platform does not bring a 2-3 % efficiency benefit; it can bring benefits of 20 % and more.
Peter sees a federated platform as a must for companies to survive. I particularly liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Steven Vetterman from ProSTEP talked about PLM in the automotive industry. Steven started by describing the change in the automotive industry, quoting Heraclitus: Τα πάντα ρεί – the only constant is change. He described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change of the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050 more than 50 % of the world population (estimated at almost 10 billion people by then) will be living in Asia, and 25 % in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical design. Neither are they open to true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementations. There is a need for open PLM, not driven by a single PLM system, but based on a federated environment of information.
Yves Baudier spoke on behalf of the aerospace industry about the standardization effort of their Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE Systems and more. If you look at the ASD Radar, you get a feeling for the complexity of the standards that exist and are relevant for the Airbus group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. The conclusion from Yves, like all previous speakers, was: the PLM platform of the future will be federative, and standards will enable PLM interoperability.
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of current trends in their business and the role of PLM at Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive down costs and, above all, improve business agility, as the future is in flexibility. Shefali gave an overview of their PLM roadmap covering the aspects of PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space). The PLM backbone provides connectivity of data between all lifecycle stages and external partners (customers/suppliers) based on the PLCS standard. Again, another session demonstrating that the future of PLM is in an open and federated environment.
The future PLM platform is a federated platform which adheres to standards and provides openness of interfaces, permitting the platform to be reliable over multiple upgrade cycles and able to integrate third parties (Peter Bilello)
In the afternoon I followed the Systems Engineering track. Peter Bilello gave an overview of model-based systems engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. Indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is confused with the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that systems engineering should be a part of PLM, and he gave a very decent, academic overview of how it all is related. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual, system view and a physical product view:
More Industry voices
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, of every 100 pieces of information/exchange to be managed during the full life cycle, only 5 will be managed during the initial design phase (BIM), 20 during the construction phase (BAM) and finally 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes from the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrated with the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Amir Rashid described the need for PLM in the consumer product markets, stating the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the product, and everything is scrapped to landfill afterward. Interesting quote from Amir: Sustainability's goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of their product development since 1984. The diagram below demonstrates how the circular economy, when well-orchestrated, can impact all business today.
Marc Halpern closed the tracks with his presentation around Product Innovation Platforms, describing how product design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understanding and analyzing big data), adaptability (flexibility to integrate and maintain through an open service-oriented architecture), reuse (identifying similarity based on metadata and geometry), discovery (the integration of search, analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner's vision, there is still a lot of work for PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent this vision from being implemented:
- inadequate openness (pushing back open collaboration)
- incomplete standards (blocking implementation of openness)
- uncertain cloud performance (the future is in cloud services)
- the steep learning curve (it is a big mind shift for companies)
- cyber-terrorism (where is your data safe?)
After Marc's session there was an interesting panel discussion with some of the speakers from that day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: what about change management?
A topic that could fill the rest of the week but the PDT dinner was waiting – a good place to network and digest the day.
Day 2 started with two interesting topics. The first presentation was a joint presentation from Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones). The topic was the obsolescence of information systems: hardware and PLM applications. In the aerospace industry some data needs to be available for 75 years, and you can imagine that during 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a proof of concept with virtualization of different platforms supporting CATIA V4/V5 using Linux, Windows XP, W7 and W8, which covers just a small part of all the data.
The conclusion from this session was:
To benefit from the PLM of the future, the PLM of the past has to be managed. Migration is not the only answer; look for solutions that exist to mitigate the risks and reduce the costs of PLM obsolescence. Usage of and compliance with standards is crucial.
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: Interoperability is a right, not a privilege
In the systems engineering track, Kent Freeland talked about nuclear knowledge management and CM in systems engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management does not equal storing information in a central place; it is about building and providing data in context so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for systems engineering based on ISO 15926-11. His simplified approach compared to ISO 15288 led to several discussions between supporters and opponents during lunchtime.
Master Data Management
Based on the type of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements I also see more and more that the digital transformation leads to MDM questions. Can we replace Excel files with mastered data in a database?
Almost at the end of the day I spoke about the PLM platform of the future, targeted at the people from the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to own and capture as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning (information) is not crucial for them, perhaps because the world is moving fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in that. PLM is no longer cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately I could not attend all sessions, as several ran in parallel, nor have I written about all the sessions I attended. The PDT Europe conference, a conference for people who care about the details of future PLM concepts and the usage of standards, is a must for future strategists.
This is, for the moment, the last post about the difference between files and a data-oriented approach. This time I will focus on the need for open exchange standards and the relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits, and I pointed to background information for those who want to learn more in detail. In my second post, I gave the example of dealing with specifications.
It demonstrated that the real value of a data-centric approach comes at the moment the information changes over time. For a specification that is right the first time and never changes, there is less value to gain with a data-centric approach. But then, aren't we still dreaming that we do everything right the first time?
The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits are valid for diagrams and schematics (2D information) and CAD models (3D information).
The challenge for a data-oriented approach is that information needs to be stored as data elements in a database, independent of an individual file format. For text, this might be easy to comprehend, as text elements are relatively simple to understand. Still, the OpenDocument standard for office documents builds on a lot of technical know-how and experience in the background to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.
CAD vendors have various reasons not to store their information in a neutral format.
- First of all, and most important for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, reducing the potential market capture. You could say that, in a certain manner, the Autodesk 2D format DXF (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF data format. So far DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements.
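To make the last point concrete: ASCII DXF is essentially a flat sequence of (group code, value) pairs, so each pair could just as well live as a row in a database instead of a line in a file. The sketch below is illustrative only, with an invented sample entity; real DXF handling is far richer (in practice you would use a library such as ezdxf).

```python
# Minimal sketch: reading DXF tag pairs (group code / value) so they could be
# stored as queryable data elements instead of an opaque file.

def read_dxf_tags(lines):
    """Yield (group_code, value) pairs from ASCII DXF content."""
    it = iter(lines)
    for code_line in it:
        value = next(it)
        yield int(code_line.strip()), value.strip()

# A hypothetical LINE entity, written as the alternating code/value lines
# that appear in an ASCII DXF file.
sample = [
    "0", "LINE",      # entity type
    "8", "Walls",     # layer name
    "10", "0.0",      # start point X
    "20", "0.0",      # start point Y
    "11", "5.0",      # end point X
    "21", "2.5",      # end point Y
]

tags = list(read_dxf_tags(sample))
# Each pair could become a row (entity_id, group_code, value) in a table,
# making individual elements searchable rather than locked inside a file.
```

Once the tags are rows, "give me all entities on layer Walls" becomes a plain database query instead of a file parse.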
- This brings us to the second reason why using neutral data formats is not that evident for CAD vendors: it reduces their flexibility to change the format and optimize it for maximal performance. Commercially, the significant, immediate disadvantage of working in neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any "intelligent" manipulations of the data are hard to achieve.
The same reasoning applies to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to identify a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo, which again differs from NX, SolidWorks, Solid Edge and Inventor, even though some of them might use the same CAD kernel.
However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing them. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.
PLM, ERP, systems and single source of truth
This brings us into the world of data management, in my world mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is available as metadata, as are all the scheduling and interactions with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.
PLM systems have gradually become more and more data-centric, as their origin was around engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM and ERP side make it hard to exchange information from one system to the other. It is for that reason that there are so many discussions around PLM and ERP integration and the BOM.
In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide, actually an enterprise-wide or industry-wide, data definition of all information that is relevant for the business processes. This leads into Master Data Management, the new required skill for enterprise solution architects.
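The item-exchange problem described above can be sketched in a few lines: each system names and shapes its "item" differently, and an agreed (master) field mapping translates between them. All field names here are invented for illustration; a real MDM exercise covers far more than renaming fields (identity, ownership, lifecycle rules).

```python
# Hypothetical sketch of translating a PLM item into an ERP-shaped record
# via an agreed field mapping -- the kind of shared definition an MDM
# initiative would formalize. Field names are invented.

PLM_ITEM = {"itemId": "P-1001", "rev": "B", "title": "Bracket", "state": "Released"}

# ERP field -> PLM field; agreed once, used by every integration.
ERP_FIELD_MAP = {"material": "itemId", "revision_level": "rev", "description": "title"}

def plm_to_erp(plm_item, field_map):
    """Build an ERP-shaped record from a PLM item using the shared mapping."""
    return {erp_field: plm_item[plm_field]
            for erp_field, plm_field in field_map.items()}

erp_item = plm_to_erp(PLM_ITEM, ERP_FIELD_MAP)
# Without such a shared definition, every PLM-ERP interface reinvents
# this translation, which is exactly the interoperability pain described above.
```

The point of the sketch is that the mapping is data, maintained once, rather than logic buried in each point-to-point integration.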
The data-centric approach creates the impression that you can achieve a single source of truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion this is more a black hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without being designed to be found again is the big challenge here.
Other PLM and ERP vendors have different approaches. Some choose a service bus architecture, where applications in the background link and synchronize common data elements from each application. There is some redundancy; however, everything is connected. More and more PLM vendors focus on building a platform of connected data elements on top of which applications will run, like the 3DExperience platform from Dassault Systèmes.
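The service-bus idea can be reduced to a toy model: each application keeps its own (redundant) copy of common data elements, and a bus propagates changes so the copies stay connected. The class names are invented; a real enterprise service bus adds routing, transformation and delivery guarantees on top of this core pattern.

```python
# Toy sketch of service-bus synchronization: redundant copies, kept in sync.

class Bus:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, app):
        self.subscribers.append(app)

    def publish(self, source, key, value):
        # Push the change to every application except the one that made it.
        for app in self.subscribers:
            if app is not source:
                app.data[key] = value

class App:
    def __init__(self, name, bus):
        self.name, self.data, self.bus = name, {}, bus
        bus.subscribe(self)

    def update(self, key, value):
        self.data[key] = value              # local (redundant) copy
        self.bus.publish(self, key, value)  # keep the other systems connected

bus = Bus()
plm, erp = App("PLM", bus), App("ERP", bus)
plm.update("item:P-1001:state", "Released")
# The ERP copy now also holds the released state: redundant, but connected.
```

This is the trade-off named in the text: you accept duplicated data elements in exchange for every application seeing a consistent picture.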
As users, we are more and more used to platforms, as Google and Apple already provide them in the cloud for common use on our smartphones. The large number of apps run on shared data elements (contacts, locations, …) and store additional proprietary data.
Platforms, Networks and standards
And here we enter an interesting area of discussion. I think it is a given that a single database concept is a utopia. Therefore, it will all be about how systems and platforms communicate with each other to provide, in the end, the right information to the user. The systems and platforms need to be data-centric, as we learned from the discussion around the document-centric (file-centric) versus data-centric approach.
In this domain, several companies have already been active for years. Datamation from Dr. Kais Al-Timimi in the UK is such a company. Kais is a veteran of the PLM and data modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:
“……. the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.
It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.
This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”
Another company in the same domain is Eurostep, also focusing on business collaboration in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233, and has developed their Share-A-space platform to enable data-centric collaboration.
This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process and construction industries are currently also discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure it will not come from the software vendors, as I discussed before.
If you have reached this line, it means the topic has been interesting in depth for you. In the past three posts, starting from the future trend, through an example, to the data modeling background, I have tried to describe what is happening in a simplified manner.
If you really want to dive into the PLM for the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15. Here experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.
Some more to read:
Last week I attended the PI Apparel conference in London. It was the second time this event was organized, and approximately 100 participants were there for two full days of presentations and arranged network meetings. Last year I was extremely excited about this event, as the audience, compared to classical PLM events, was different and much more business-focused.
Read my review from last year here: The weekend after PI Apparel 2013
This year I had the feeling that the audience was somewhat smaller, missing some of the US representatives, and perhaps there was a slightly more visible influence from the sponsoring vendors. Still an enjoyable event, and hopefully next year, when this event will be hosted in New York, it will be as active as last year.
Here are some of my observations.
Again the event had several tracks in parallel besides the keynotes, and I look forward in the upcoming month to seeing the sessions I could not attend. Obviously, where possible, I followed the PLM-focused sessions.
The first keynote came from Micaela le Divelec Lemmi, Executive Vice President and Chief Corporate Operations Officer of Gucci. She talked us through the areas she supervises and gave some great insights. She talked about how Gucci addresses sustainability through risk and cost control: which raw materials to use, how to ensure the brand's reputation is not at risk, price volatility and the war on talent. As Gucci is a brand in the high-end price segment, image and reputation are critical, and they have the margins to assure these are managed. Micaela spoke about the short-term financial goals that a company like Gucci has toward its investors. The topics she mentioned (I did not write them down, as I was tweeting when I heard them) were certainly worthwhile to consider and discuss in detail with a PLM consultant.
Micaela further described Gucci's corporate social responsibility program, with a focus on taking care of people, environment and culture. Good to learn that humane working conditions and rights are a priority, even in their supply chain, although it might be noted that 75 % of Gucci's supply chain is in Italy. One of the few brands that still has the "Made in Italy" label.
My conclusion was that Micaela did an excellent PR job for Gucci, which you would expect for a brand with such a reputation. Later during the conference we discussed whether other brands with less exclusivity, operating more in the mass consumer domain, would be able to come even close to such programs.
The company is successful in manufacturing and selling licensed products from Pierre Cardin, Cacharel and US Polo Association mainly outside the US and Western Europe.
Their primary focus was to provide access to the most accurate and most up-to-date information from one source. In parallel, standardization of codes and tech packs was a driver. Through standardization, quality and (re)use could be improved, and people would better understand the details. Additional goals were typical PLM goals: following the product development stages along the timeline, notifying relevant users about changes in the design, working on libraries and reuse, and integrating with SAP.
Interestingly, Hakan mentioned that in their case SAP did not recommend using their system for the PLM-related part due to a lack of knowledge of the apparel industry. A wise decision that would deserve follow-up in other industries.
In general, the PLM implementation described by Göktug and Hakan was well phased, with a top-down push to secure that there is no escape from making the change. As with all PLM implementations in apparel, they went live with their first phase rather fast, as the complex CAD integrations of classical PLM implementations were not needed here.
Next I attended the Infor session with the title Work the Way you Live: PLM built for the User. A smooth marketing session with a function/feature demo demonstrating the flexibility and configuration capabilities of the interface. Ease of use is crucial in the apparel industry, where Excel is still the biggest competitor. While Excel might satisfy the needs of the individual, it lacks the integration and collaboration aspects a PLM system can offer.
More interesting was the next session, from Marcel Oosthuis, Process Re-Engineering Director (read: PLM leader) at Tommy Hilfiger. Marcel described how they had implemented PLM there, and it was an excellent story (perhaps too good to be true).
I believe larger companies with the right focus and investment in PLM resources can achieve these kinds of results. The target for Tommy Hilfiger's PLM implementation was beyond 1000 users; a serious implementation, therefore.
Upfront, the team first defined what they expected from the PLM system to be selected (excellent!). As the fashion industry is fast, demanding and changing all the time, the PLM system needs to be swift, flexible and prepared for change. This was not a classical PLM requirement.
In addition, they were looking for a highly configurable system, providing best practices, and a vendor with a roadmap they could influence. Here I got a little more worried, as highly configurable systems and best practices do not always match the prepared-for-change approach. A company might be tempted to automate the way it worked in the past (best practices from the past).
It was good to hear that Marcel did not have to go through the classical ROI approach for the system. His statement, which I fully endorse, is that it is about the capability to implement new and better processes, which are often not comparable with the past (and nobody measured the past).
Marcel described how the PLM team (eight people plus three external from the PLM vendor) made sure that the implementation was done with the involvement of the end users. End user adoption was crucial, as was key user involvement when building and configuring the system.
It was one of the few PLM stories where I hear how all levels of the organization were connected and involved.
Next, Sue Butler, director at Kurt Salmon, described how to maximize ROI from your PLM investment. It is clear that many PLM consultants are aligned, and Sue brought up all the relevant points and angles you need to look at for a successful PLM implementation.
Main point: PLM is about changing the organization and processes, not about implementing a tool. She made the point that piloting the software is necessary as part of the learning and validation process. I agree, under the condition that it is an agile pilot that does not take months to define and perform. In that case, you might already be locked into the tool vision too much; focus on the new processes you want to achieve.
Moreover, because Sue was talking about maximizing ROI from a PLM implementation, the topics of focusing on business areas that support evolving business processes and of measuring (make sure you have performance metrics) came up.
The next session, Staying Ahead of the Curve through PLM Roadmap Reinvention, conducted by Austin Mallis, VP Operations, Fashion Avenue Sweater Knits, beautifully completed the previous sessions related to PLM.
Austin nicely talked about setting the right expectations for the future (there is no perfect solution / success does not mean stop / keep the PLM vision / no true end). In addition, he described the human side of the implementation: how to get everyone on board (if possible), and admitting you cannot get everyone on board for the new way of working.
Luckily, the speakers before me that day had already addressed many of the relevant topics, and I could focus on three main thoughts completing the story:
1. Who decides on PLM and Why?
I published the results from a small survey I did a month ago via my blog (A quick PLM survey). See the main results below.
It was interesting to observe that the management and the users in the field together form the majority demanding PLM. Consultants have some influence, and PLM vendors even less. The big challenge for a company is that the management and consultants often talk about PLM from a strategic point of view, where the PLM vendors and the users in the field are more focused on the tool(s).
From the expectations you can see that the majority of PLM implementations are about improving collaboration, followed by time to market, increasing quality, and centralizing and managing all related information.
2. Sharing data instead of owning data
You might have read about it several times in my blog: the trend that we move to platforms with connected data instead of file repositories. This should have an impact on your future PLM decisions.
3. Choosing the right people
The third and final thought was about choosing the right people and understanding the blockers. I elaborated on that topic before in my recent blog post: PLM and Blockers
My conclusions for the day were:
A successful PLM implementation requires a connection in communication and explanation between all these levels. This is needed to get a company aligned and to have an anchored vision before even starting to implement a system (with the best partner).
The day was closed by the final keynote from Lauren Bowker, heading T H E U N S E E N. She and her team explore combinations of chemistry and materials to create new fashion artifacts: clothes and materials that change color based on airflow, air pollution or brain patterns. New and inspiring directions for fashion lovers.
Have a look here: http://seetheunseen.co.uk/
The morning started with Suzanne Lee, heading BioCouture, who is working on innovative methodologies to create materials for the apparel industry using all kinds of living micro-organisms, like bacteria, fungi and algae, and materials like cellulose, chitin and protein fibers, which can all provide new possibilities for sustainability, comfort, design, etc. Suzanne's research explores these directions, perhaps shaping new trends 5 to 10 years ahead. Have a look into the future here:
Renate Eder took us into the journey of visualization within Adidas, with her session: Utilizing Virtualization to Create and Sell Products in a Sustainable Manner.
It was interesting to learn that ten years ago she started the process of getting more 3D models into the sales catalogue. Where classical manufacturing companies nowadays start from a 3D design, at Adidas 3D starts at the end of the sales cycle. Logical, if you see the importance and value 3D can have for mass market products.
Adidas was able to get 16,000 models into their 3D catalogue thanks to the work of 60 of their key suppliers, who were fully integrated into the catalogue process. The benefit of this 3D catalogue is that their customers, often the large stores, need fewer samples, and the savings here are significant (plus a digital process instead of transferring goods).
Interesting discussion during the Q&A part was that the virtual product might even look more perfect than the real product, demonstrating how lifelike virtual products can be.
And now Adidas is working further backwards, from production patterns (using 3D) to, eventually, 3D design. Although a virtual 3D product cannot 100 % replace the fit and material feeling, Renate believes that introducing 3D during design can also reduce the work done during pilots.
Finally, for those who stayed till the end, there was something entirely different: Di Mainstone elaborating on her project Merging Architecture & the Body in Transforming the Brooklyn Bridge into a Playable Harp. If you want something entirely different, watch here:
The apparel industry remains an exciting industry to follow. Some of the concepts – being data-centric, insanely flexible, continuous change and rapid time to market – are crucial here.
This might lead the development of PLM vendors in the future, including solutions based on cloud technology.
On the other side, the PLM market in apparel is still very basic and learning – see this card that I picked up from one of the vendors. The focus is on features and functions, not touching the value (yet).
Everyone wants to be a game changer, and in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.
Why ? Read the next analogy.
With my Dutch roots and passion for soccer, I saw the first example of game changing happening in 1974 with soccer. The game where 22 players kick a ball from side to side, and the Germans win in the last minute.
My passion and trauma started that year, when the Dutch national team changed soccer tactics by introducing totaalvoetbal.
Defenders could play as forwards and the other way around. Combined with the offside trap, the Dutch team reached the final of the world championship both in 1974 and 1978. Of course, losing the final both times to the home-playing teams (Germany in '74, Argentina in '78, with some help from the referee, we believe).
This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept that fast.
At the same time, in the mid-seventies, a game changer for business was emerging: the PC.
In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not common yet, and the first mouse devices and the Intel 8008 processor were just coming to the market.
This was disruptive innovation at that time, as we would realize 20 years later. The PC was a game changer for business.
Johan Cruyff remained a game changer, and when he started to coach and influence the Barcelona team, it was his playing concept, tiki-taka, that brought the Spanish national team and Barcelona to the highest, almost unbeatable level in the world for the past 8 years.
Instead of having strong and tall players forcing their way towards the goal, it was all about possession and control of the ball. As long as you have the ball, the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to pass when trying to recapture the ball.
This was a game changer, hard to copy overnight – until the past two years. Now other national teams and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.
Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.
PLM, too, was supposed to be a game changer in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game?
PLM at that time connected departments and disciplines with each other in a digital manner, no matter where they were around the globe. And since the information was stored in centralized places, databases and file-sharing vaults, it created the illusion that everyone was working on the same set of data.
The major successes of PLM in this approach come from efficiency through the digitization of data exchange between departments and the digitization of processes. Already a significant step forward, bringing enough benefits to justify a PLM implementation.
Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, changing the game is all about different behavior to reach the goal; it is not about better tools (or shoes).
The diagram shows the ideal 2006 picture, where each department forwards information to the next department. But where was PLM supporting after sales/services in 2006? And the connection between after sales/services and concept is in most companies not formalized, or non-existing. And exactly that connection should give the feedback from the market, from the field, to deliver better products.
The real game changer starts when people learn and understand sharing data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so other people can find it and use it.
People are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making that data sharable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.
But our job is no longer safe as we see in the declining economies in Europe and the US. And the reason for that:
Data is changing the game
In recent years the discussion about BI (Business Intelligence) and Big Data emerged. There is more and more digital information available, and it has become impossible for companies to own all the data, or even think about storing all the data themselves and sharing it among their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can be shared (theoretically) no matter where you are and no matter which device you are using, there is a huge potential to change the game.
It is a game changer as it is not about just installing the new tools and new software. There are two major mind shifts to make.
- It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100 % digital, it might be that your customer or supplier still requires a printed and wet-signed document or drawing as a legal confirmation of the transaction. Documents are comfortable containers to share, but they kill fast and accurate processing of the data inside them.
- It is about sharing and combining data. It does not make sense to dump data again into huge databases. The value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, other attributes are created by purchasing, and some other attributes come directly from the supplier. Do not fall into the ERP trap that everything needs to be in one system and controlled by one organization.
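The part-definition example can be sketched in a few lines of Python. This is purely illustrative – the part, its attributes and the owning disciplines are made up, and no real PLM system API is involved – but it shows the idea of one shared record that several owners contribute their own attributes to:

```python
# Hypothetical attribute sets, each owned by a different discipline.
engineering = {"part_id": "P-1001", "material": "ABS", "mass_kg": 0.45}
purchasing = {"part_id": "P-1001", "preferred_supplier": "ACME", "unit_cost": 2.30}
supplier = {"part_id": "P-1001", "lead_time_days": 14, "min_order_qty": 500}

def share(*sources):
    """Combine attribute sets from several owners into one shared part record,
    refusing silently conflicting values instead of overwriting them."""
    record = {}
    for source in sources:
        for key, value in source.items():
            if key in record and record[key] != value:
                raise ValueError(f"conflicting values for '{key}'")
            record[key] = value
    return record

part = share(engineering, purchasing, supplier)
print(part["material"], part["unit_cost"], part["lead_time_days"])
```

Each discipline keeps authoring only its own attributes, yet everyone reads the same combined record – the opposite of one organization controlling everything in one system.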
Because of the availability of data, the world has become global and more transparent for companies. And what you see is that the traditional companies in Europe and the US struggle with that. Their current practices are not tuned to a digital world, but to the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.
The upcoming economies have two major benefits:
- Not so much legacy; building a digital enterprise is therefore easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
- The average cost of labor is lower than in Europe and the US; therefore, even if they do not get it right the first time, there is enough margin to spend more resources to meet the objectives.
The diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it is not a 100 % classical PLM scope anymore. It is also about social interaction, supplier execution and logistics. These areas are not classical PLM domains, and therefore I mentioned in the past that the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different?
The big consultancy firms are all addressing this topic – not necessarily on the PLM level:
2012 Cap Gemini – The Digital advantage: …..
2013 Accenture – Dealing with digital technology’s disruptive impact on the workforce
For CEOs it is important to understand that the new, upcoming generations already think in data (generation Y and beyond). By nature, they are used to sharing data instead of owning data in many respects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce
This cannot be learned from an ivory tower. The easiest way is to not worry about this trend and continue working as before, slowly losing business and margin year by year.
As in many businesses people are fired for making big mistakes, doing nothing, unfortunately, is most of the time not considered a big mistake, although it is the biggest one.
During the upcoming PI Conference in Berlin I will talk about this topic in more detail and look forward to meeting and discussing this trend with those of you who can participate.
The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the number of players and balls involved can be unlimited.
The leagues I played in were always limited in scope: by age, locality, region, etc. Therefore it was easy to win within a certain scope, and there are millions of soccer champions besides me. For business, however, there are almost no borders.
Global competition will require real champions to make it work !!!
This past month I haven't been able to publish much about my experiences, as I have been in the middle of several PLM selection processes for various industries. Now, looking back in a quiet moment, I understand it is difficult for a company to choose a PLM solution for the future.
I hope this post will bring some clarity and may lead to further discussion with other experts in the audience. I wrote about the do's and don'ts of PLM selection in 2010, and most of it is still valid; however, there is more. Some of the topics explained:
Do you really need PLM ?
This is where it starts. PLM is not Haarlemmerolie, an old Dutch medicine that has been a cure for everything since the 17th century. The first step is that you need to know what you want to achieve and how you aim to achieve it. Just because a competitor has a PLM system installed does not mean they use it properly, or that your company should do the same. If you do not know why your company needs PLM, stop reading and start investigating.
If you are still reading this, you are part of the happy few, as justifying the need for PLM is not easy. Numerous companies have purchased a PLM system just because they thought they needed PLM, or because someone was convinced that this software would bring PLM.
In most of these cases there was confusion with PDM. Simply stated: PDM is more a departmental tool (engineering, multidisciplinary), where PLM is a mix of software and infrastructure to connect all departments in a company and support the product through its entire lifecycle.
Implementing “real” PLM is a business change, as people have to start sharing data instead of pushing documents from department to department. And this business transformation is a journey. It is not a fun journey, nicely characterized in Ed Lopategui’s blog post, the PLM Trail.
Although I believe it is not always that dramatic, Ed set the expectations right. Be well prepared before you start.
Why do companies still want PLM, while it is so difficult to implement?
The main reason is to remain competitive. If margins are under pressure, you can try to be more efficient, get better and faster tools. But by working in the old way, you can only be a little better.
Moving from a sequential, information-pushing approach towards an online, global information-sharing manner is a change in business processes. It is about interaction between all stakeholders. Doing things differently requires courage, understanding and trust that you made the right choice. When it goes wrong, there are enough people around you to point fingers at why it went wrong – hindsight is so easy.
Doing nothing and becoming less and less competitive is easier (the boiling frog again), as in that case the outside world will be blamed, and there is nobody to point fingers at (although if you understand the issue, you should make the organization aware that the future is at stake).
Why is PLM so expensive?
Assuming you are still reading, and you and your management are aligned that there is a need for PLM, a first investigation into possible solutions will reveal that PLM is not cheap.
When you calculate the overall investment required for PLM, management often gets discouraged by the estimated costs. Yes, the benefits are much higher, but to realize these benefits, you need a clear understanding of your own business and a realistic idea of how the future will look. The benefits are not in efficiency. The main benefits come from capabilities that allow you to respond better and faster than by just optimizing your departments. I read a clarifying post recently, which addresses this issue: Why PLM should be on every Executive's agenda !
From my experience with PLM projects, it is surprising to learn that companies do not object to spending 5 to 20 times more money on an ERP implementation. It is related to the topic: management by results or management by means.
PLM is not expensive compared to other enterprise systems. It can become expensive (like ERP implementations) if you lose control. Software vendors have a business in selling software modules, like car resellers have a business in selling you all the comfort beyond the basics.
The same goes for implementation partners: they have a business in selling services to your company, and they need to find the balance between making money and delivering explainable value. Squeezing your implementation partner will cause a poor delivery. But giving them an open check means that, at a certain moment, someone will stand up and shut down the money drain, as the results are no longer justifiable. Often I meet companies in this stage; the spirit has gone. It is all about the balance between costs and benefits.
This happens in all enterprise software projects, and the only cure is investing in your own people. Give your employees the time and priority to work on a PLM project. People with knowledge of the business are essential, and you need IT resources to implement. Do not make the mistake of leaving the business uncommitted to the PLM implementation. Management and middle management often do not take the time to understand PLM, as they are too busy, or not educated or interested.
Make business owners accountable for the PLM implementation. You will see stress (it is not their daily job – they are busy), but in the longer term you will see understanding and the readiness of the organization to achieve the expected results.
We are the largest – why select the largest ?
When your assignment is to select a new enterprise system, life could be easy for you. Select a product or service from the largest business and your career is saved. Nobody gets blamed for selecting the largest vendor, although if you work for a small mid-sized company, you might think twice.
Many vendors and implementers start their message with:
“…. Market leader in ABC, thought leader in XYZ, recognized by 123”
The only thing you should learn from this message is that this company probably has delivered a trustworthy solution in the past. Looking at the past you get an impression of its readiness and robustness for the future. Many promising companies have been absorbed by the larger ones and disappeared. As Clayton Christensen wrote in The Innovators Dilemma:
“What goes up does not go down”.
Meaning these large companies focus on their largest clients and will focus less on the base of the business pyramid (where the majority is), making them vulnerable to disruptive innovation.
Related to this issue there is an interesting post (and its comments), written by Oleg Shilovitsky recently: How many PLM vendors disappear in disruption predicted by Gartner.
Still, when selecting a PLM vendor, it is essential to know if they have the scale to support you in the future and the vision to guide you into the future.
The future of PLM is towards managing data in a connected manner, not necessarily coming from a single database, and not necessarily using only structured data. If your PLM vendor or implementer is pushing you to realize document and file management, they are years behind and not the best choice for your future.
PLM is a big elephant
PLM is considered a big elephant, and I agree, if you address in one shot everything PLM can do. PLM has multiple directions to start from – I wrote about it: PLM at risk – it does not have a single job
PLM has a huge advantage compared to a transactional system like ERP, and probably CRM. You can implement a PLM infrastructure and its functionality step by step in the organization, starting with areas that are essential and produce clear benefits for the organization. That is the main reason PLM implementations can take 2 – 3 years. You give the organization time to learn, to adapt and to extend.
We lose our flexibility ?
Nobody in an organization likes to be pushed into a corporate way of working, which by definition is not as enjoyable and flexible as the way you currently work. It is still an area where PLM implementations can improve: provide the user with an environment that does not feel rigid. You have seen this problem with old, traditional, large PLM implementations, for example with automotive OEMs. For them, it is almost impossible to switch to a new PLM implementation, as everything has been built and connected in such a proprietary way that it is almost impossible to move to more standard systems and technologies. Later PLM implementations should learn from these lessons.
PLM vendor A says PLM vendor B will be out of business
One of the things I personally dislike is FUD (Fear, Uncertainty and Doubt). It has become a common practice in politics and I have seen PLM vendors and implementers using the same tactics. The problem with FUD is that it works. Even if the message is not verifiable, the company looking for a PLM system might think there must be some truth in this statement.
My recommendation to a company that gets involved in FUD during a PLM selection process: be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain why they are the perfect solution; instead they tell you indirectly "we are the least bad".
Is the future in the cloud ?
I think there are two different worlds. There is the world of smaller businesses that do not want to invest in an IT infrastructure and will try anything that looks promising – often tools-oriented. This is one of my generalizations of how US businesses work – sorry for that. They will start working with cloud-based systems and not be scared off by performance, scalability and security, as long as everything is easy and does not disturb the business too much.
Larger organizations, especially those domiciled in Europe, are not embracing cloud solutions at this moment. They think more in terms of private or on-premise environments, less in cloud solutions, as security of information is still an issue. The NSA revelations prove that there is no moral limit for information in the name of security; combined with the fear of IP theft from Asia, I think European companies have a natural resistance to storing data outside their control.
For sure you will see cloud advocates, primarily coming from the US, claiming this is the future (and they are right), but there is still work to do and confidence to be built.
PLM selection often focuses on checking hundreds of requirements coming from different departments. They want a dream system. I hope this post will convince you that there are so many other considerations relevant to a PLM selection that you should take into account. And yes, you still need requirements (and a vision).
Your thoughts ?
Who does not remember this tagline from the first official Soap series starting in 1977 and released in the Netherlands in 1979?
Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, aliens’ abduction, criminality, homosexuality and more.
The episode always ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.
For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.
What has it to do with PLM?
Soap has to do with strange people doing weird things (I do not want to be more specific). Recently I noticed that this is happening even in the PLM bloggers' world. Two of my favorite blogs demonstrated something of this weird behavior.
First, Steve Ammann in his Zero Wait-State blog post, A PLM junkie at sea: point-solutions versus comprehensive, mentioned sailing from Ventura, CA to Cabo San Lucas, Mexico on a 35-foot sailboat and starting to think about PLM during his night shift. My favorite quote:
Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information. Yes only a PLM “junkie” would think about PLM on a sailing trip and maybe this is why the Mexican coast guard was suspicious.
Second Oleg in his doomsday blog post: The End of PLM Communism, was thinking about PLM all the weekend. My favorite quote:
I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.
It demonstrates that the best thinking is done outside office hours and in casual locations. Knowing this from my long weekend cycling tours, I know it is true.
I must confess that I have PLM thoughts during cycling.
Perhaps the best thinking happens outside an office?
I leave the follow up on this observation to my favorite Dutch psychologist Diederik Stapel, who apparently is out of office too.
Both posts touch the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post. He believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance in making the decision for either a point solution or an integrated solution.
And I am aligned with what Steve is proposing.
Oleg is coming from a different background and in his current position he believes more in a distributed or network approach. He looks at PLM vendors/implementations and their centralized approach through the eyes of someone who knows the former Soviet Union way of thinking: “Centralize and control”.
The association with communism was probably not the best choice, as you can read in the comments. This association makes you think: as the former Soviet Union does not exist anymore, what about former PLM implementations and the future? According to Oleg, PLM implementations should focus more on distributed systems (on the cloud?), working and interacting together, connecting data and processes.
And I am aligned with what Oleg is proposing.
Confused? You won't be after reading my recent experience.
I have been involved in the discussion around the best possible solution for an EPC (Engineering Procurement Construction) contractor in the Oil & Gas industry. The characteristics of their business are different from standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected because of their knowledge, their price, their price, their price, quality and time to deliver.
This means an EPC contractor focuses on execution, making sure each discipline has the best tools, and this is the way they are organized and used to working. The downside of this approach is that everyone is working on their own island, and there is no knowledge capitalization or sharing of information. As a result, each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people ….. and they retire at a certain moment.
So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard. This standard is relatively mature and can serve as the neutral exchange standard for information between disciplines. The ideal world of best-in-class tools communicating with each other, or not?
Imagine there are 6 discipline tools, among them an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages, and finally a knowledge environment, capitalizing lessons learned, standards and best practices.
This results in 6 tools and 12 interfaces to a common service bus connecting them: 12 interfaces, as information needs to be sent to and received from the service bus per application. Each tool will hold redundant data for its own execution.
What happens if a PLM provider could offer three of these tools on a common platform? This would result in 4 tools to install and only 8 interfaces. The functionality in the common PLM system does not require data redundancy but shares common information, and will therefore provide better performance in cross-discipline scenarios.
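The interface arithmetic can be sketched in a few lines of Python. The bus numbers are the ones from the example above (each tool needs one send and one receive interface to the service bus); the point-to-point count is my own addition, to show what the service bus saves you from in the first place:

```python
def bus_interfaces(tool_count: int) -> int:
    """Each tool needs one send and one receive interface to the service bus."""
    return 2 * tool_count

def point_to_point(tool_count: int) -> int:
    """Without a bus: every tool sends directly to every other tool."""
    return tool_count * (tool_count - 1)

print(bus_interfaces(6))   # 6 best-of-breed tools -> 12 bus interfaces
print(bus_interfaces(4))   # 3 tools merged on one platform -> 4 tools, 8 interfaces
print(point_to_point(6))   # direct integration of 6 tools -> 30 interfaces
```

Consolidating tools reduces the interface count linearly, while every interface avoided is also redundant data and mapping logic avoided.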
In the ultimate world, all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100 % optimized system for a group of independent companies working together. Suppliers will not give up their environment and own IP to embed it in a customer´s ideal environment. So there is always a compromise to be found between the best integrated platform (optimal performance, reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).
And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM, you might even reach 100 % coverage for your own purposes, although modern society is more about connecting information where possible.
MY CONCLUSION after reading both posts:
- Oleg tries to provoke, and like a soap, you might end up confused after each episode.
- Steve in his post gives common-sense guidance, useful if you spend time digesting it – not a soap.
Now I hope you are no longer confused, and I wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:
- Will Aras survive 21-12-2012 and support the Next generation ?
- Will Autodesk get off the cloud or have a coming out ?
- Will Dassault get more Experienced ?
- Will Oracle PLM customers understand it is not a database ?
- Will PTC get out of the CAD jail and receive $ 200 ?
- Will SAP PLM be really 3D and user friendly ?
- Will Siemens PLM become a DIN or ISO standard ?
See the next episodes of my PLM blog in 2013
It is interesting to read management books and articles and reflect on their content in the context of PLM. In my previous post, How the brain blocks PLM acceptance, and in Stephen Porter´s (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.
This summer holiday I was reading The Innovator´s Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is an associate professor at the Harvard Business School, and he published this book back in 1997. Apparently not everyone has read the book, and I recommend that you read it if you are involved in the management of a PLM company.
Christensen states there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing a lot in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till ….
The other technology Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope and is no way to serve the customer in the same manner. The technology underperforms if you apply it to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than that of the sustaining technology, their paths may meet at a certain point. And although it comes from a different set of capabilities, due to its faster improvement the disruptive technology becomes the leading one, and the companies that introduced it become the new market leaders.
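The two crossing improvement paths can be illustrated with a toy calculation. The growth rates below are invented for illustration, not taken from Christensen's book: a sustaining technology starts with higher performance but improves slowly, a disruptive one starts lower but improves faster, and the loop finds when the paths cross.

```python
def years_until_overtake(sustain_start, sustain_rate, disrupt_start, disrupt_rate):
    """Return the number of years until the faster-improving disruptive path
    passes the sustaining path (assumes disrupt_rate > sustain_rate)."""
    year = 0
    sustaining, disruptive = sustain_start, disrupt_start
    while disruptive <= sustaining:
        sustaining *= 1 + sustain_rate   # e.g. 5 % improvement per year
        disruptive *= 1 + disrupt_rate   # e.g. 25 % improvement per year
        year += 1
    return year

# Sustaining tech at performance 100 improving 5 %/year versus
# disruptive tech at 40 improving 25 %/year: crossed after 6 years.
print(years_until_overtake(100, 0.05, 40, 0.25))
```

The point of the toy model is that even a large head start evaporates quickly once the improvement rate differs, which is exactly why market leaders are overtaken before they react.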
Why leading companies failed
Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following its dynamics. Later he illustrates the concepts with examples from other industries, where leading firms failed and ceased to exist because disruptive technologies overtook them and they were not able to follow that path.
Although the leading companies have enough resources and skills, he illustrates that it is a kind of logical path: big companies will always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as they target a different, unclear market in the beginning; in addition, it is not clear where the value from a disruptive technology will come from, so which manager wants to risk his or her career in an existing company to focus on something uncertain?
Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major share position in it. Let this company focus on its disruptive technology, and in case it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost certain to fail. Quote:
Expecting achievement-driven employees in a large organization to devote critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one´s arms in an effort to fly
As the book was written in 1997, it was not in the context of PLM. Now let´s start with some questions.
Is ERP in the stage of sustaining technology?
Here I would say yes. ERP vendors are extending their functional reach to cover more than the core functionality for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies Christensen identifies four stages: customers first select a product for functionality; once other vendors offer the same functionality, reliability becomes the main differentiator; after reliability comes convenience, and finally price.
From my personal observations (not through research), I would assume that for the major vendors ERP is in the phase between convenience and price. If we follow Christensen's analysis, for SAP and Oracle this means they should not try to develop disruptive technologies inside their organization, nor should they try to downscale their product for the mid-market or add a different business model. Quote:
What goes up – does not go down. Moving to a high-end market is possible (and usually the target) – they will not go to small, poorly defined low-end markets
How long SAP and Oracle will remain market leaders depends on disruptive technologies that will meet the path of the ERP vendors and generate a new wave. I am not aware of any trends in that area, as I am not following the world of ERP closely.
Is PLM in the stage of sustaining technology?
Here I would say no, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions on what a PLM system should provide as core technology. This makes it hard to measure PLM along the lifecycle of a sustaining technology with its phases: functionality, reliability, convenience and price.
While the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability and convenience, others are focusing on convenience and price.
Some generalized thoughts passed my mind:
- DS and PTC somehow provoke their customers by launching new directions from which they believe the customer will benefit. This makes it hard to call their offering sustaining technology.
- Siemens claims to develop its products based on what customers are asking for. According to Christensen they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
- All three focus on the high-end and should not aim for smaller markets with the same technology. This justifies within DS the existence of both CATIA and SolidWorks, and within Siemens the existence of both NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up for others.
Disruptive technologies for PLM
Although PLM is not a sustaining technology in my opinion, there are some disruptive technologies that might come into the picture of mainstream PLM.
First of all there is the Open Source software model, introduced by Aras, which initially is not considered a serious threat by the classical PLM players – “big customers will never rely on open source”. However, the Open Source model allows product improvements to move faster than the mainstream, reaching at a certain point the same level of functionality, reliability and convenience. The risk for Open Source PLM is that it is customer driven, which according to Christensen is the major inhibitor of disruptive steps in the future.
Next there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk operates from a comfort zone: it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and mid-size market. There are not too many barriers to entering the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM a bad label for its cloud-based product? Time will tell.
The management of Kenesto has obviously read the book. Although the initial concept came from PLM++ (a bad marketing name), they do not try to compete with mainstream PLM and aim their product at a different audience – business process automation. If their product then picks up in the engineering / product domain, it might enter the PLM domain in a disruptive manner (all according to the book – they will become market leaders).
Finally there are Search Based Applications, which are also a disruptive technology for the PLM domain. Many companies struggle with the structured data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; the validation and formalization is often done in the brains of experienced employees. Why can't search-based technology be used to create structured data and replace or support the experienced brain?
If I open my Facebook page, I see new content related to where I am and to what I have been saying or searching for. Imagine an employee's desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from the structured system in the background; other data might be displayed based on logical search criteria – the way our brain works. Some startups are working in this direction, and Inforbix (congratulations Oleg & team) has already been acquired by Autodesk, as Exalead was by DS.
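To illustrate the idea of such a context-aware desktop, here is a minimal sketch that combines records from a structured system with hits from unstructured documents. All identifiers, fields and data here are hypothetical illustrations, not the API of any real product.

```python
# A minimal sketch of a context-aware "employee desktop": given the item you
# are working on, surface related structured records and related documents.
# All data, names and fields are hypothetical.

structured_items = [
    {"id": "PUMP-001", "type": "physical", "status": "released"},
    {"id": "REQ-017", "type": "requirement", "status": "draft"},
]

documents = {
    "maintenance_note.txt": "Replaced seal on PUMP-001 during inspection",
    "meeting_minutes.txt": "REQ-017 needs review before the next release",
}

def related_information(context_id):
    """Return structured records and documents related to the current context."""
    records = [item for item in structured_items if item["id"] == context_id]
    hits = [name for name, text in documents.items() if context_id in text]
    return {"records": records, "documents": hits}

print(related_information("PUMP-001"))
```

A real search-based application would of course use a full-text index and ranking instead of this naive substring match, but the principle is the same: the context you work in drives which information is pushed to you.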
For both companies, if they believe in the above concept, they should remain independent from the big parent company as long as possible, because according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.
This blog post was written during a relaxing holiday in Greece. The country is in a crisis; it needs disruptive politicians. They had them 3500 years ago, and I noticed the environment is perfect for thinking, as you can see below.
Meanwhile I am looking forward to your thoughts on PLM: in which state are we, and what are the disruptive technologies?
Sorry for the provocative title in a PLM blog, but otherwise you would not read my post to the end.
In the past months I have been working closely with several large companies (not having a mid-market profile). And although they were all in different industries and had different business strategies, they shared these common questions and remarks:
- How to handle more and more digital data and use it as valuable information inside the company or for their customers / consumers?
- What to do with legacy data (approved in the previous century) and legacy people (matured and graduated in the previous century) preventing them from changing?
- We are dreaming of a new future where information is always up-to-date and easy to access – will this ever happen?
- They are in the automotive industry, manufacturing industry, infrastructure development and maintenance, plant engineering, construction and plant maintenance
- They all want data to be managed with (almost) zero effort
- And please, no revolution or change for the company
Although I have been focusing on the mid-market, it is the bigger enterprises that introduce new trends, and as you can see from the observations above, there is a need for change. But the demands also look contradictory to each other.
I believe it is just about changing the game.
If you look at the picture to the left, you see one of the contradictions that lead to PLM.
Increasing product quality, reducing time to market and meanwhile reducing costs seemed to be a contradiction at that time too.
Although PLM has not been implemented (yet) in every company that could benefit from it, it looks like the bigger enterprises are looking for more.
The L from PLM remains – they still want to connect all information related to the lifecycle of their products or plants.
The M from Management has a bad association: companies believe that moving from their current state towards a managed data environment is a burden. Too much overhead is the excuse not to manage data, and their existing data management environments do not excel in user-friendliness. Therefore people jump towards using Excel.
So if the P is no longer relevant and the M is a burden, what remains of PLM?
Early June I presented at the Dassault Systèmes 3DExperience forum on the topic of digital Asset Lifecycle Management for owners / operators, one of the areas where I believe PLM systems can contribute a lot to increased business value and profitability (quality and revenue – see using a PLM system for Asset Lifecycle Management).
Attending the keynote speech, it was clear that Dassault Systèmes no longer talks about PLM as its vision. Their future dream is a (3D) lifelike experience of the virtual world, and based on that virtual model, implementing the best solution according to various parameters: revenue, sustainability, safety and more. By managing the virtual world you have the option to avoid costly real prototypes or damaging mistakes.
I believe it is an ambitious dream but it fits in the above observations. There is more beyond PLM.
In addition I learned from talking with my peers (the corridor meetings) that Siemens and PTC are also moving towards a more industry- or process-oriented approach, trying to avoid the association with the generic PLM label.
Just at the time that Autodesk and the mid-market started to endorse PLM, the big three are moving away from this acronym.
This reminds me of what happened in the eighties when 3D CAD was introduced. By the time the mid-market was able to move to mainstream 3D (the price / performance ratio had changed dramatically), the major enterprises started to focus on PDM and PLM. So it is logical that the mid-market is 10 – 15 years behind new developments – they cannot afford experiments with new trends.
- The management of structured and unstructured data in a single platform. We see the rise of Search Based Applications and business intelligence based on search and semantic algorithms. Using these capabilities integrated with a structured (PLM?) environment is the next big thing.
- Apps instead of generic applications that support many roles. The generic applications introduce so much complexity in the interface that they become hard to use for a casual user. Most enterprise systems, but also advanced CAD or simulation tools with thousands of options, suffer from this complexity. Wouldn't it be nice if you only had to work with a few dedicated apps, as we do in our private lives?
- Dashboards (BI) that can be created on the fly, representing actual data and trends based on structured and unstructured data.
It reminded me of a PLM / ERP discussion I had with a company, where the general manager kept stating which types of dashboards he wanted to see. He did not talk about PLM, ERP or other systems – he wanted the on-line visibility.
- Cloud services are coming. Not necessarily centralizing all data in the cloud to reduce IT cost, but look at SIRE and other cloud services that support a user with data and remote processing power at the moment it is required.
- Visual navigation through a lightweight 3D model that provides information when required. This trend is not so recent, but so far it is not integrated with other disciplines – the Google Maps approach for 3D.
So how likely are these trends to change enterprise systems like PLM, ERP or CRM? In the table below I indicate where they could apply:
As you can see, the PLM row has all the reasons to introduce new technologies and change the paradigm. For that reason, combined with the observations I mentioned in the beginning, I am sure a new TLA (Three Letter Acronym) is upcoming.
The good news is that PLM is dynamic and on the move. The bad news for potential PLM users is that the confusion remains – there are currently too many different PLM definitions and approaches. So what will be the next thing after PLM?
Conclusion: The acronym PLM is not dead and is becoming mainstream. On the high-end there is for sure a trend towards a wider and different perspective on what was initially called PLM. After EDM, TDM, PDM and PLM, we are waiting for the next TLA.
The trigger for this post was a discussion I had around the Autodesk 360 cloud-based PLM solution. To position this solution and simplify the message for my conversation partner, Joe the plumber, I told him: “You can compare the solution with Excel on-line. As many small mid-market companies are running around with metadata (no CAD files) in Excel, the simplified game changer of this cloud-based PLM offering is that the metadata is now in the cloud, much easier to access, and only a single version exists.”
(Sorry to Autodesk if I simplified it too much, but sometimes your conversation partner does not have an IT background, as they are plumbers.)
He was right, and I had to go more in-depth to explain the difference. This part of the conversation was similar to discussions I had in some meetings with owners / operators in the civil and energy sector, discussing the benefits of PLM practices for their industry.
I wrote about this in previous posts:
The trouble with dumb documents
Here it was even more a key point of the discussion that most of the legacy data is stored in dumb documents. The main reason dumb documents are used is that the data needs to be available during the long lifecycle of the plant, application-independent if possible. So in the previous century this was paper, later scanned documents (TIFF – PDF) and currently mainly PDF. Most of the data is now digital, but where is the intelligence?
The challenge these companies have is that, although the information is now stored in a digital file, the next step is how to deal with that information in an intelligent manner. A document or an Excel file is a collection of information – you might call it knowledge – but to get access to the knowledge you need to find it.
Have you ever tried to find a specific document in Google Docs or SharePoint? The conclusion will be that the file name becomes very important, and perhaps some keywords?
Is search the solution ?
To overcome this problem, full-text search and search based applications were developed, which allow us to index and search inside the documents. A piece of cake for Google, and a niche for others to index not only standard documents but also more technical data (drawings, scans from P&IDs, etc.).
Does this solve the problem ?
Partly, as suddenly the user finds a lot more data. Search on Google for the words “Right data” and you get 3.760.000.000 hits (or more). But what is the right data? The user can only decide what the right data is by understanding the context.
- Is it the latest version?
- Is it reflecting the change we made at that functional position?
- What has changed?
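To make the context problem concrete, here is a minimal sketch of how structured metadata can narrow a pile of full-text search hits down to the “right data” – in this case, only the latest released revision of each document. The document names, fields and values are hypothetical.

```python
# Full-text search returns many hits; structured metadata (revision, status)
# narrows them to the "right data": the latest released revision per document.
# All document names and attributes are hypothetical.

search_hits = [
    {"doc": "P&ID-4711", "revision": 2, "status": "superseded"},
    {"doc": "P&ID-4711", "revision": 3, "status": "released"},
    {"doc": "layout-plan", "revision": 1, "status": "released"},
]

def right_data(hits):
    """Keep, per document, only the highest released revision."""
    latest = {}
    for hit in hits:
        if hit["status"] != "released":
            continue  # skip superseded or draft revisions
        key = hit["doc"]
        if key not in latest or hit["revision"] > latest[key]["revision"]:
            latest[key] = hit
    return sorted(latest.values(), key=lambda h: h["doc"])

for hit in right_data(search_hits):
    print(hit["doc"], "rev", hit["revision"])
```

Without the revision and status attributes, the user would have to open every hit and judge it by hand; with them, the filtering can be automated.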
And here comes the need for more intelligent data. And this is typically where a PLM system provides the answer.
A PLM system is able to manage different types of information, not only documents. In the context of a plant or a building, the PLM system would also contain:
- a functional definition / structure (linked to its requirements)
- a logical definition / structure (how is it supposed to be?)
- a physical definition / structure (what is physically there?)
- a location definition / structure (where is it in the plant / building?)
All of this is version-managed and related to the supporting documents and other types of information. This brings context to the documents and therefore exposes knowledge.
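A minimal sketch of these four linked, version-managed structures might look as follows. The identifiers, attributes and relations are hypothetical illustrations of the concept, not the data model of any specific PLM system.

```python
# Sketch of the four structures a PLM system could manage for a plant, each
# node version-managed and related to its documents. All names hypothetical.

from dataclasses import dataclass, field

@dataclass
class Node:
    id: str
    version: int
    documents: list = field(default_factory=list)

# One functional position, its logical and physical realization, and location
functional = Node("F-COOLING-01", version=2, documents=["req-spec.pdf"])
logical = Node("L-PUMP-CIRCUIT", version=1, documents=["p&id.pdf"])
physical = Node("P-PUMP-SN-998", version=3, documents=["test-cert.pdf"])
location = Node("LOC-BLDG2-RM14", version=1)

# Relations between the structures are what give each document its context
relations = [
    (functional, logical),   # what it should do -> how it is realized
    (logical, physical),     # design -> what is physically installed
    (physical, location),    # asset -> where it is in the plant
]

def context_of(document):
    """Return (id, version) of every node referencing a document."""
    nodes = [n for n in (functional, logical, physical, location)
             if document in n.documents]
    return [(n.id, n.version) for n in nodes]

print(context_of("p&id.pdf"))  # -> [('L-PUMP-CIRCUIT', 1)]
```

Because each document hangs off a versioned node in a structure, finding a document immediately tells you which function, design, asset or location it belongs to; a dumb file share cannot answer that question.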
As there is no automatic switch from dumb documents towards intelligent data, it will be a gradual process to move towards this vision. I see a major role for search based applications in supporting data discovery: find a lot of information, but then have the capability to capture the result (or generate a digest of it) and store it connected to your PLM system, where it is managed in the future and provides the context.
Conclusion: We understand that paper documents are a thing of the past. But moving these documents to digital files stored in a central location, either in SharePoint or in cloud-based storage, is a step we will regret ten years from now, as intelligence lies not only inside the digital files but also depends on their context.