You are currently browsing the category archive for the ‘PLM’ category.
In my earlier post The weekend after PDT Europe I wrote about the first day of this interesting conference. We ended that day with some food for thought related to a bimodal PLM approach. Now I will take you through the highlights of day 2.
Interoperability and openness in the air (aerospace)
I believe Airbus and Boeing are two of the most challenged companies when it comes to PLM. They have to cope with their stakeholders and the massive number of suppliers involved, constrained by a strong focus on safety and quality. And as airplanes have a long lifetime, the need to keep data accessible and available for over 75 years is a massive challenge. The morning was opened by presentations from Anders Romare (Airbus) and Brian Chiesi (Boeing), who confirmed they could switch the presenter's role between them, as the situations at Airbus and Boeing are so alike.
Anders Romare started with a presentation called Digital Transformation through an e2e PLM backbone, where he explained the concept of extracting data from the various silo systems in the company (CRM, PLM, MES, ERP) to make data available across the enterprise. In particular, in their business transformation towards digital capabilities, Airbus needed and created a new architecture on top of the existing business systems, focusing on data ("Data is the new oil").
To achieve a data-driven environment, Airbus extracts and normalizes data from its business systems and provides a data lake with integrated data, on top of which various apps can run to offer digital services to existing and new stakeholders on any type of device. The data-driven environment allows people to have information in context, available almost in real-time, to make the right decisions. Currently, these apps run on top of this data layer.
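To make the idea of extracting and normalizing silo data concrete, here is a minimal sketch in Python. All system names, field names, and mappings below are hypothetical illustrations of the pattern, not Airbus's actual implementation:

```python
# Hypothetical sketch: normalize records from silo systems (PLM, ERP, CRM)
# into one common schema, so apps can query a single data layer.
# All field names and mappings here are illustrative assumptions.

def normalize(source: str, record: dict) -> dict:
    """Map a source-specific record onto a common schema."""
    mappings = {
        "PLM": {"item_id": "part_number", "rev": "revision", "descr": "description"},
        "ERP": {"matnr": "part_number", "version": "revision", "maktx": "description"},
        "CRM": {"product_ref": "part_number", "rev_label": "revision", "name": "description"},
    }
    mapping = mappings[source]
    normalized = {target: record.get(src) for src, target in mapping.items()}
    normalized["source_system"] = source  # keep provenance for traceability
    return normalized

# "Data lake" here is just a list of normalized records.
data_lake = []
data_lake.append(normalize("PLM", {"item_id": "A-100", "rev": "B", "descr": "Bracket"}))
data_lake.append(normalize("ERP", {"matnr": "A-100", "version": "B", "maktx": "Bracket"}))

# Any app can now query one schema, regardless of the originating silo.
matches = [r for r in data_lake if r["part_number"] == "A-100"]
print(len(matches))  # -> 2 (the same part, seen from two systems)
```

The essential point of the pattern is that the apps never need to know the proprietary formats of the underlying silos; they only see the common schema, plus provenance for traceability.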
Now imagine information captured by these apps could be stored or directed back into the original architecture supporting the standard processes. This would be a real example of the bimodal approach as discussed on day 1. As a closing remark, Anders also stated that three years ago digital transformation was not really visible at Airbus; now it is a must.
Next, Brian Chiesi from Boeing talked about Data Standards: A strategic lever for Boeing Commercial Airplanes. Brian talked about the complex landscape at Boeing: 2,500 applications / 5,000 servers / 900 changes annually (3 per day), impacting 40,000 users. There is a lot of data replication because many systems need their own proprietary format. Brian estimated that where 12 copies exist now, in the ideal world 2 or 3 would do. Brian presented a similar future concept to Airbus's, where the traditional business systems (Systems Engineering, PLM, MRP, ERP, MES) are all connected through a service backbone. This new architecture is needed to address modern technology capabilities (social / mobile / analytics / cloud / IoT / automation / …).
An interesting part of this architecture is that Boeing aims to exchange data with the outside world (customers / regulatory / supply chain / analytics / manufacturing) through industry-standard interfaces to have an optimal flow of information. Standardization would lead to a reduction of customized applications, minimize the costs of integration and migration, break the obsolescence cycle and enable future technologies. Brian noted that companies need to pull for standards; vendors will then deliver. Boeing will be pushing for standards in its contracts and will actively work together with five major Aerospace & Defense companies to define required PLM capabilities and have a unified voice towards PLM solution providers.
My conclusion on these two aerospace giants is that they both express the need to adapt and move to modern digital businesses, no longer following the linear approach of the classic airplane programs. Incremental innovation in various domains is the future. The existing systems need to be there to support their current fleets for many, many years to come. The new data-driven layer needs to be connected through normalization and standardization of data. For the future, a focus on standards is a must.
Simon Floyd from Microsoft talked about The Impact of Digital Transformation in the Manufacturing Enterprise, taking us through digital transformation, IoT, and analytics in the product lifecycle, clarified by examples from the Rolls-Royce turbine engine. A good and compelling story which could be used by any vendor explaining digital transformation and its relation to IoT. Next, Simon walked through the Microsoft portfolio and solution components that support a modern digital enterprise based on various platform services. At the end, Simon articulated how, for example, ShareAspace, based on Microsoft infrastructure and technology, can be an interface between various PLM environments through the product lifecycle.
Simon's presentation was followed by a panel discussion on the theme: When are history and legacy an asset and a barrier to entry, and when do they become a burden and an invitation to future competitors?
Marc Halpern (Gartner) mentioned here again the bimodal thinking. Aras is bimodal. The classical PLM vendors running in mode 1 will not change radically, and the new vendors, the mode 2 types, will need time to create credibility. Other companies mentioned here, PropelPLM (PLM on the Salesforce platform) and Onshape, will battle the next five years to become significant and might disrupt.
Simon Floyd (Microsoft) mentioned that in order to keep innovation within Microsoft, they allow for startups within the company, with no constraints from Microsoft in the beginning. This keeps disruption inside your company instead of being disrupted from the outside. Another point mentioned was that Tesla did not want to wait until COTS software would be available for their product development and support platform. Therefore they develop parts of it themselves. Are we going back to the early days of IT?
An interesting trend, I believe, provided the building blocks for such a solution architecture are based on open (standardized?) services.
After lunch, the conference split into three streams, where I participated in the "Creating and managing information quality" stream. As I discussed in my presentation on day 1, there is a need for accurate data, starting as soon as possible, as the future of our businesses will run on data, as we learned from all speakers (and this is not a secret – still many companies do not act).
In the context of data quality, Jean Brange from Boost presented the ISO 8000 framework for data and information quality management. This standard is under development and will help companies address their digital needs. The challenge of data quality is that we need to store data with the right syntax and semantics to be usable, and in addition, it needs to be pragmatic: what are we going to store that will have value? And then there is the challenge of evaluating the content. Empty fields can be discovered; however, how do you qualify the quality of a field that does have a value? The ISO 8000 framework is a framework, like ISO 9000 (product quality), that allows companies to work in a methodological way towards acceptable and needed data quality.
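The two levels of that evaluation challenge can be sketched in a few lines of Python: detecting an empty field is trivial, while judging a filled field requires explicit rules about syntax and semantics. The rules below are purely illustrative assumptions, not part of ISO 8000:

```python
import re

# Hypothetical data-quality rules: a syntax check (regex pattern)
# and a semantics check (set of allowed values). Both are illustrative.
RULES = {
    "part_number": {"syntax": re.compile(r"^[A-Z]-\d{3}$")},
    "unit": {"semantics": {"mm", "kg", "pcs"}},
}

def check_record(record: dict) -> list:
    """Return a list of quality issues found in one record."""
    issues = []
    for field, rule in RULES.items():
        value = record.get(field)
        if value in (None, ""):                   # easy: field is empty
            issues.append(f"{field}: missing")
            continue
        pattern = rule.get("syntax")
        if pattern and not pattern.match(value):  # harder: wrong syntax
            issues.append(f"{field}: bad syntax '{value}'")
        allowed = rule.get("semantics")
        if allowed and value not in allowed:      # harder: wrong meaning
            issues.append(f"{field}: unexpected value '{value}'")
    return issues

print(check_record({"part_number": "A-100", "unit": "mm"}))  # -> []
print(check_record({"part_number": "a100", "unit": ""}))     # -> two issues
```

Even this toy version shows why pragmatics matter: the rules only help for the fields a company has decided are worth governing.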
Magnus Färneland from Eurostep addressed the topic of data quality and the foundation for automation, based on the latest developments done by Eurostep on top of their already rich PLCS data model. The PLCS data model is an impressive model, as it already supports all facets of the product lifecycle from design through development and operations. By introducing soft typing, Eurostep allows a more detailed tuning of the data model to ensure configuration management. At which stage of the lifecycle is certain information required (and does it become mandatory)? Consistent data quality is enforced through business-process logic.
The conference ended with Marc Halpern making a plea to Take Control of Your Product Data or Lose Control of Your Revenue, where Marc painted the future (horror) scenario that, due to digital transformation, the real "big fish" will be the digital business ecosystem owners, and that once you are locked in with a vendor, these vendors can raise their prices to save their own business without any respect for your company's business model. Marc gave some examples where vendors raised prices under the subscription model by up to 40%. Therefore, even when you are just closing a new agreement with a vendor, you should negotiate a price guarantee and a certain bandwidth for increases. And on top of that, you should prepare an exit strategy: prepare data for migration and have backups using standards. Marc gave examples of billions in extra costs related to data quality issues and data loss. It can hurt! Finally, Marc ended with recommendations for master data management and quality as a needed company strategy.
Gerard Litjens from CIMdata, as closing speaker, gave a very comprehensive overview of The Internet of Things – what does it mean for PLM?, based on CIMdata's vision. As all vendors in this space explain the relation between IoT and PLM differently, it was a good presentation to use as a base for the discussion: how does IoT influence our PLM landscape? Because of the length of this blog post, I will not go further into the details – it is worth obtaining this overview.
Concluding: PDT2016 is a crucial PLM conference for people who are interested in the details of PLM. Other conferences might address high-level customer stories; at PDT2016 it is about the details and sharing the advantages of using standards. Standards are crucial for a data-driven environment, where business platforms with all their constraints will be the future. And I saw that more and more companies are working with standards in a pragmatic manner, observing the benefits and pushing for more data standards – it is not just theory.
See you next year?
I am just back from the annual PDT conference (12th edition), this year hosted in Paris from 9 to 10 November, co-located with CIMdata’s PLM Road Map 2016 for Aerospace & Defense. The PDT conference, organized by EuroStep and CIMdata, is a relatively small conference with a little over a hundred attendees. The attractiveness of this conference is that the group of dedicated participants is very interactive with each other sharing honest opinions and situations, sometimes going very deep into the details, needed to get the full picture. The theme of the conference was: “Investing for the future while managing product data legacy and obsolescence.” Here are some of the impressions from these days, giving you food for thought to join next year.
Setting the scene
Almost traditionally, Peter Bilello (CIMdata) opened the conference, followed by Marc Halpern (Gartner). Their two presentations formed an excellent storyline together.
Peter Bilello started and discussed Issues and Remedies for PLM Obsolescence. It was not the first time Peter addressed PLM obsolescence. It is a topic many early PLM adopters are facing, and in a way the imminent obsolescence of their current environments blocks them from taking advantage of the new technologies and capabilities current PLM vendors offer. Having learned from the past, CIMdata provides a PLM Obsolescence Management model, which should be on every company's agenda, in the same way as data quality (which I will address later). Being proactive about obsolescence can prevent critical situations and high costs. From the obsolescence theme, Peter looked forward to the future and the value product innovation platforms can offer, given the requirement that data should be able to flow through the organization, connecting to other platforms and applications, increasing the demand to adhere to and push for standards.
Marc Halpern followed with his presentation, titled More custom products demand new IT strategies and new PLM applications, where he focused on the new processes and methodology needed for future businesses, with a high focus on customer-specific deliveries, speed, and automation. Automation is always crucial to reducing production costs. In this delivery process, 3D printing could bring benefits, and Marc shared the pluses and minuses of 3D printing. Finally, when automation of a customer-specific order becomes possible, it requires a different IT architecture, as depicted by Marc. After proposing a roadmap for customizable products, Marc shared some examples of ROI benefits reported by successful transformation projects. Impressive!
My summary of these two sessions is that both CIMdata and Gartner confirm the challenges companies face: changing their old legacy processes and PLM environments, which support the past, while moving to more customer-driven processes and modern, data-driven PLM functionality. This is not just an IT or business change; it will also be a cultural change.
JT / STEP AP242 / PLCS
Next, we had three sessions related to standards. Alfred Katzenbach told the success story of JT: the investment made to get this standard approved and performing, based on an active community getting the most out of JT, beyond its initial purpose of viewing and exchanging data in a neutral format. Jean-Yves Delanaunay explained that in Airbus Operations the STEP AP242 definition is used as the core standard for 3D Model-Based Definition (MBD) exchange, part of the STEP standards suite, and as the cornerstone for long-term archiving and retrieval of Aerospace & Defense 3D MBD.
There seems to be some rivalry between JT and STEP AP242 viewing capabilities, which goes beyond my understanding, as I am not an expert in this field. Nigel Shaw ended the morning session by positioning PLCS as a standard for interoperability of information along the whole lifecycle of a product. Having a standardized data model, as Nigel showed, would be a good common approach for PLM vendors to converge towards a more interoperable standard.
My summary on standards is that there is a lot of thinking, evaluation, and testing done by an extensive community of committed people. It will be hard for a company to define a better foundation for a standard in its business domain. Vendors are focusing on performance inside their own technology offering and therefore will never push for standards (unless you use their products as a standard). The force for adhering to standards should come from the user community.
After lunch we had three end-user stories:
- Eric Delaporte (Renault Group) talked about their NewPDM project and the usage of standards, mainly for exchanges. Two interesting observations: Eric talks about New PDM – note the usage of the words New (when does New become regular?) and PDM (not talking about PLM?) – and secondly, as a user of standards, he does not care about the JT/AP242 debate and uses both standards where applicable and performing.
- Sebastien Olivier (France Ministry of Defense) gave a bi-annual update on their PLCS journey, used in two projects, Pencil (a standardized exchange platform and centralized source of logistical information) and MAPS (managing procurement contracts for buying in-service support services), and the status of their S3000L implementation (international procedure for Logistic Support Analysis). A presentation for the real in-crowd of this domain.
- Juha Rautjarvi discussed how efficient use of knowledge for safety and security could be maintained and enhanced through collaboration. Here Juha talked about the Body of Knowledge, which should be available to all stakeholders in the context of security and safety. And like a physical product, this Body of Knowledge goes through a lifecycle, continuously adapting to what potentially arises from the outside world.
My conclusion on this part was that if you are not really into these standards on a day-to-day basis (and I am not), it is hard to pick up all the details. Still, the higher-level thought processes behind these standards allow you to see the benefits and impact of using standards, which is not the same as selecting a tool. It is a strategic choice.
Modular / Bimodal / not sexy?
Jakob Asell from Modular Management gave an overview of how modularity can connect the worlds of sales, engineering, and manufacturing by adding a modular structure as a governing structure on top of the various structures used by each discipline. This product architecture can be used for product planning and provides end-to-end connectivity of information. Modular Management is assisting companies in moving towards this approach.
Next, my presentation, titled The importance of accurate data – Act now!, addressed the switch from classical, linear, document-driven PLM towards a modern, more incremental and data-driven PLM approach. Here I explained the disadvantages of the old evolutionary approach (impossible – too slow / too costly) and an alternative method, inspired by Gartner's bimodal IT approach (read my blog post: Best Practices or Next Practices). No matter which option you choose, correct, high-quality data is the oil for the future, so companies should consider the flow of data a health issue for the future.
The day was closed with a large panel, where the panelists first jumped on the topic of bimodal (bipolar?? / multimodal??), talking about mode 1 (the strategic approach) and mode 2 (the tactical and fast approach, based on Gartner's definition). It was clear that the majority of the panel was in mode 1. This fluently led to a discussion of the usage of standards (and PLM) as not being attractive to the younger generation (not sexy) – and to the conclusion that it takes time to understand the whole picture, to see the valuable benefits a standard can bring, and then to join this enthusiasm.
I realize that this post is already too long according to blogging guidelines. Therefore I will tell more about day 2 of the conference next week, with Airbus going bimodal and more.
Stay tuned for next week!
At this moment I am finalizing my session for PDT2016 where I will talk about the importance of accurate data. Earlier this year I wrote a post about that theme: The importance of accurate data. Act now!
My PDT session will elaborate on this post, with a focus on why and how we can make this change happen in day-to-day business. So if you are interested in a longer story and many more interesting topics to learn and discuss, come to Paris on 9 and 10 November.
Dreaming is free
Recently I found a cartoon on LinkedIn and shared it with my contacts. It illustrates the optimistic view companies have when they are aiming to find the best solution for their business, going through an RFI phase, the RFP phase, and ultimately negotiating the final deal with the PLM solution provider or vendor. See the image below:
All credits to the author – I found this image here
The above cartoon gives a humoristic (and often true) view of the (PLM) sales process. In addition, I want to share a less optimistic view related to PLM implementations after the deal has been closed. Based on the PLM projects I have been coaching in the past, the majority of these projects went into stress mode once the stakeholders involved focused only on the software, the functions and features, and centralizing data. Implementing the software without a business transformation caused a lot of discomfort.
Users started to complain that the system did not allow them to do their day-to-day work in the same way. And they were right! They should have new day-to-day work in the future, with different priorities based on the new PLM infrastructure.
This cultural change (and business change) was often not considered, as the PLM system was implemented from an IT perspective, not a business perspective.
Over time, thanks to a better understanding of PLM and the fact that vendors and implementers have improved their portfolios and implementation skills, classical PLM implementations have become less disruptive.
The reason a classical PLM implementation can be done quickly is that the system most of the time does not change the roles and responsibilities of people. Everyone remains working in his/her own silo. The difference: we store information in a central place so it can be found. And this approach would have worked if the world were not changing.
The digital enterprise transformation.
With the upcoming digitization and globalization of the market, enterprises are forced to adapt their business to become more customer-driven. This has an impact on how PLM needs to be implemented. I wrote about this topic in my post: From a linear world to fast and circular. The modern digital enterprise has new roles and responsibilities and will eliminate roles and responsibilities that can be automated through a data-driven, rule-based approach. Therefore, implementing PLM in a modern approach should be driven by a business transformation, and not the other way around!
In the past two years, I have explained this story at all levels inside various organizations. And nobody disagreed. Redefining the processes and redefining the roles was the priority. And we need a team to help people make this change – these people are change management experts. The benefits diagram from Gartner, as shown below, was well understood, and most companies agreed the ambition should be the top curve – in any case, staying above the red curve.
But often reality relates to the first cartoon. In the majority of the implementations I have seen in the past two years, the company did not want to invest in change management, defining the new processes and new roles first for an optimal flow of information. They spent the entire budget on software and implementation services. With a minimum of staff, the technology was implemented based on existing processes – no change management at all. Disappointing, as short-term thinking destroyed the long-term vision, and the benefits were not as large as they had been dreaming of.
Without changing business processes and cultural change management, the PLM team will fight against the organization, instead of surfing on the wave of new business opportunities and business growth.
If your company is planning to implement modern PLM, which implicitly requires a business transformation, make sure cultural change management is part of your plan and budget. It will bring the real ROI. Depending on your company's legacy, if a business transformation is a mission impossible, it is sometimes easier to start a new business unit with new processes, new roles and potentially new people. Otherwise, the benefits from your PLM implementation will remain (too) low.
I am curious to learn about your experience related to (the lack of) change management – how to include it in the real scope. Your thoughts?
As a reaction to this post, Oleg Shilovitsky wrote a related blog post: PLM and the death spiral of cultural change. See my response below, as it will contribute to the understanding of this post.
Oleg, thanks for contributing to the theme of cultural change. Your post illustrates that my post was not clear enough, or perhaps too short. I do not believe PLM is that difficult because of technology; I would even claim that technology is at the bottom of my list of priorities. That is not to say it is unimportant, but when you are converging with a company on a vision for PLM, you probably already know the kinds of technologies you are going to use.
The highest priority, in my opinion, is currently the business transformation companies need to go through in order to adapt their business and remain relevant in a digital world. The transformation will require companies to implement PLM in a different manner: less silo-oriented, with more focus on value flows starting from the customer.
Working differently means cultural change, and a company needs to allocate time, budget and energy to that. The PLM implementation supports the cultural change; it does not drive the cultural change.
And this is the biggest mistake I have seen everywhere. Management decides to implement a new PLM as the driver for cultural change, instead of as the result of cultural change. And the reason this is done is, most of the time, budget thinking, as cultural change is far more complex and expensive than a PLM implementation.
For the past half-year I have been intensively discussing potential PLM roadmaps with companies of different sizes and different PLM maturity. Some companies are starting their PLM journey after many years of discussing and trying to identify the need and scope; others have an old PLM implementation (actually, most of the time it is cPDM) and discover that business paradigms from the previous century are no longer sufficient for the future.
The main changing paradigms are:
- From a linear, product-driven delivery process towards an individual, customer-focused offering based on products and effective services, quickly adapting to market needs.
- From document-driven processes and systems based on the exchange of electronic files towards data-driven platforms that let information flow in almost real-time through the whole enterprise.
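The second paradigm shift can be illustrated with a toy contrast in Python (the structures below are illustrative assumptions, not any vendor's API): in a document-driven flow, a BOM change made after an export never reaches the receiver's copy, while in a data-driven platform every consumer sees the shared record the moment it changes:

```python
# Toy contrast between document-driven and data-driven information sharing.
# All names and structures are illustrative assumptions.

# Document-driven: a snapshot is exported; later edits are invisible to the copy.
bom = {"A-100": {"revision": "A", "qty": 4}}
exported_file = dict(bom["A-100"])        # "send the Excel file" (a frozen copy)
bom["A-100"]["revision"] = "B"            # engineering changes the BOM afterwards
print(exported_file["revision"])          # -> 'A'  (the receiver works with stale data)

# Data-driven: consumers hold a reference to the shared platform record.
class Platform:
    """A minimal stand-in for a shared, data-driven platform."""
    def __init__(self):
        self.records = {"A-100": {"revision": "A", "qty": 4}}

    def get(self, key):
        return self.records[key]          # the same record for every consumer

platform = Platform()
view = platform.get("A-100")              # manufacturing's live view
platform.get("A-100")["revision"] = "B"   # engineering updates the record
print(view["revision"])                   # -> 'B'  (the change flows immediately)
```

The point is not the code itself but the ownership model: a file exchange freezes information at export time, while a shared data layer makes the latest state visible to everyone.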
Both changes are related and a result of digitization. New practices are under development as organizations learn how to prepare and aim for the future. These new practices currently dominate the agendas of all strategic consultancy firms, as you cannot neglect the trend towards a digital enterprise. And these companies need next practices.
And what about my company?
It is interesting to see that most PLM implementers and vendors promote best practices, based on many years of experience working with customers who contributed to the functionality in their portfolio.
And it is very tempting to make your customer feel comfortable by stating:
“We will implement our (industry) best practices and avoid customization – we have done that before!”
I am sure you have heard this statement before. But what about these best practices as they address the old paradigms from the past?
Do you want to implement the past to support the future?
Starting with PLM? Use Best Practices!
If a company is implementing PLM for the first time and the implementation is bottom-up, you should apply the old, classical PLM approach. My main argument: this company is probably not capable of, or ready for, working in an integrated way. It is not in the company's DNA yet. Sharing data and working in a controlled environment is a big step to take. PLM implementations have often failed at this point because the cultural resistance was too big.
When starting with classical PLM, avoid customization and keep the scope limited. Horizontal implementations (processes across all departments) have more success than starting at engineering and trying to expand from there. An important decision to make at this stage is whether 2D is leading (old) or the 3D model is leading (modern). Some future thoughts: How Model-Based Definition can fix your CAD models. By keeping the scope limited, you can always evolve to the next practices in 5-10 years (if your company is still in business).
Note 1: The remark between parentheses is a little cynical and perhaps incorrect for the timeframe. Still, a company working bottom-up will be challenged to stay competitive in a modern global environment.
Note 2: When writing this post, I was notified of an available eBook titled Putting PLM within reach, written by Jim Brown. Its focus is on cloud-based PLM solutions that require less effort/investment on the IT side and, as a side effect, discourage customization (my opinion) – therefore a good start.
Evolving in PLM – Next Practices
Enterprises that have already had a PDM/PLM system in place for several years should not implement the best practices. They have reached the level where the inhibitors of a monolithic, document-based environment are becoming clear.
They (must) have discovered that changing their product offering or their innovation strategy, now with partners, adds complexity that cannot be supported easily. The good news: when you change your business model and product offering, there is C-level attention. These kinds of changes do not happen bottom-up.
Unfortunately, business changes are often discussed at the execution level of the organization without the understanding that the source of all product or offering data needs to be reorganized too. PLM should be part of that strategic plan – and do not confuse the old PLM with the PLM for the future.
The PLM for the future has to be built upon next practices. These next practices do not exist out of the box. They have to be matured and experienced by leading companies. That is the price you pay for being a leader. Still, being a leader brings market share and profit your company cannot achieve as a follower.
The bimodal approach
As the management of a company, you do not want a disruptive switch from one existing environment to a new environment. Too much risk and too disruptive – people will resist; stress and bad performance everywhere. As the new data-driven approach is still under development (we are learning), the end target is still moving.
Evolving the old PLM system towards the new PLM approach is not recommended. This would be too expensive, slow and cumbersome, and PLM would get a bad reputation, as all the complexity of the past and the future would come together. It is better to start the new PLM on a new business platform with customer-oriented processes for a limited offering and connect it to your legacy PLM.
Over the years, the new PLM will become clearer and grow, while the old PLM will become less and less relevant. Depending on the dynamics of your industry, this might take from a few years to decades.
It must and will be a business-driven learning path towards new best practices.
Best practices and next practices are needed in parallel. Depending on your company's maturity and its current lack of information sharing, you can choose. Consider the bimodal approach to set a realistic time path.
What do you think? Could this simplified way of thinking help your company?
Coming back from holiday (a life without PLM), it is time to pick up blogging again. And like every start, the first step is to take stock of where we are now (with PLM) and where PLM is heading. Of course, it remains an opinion based on the dialogues I had this year.
First, and perhaps this is biased, there is a hype in the LinkedIn groups and blogs that I follow, a kind of enthusiasm coming from Onshape and Oleg Shilovitsky's new company OpenBOM: the hype of cloud services for CAD/data management and BOM management.
Two years ago, I discussed at some PLM conferences that PLM should not necessarily be linked to a single PLM system. The functionality of PLM might be delivered by a collection of services, most likely cloud-based, which together provide support for the product lifecycle. In 2014 I worked with Kimonex, an Israeli startup that developed a first online BOM solution targeting early design collaboration. Their challenge was to find customers who wanted to start this unknown journey. Cloud-based means real-time collaboration, and this is also what Oleg wrote about last week: Real-time collaborative edit is coming to CAD & PLM.
Real-time collaboration is one of the characteristics of a digital enterprise, where, thanks to the fact that information is stored as data, information can flow rapidly through an organization. Data can be combined and used by anyone in the organization in a certain context. This approach removes the barriers between PLM and ERP. In my opinion, there is no barrier between PLM and ERP. The barrier companies create exists because people believe PLM is a system and ERP is a system. This way of (system) thinking comes from the previous century.
So is the future about cloud-based, data-driven services for PLM?
In my opinion, systems are still the biggest inhibitor for modern PLM. Without any statistical analysis, based on my impressions and gut feeling, this is what I see:
- The majority of companies that say they DO PLM actually do PDM. They believe it is PLM because their vendor is a PLM company and they have bought a PLM system. However, in reality, the PLM system is still used as an advanced PDM system by engineering to push (sometimes still manually) information at the end into the well-established ERP system. Check within your company which departments are working in the PLM system – anyone beyond engineering?
- There is a group of companies that have implemented PLM beyond their engineering department, connecting to their suppliers in the sourcing and manufacturing phases. Most of the time the OEM forces its suppliers to deliver data in an old-fashioned way, or sometimes in a more advanced way, integrated in the OEM environment. In this case, the supplier has to work in two systems: their own PDM/PLM environment and the OEM environment. Not efficient, but still the way traditional PLM vendors promote partner / supply chain integration.
This is an area where you might think that a services-based environment like OnShape or OpenBOM could help to connect the supply chain. I think so too. Still, before we reach this stage, there are some hurdles to overcome:
Persistence of data
The current generation of management in companies older than 20 years grew up with the idea that “owning data” is the only way to stay relevant in business. Even open innovation is a sensitive topic. What happens with data your company does not own, because it is in the cloud in an environment you do not own (but share)? As long as companies insist on owning the data, a service-based PLM environment will not work.
A nice compromise at this time is ShareAspace from Eurostep. I wrote about ShareAspace last year when I attended sessions from Volvo Cars (The weekend after PI Munich 2016). ShareAspace was used as middleware to map and connect between two PLM/PDM environments. In this way, persistence of data remains. The ShareAspace data model is based on PLCS, which is a standard in the core industries. And standards are, in my opinion, the second hurdle for a services-based approach.
A standard is often considered overhead; the reason is that often a few vendors dominate the market in a certain domain and therefore become THE standard. Similar to the persistence of data: what is the value of data that you own but can only access through a vendor´s proprietary format?
Good for the short term, but what about the long term? (Most of the time we do not think about the long term and consider interoperability problems as a given.) A services-based PLM environment also requires support for standards, to avoid expensive interfaces and a lack of long-term availability. Check in your company how important standards are when selecting and implementing PLM.
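To make the lock-in point concrete: the same BOM data becomes far more durable when it is exported to a neutral, self-describing format instead of staying in a proprietary database. A minimal sketch in Python; the part numbers and field names are my own illustration, not PLCS or any vendor schema:

```python
import json

# A tiny in-memory BOM, as it might live inside a proprietary PLM system.
# Part numbers and fields are invented for illustration.
bom = {
    "part_number": "A-100",
    "description": "Pump assembly",
    "children": [
        {"part_number": "B-200", "description": "Housing", "quantity": 1},
        {"part_number": "B-300", "description": "Impeller", "quantity": 2},
    ],
}

def export_neutral(assembly: dict) -> str:
    """Serialize a BOM to plain JSON: readable without the source system."""
    return json.dumps(assembly, indent=2, sort_keys=True)

neutral = export_neutral(bom)
restored = json.loads(neutral)  # any tool, today or in 20 years, can read this back
print(restored["children"][1]["quantity"])
```

A real standard like PLCS of course defines far more (lifecycle states, effectivity, relations), but the principle is the same: the data survives the tool that created it.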
There is a nice hype around real-time collaboration through cloud solutions. For many established companies this is not good enough, as there is a lot of history and the urge to own data. Young companies that discover the need for a modern services-based solution might be tempted to build such an environment. For these companies, long-term availability might be a topic to study.
Note: I just realized that if you are interested in persistency and standards, you should attend PDT 2016 on 9 & 10 November in Paris. Another interesting post just published by Lionel Grealou, Single Enterprise BOM: Utopia vs Dystopia, also touches this subject.
Summer holidays are coming up. Time to look back and reflect on what has happened so far. As a strong believer that a more data-driven PLM is required to support modern customer-focused business models, I have tried to explain this message to many individuals around Europe, with mixed success.
Compared to a year ago, the notion of a new PLM approach, digital and data-driven, has been resonating more and more. Two years ago I presented at the Product Innovation conference in Berlin a session with the title: Did you notice PLM is changing? The feedback at that time was that it was a beautiful story, probably happening in the far future. Last year in Düsseldorf (my review here), the digital trend(s) became clearer. And this year in Munich (my review here), people mentioned that upcoming changes were unavoidable, in particular in relation to IoT and how it could drastically change existing business models.
For me, the enjoyable thing about the PI conferences is that they give a snapshot of what people care most about in the context of their product development and, in particular, PLM. When you are busy in day-to-day business, everything seems to move forward slowly. However, looking back, I must admit the pace of change has increased dramatically; it is no longer the same pace as five or ten years ago.
Something is happening, and it happens fast!
And here I want to encourage my readers to step back for a moment from day-to-day business and look around at what is happening, in business and in the world. It is all related!
Jobs in the middle class are disappearing due to automation, and direct connectivity with customers creates new types of businesses. Old jobs will never come back, not even when you close your borders. And this is what worries many societies. This global, connected world has created a new way of doing business, challenging old and traditional businesses (and people) as their models become obsolete.
The primary reaction is to try to shut the discomfort out. Let´s act as if it never happened and just switch back to the good life of the previous century or centuries.
To be honest, it is all about the discomfort this new world brings to us. This new world requires new skills, in particular more personal skills, to develop continuously, learn and adapt for the future. Closing your mind and thoughts to the future by hanging on to the past only brings you further away from the future and creates more discomfort.
Are you talking PLM?
Yes, the previous section was very generic; however, it is also valid for PLM. Modern enterprises are changing the way they are going to do business, and PLM is a crucial part of that total picture. Jeff Immelt, CEO of GE, explains in a discussion with Microsoft´s CEO Satya Nadella what it takes for an organization to be ready for the future. He does not talk about PLM; he talks about the need for people to be different in attitude and responsibilities. It is a business transformation: people first. Have a look here:
And although Jeff does not mention PLM, the changing digital business paradigm will affect all the classical systems: PLM, ERP, CRM. Your PLM vision and plans should anticipate such a business transformation. Implementing PLM now in the same way it has been done for the past 10 years, with the processes from the past in mind, might make your company even more rigid than before. See my recent blog post: The value of PLM at the C-level.
Take this thought into consideration during your holidays. Can you be comfortable in this world by keeping on hanging on to the past, or should you consider an uncomfortable but crucial change to the way your company will remain flexible in future business?
My holiday this year will be in my ultimate comfort zone at the beach. Reading books, no internet, discussing with friends what moves us. Two weeks to recharge the batteries for this exciting, rapidly changing world of business (and PLM). I look forward to coming back with some of my findings in my upcoming blogs.
Getting in and out of your comfort zone happens everywhere. Read this HBR article with a lot of similarities: If You’re Not Outside Your Comfort Zone, You Won’t Learn Anything
See you soon in the PLM (dis)comfort zone
If you have followed my blog in recent years, you might have discovered my passion for a modern, data-driven approach to PLM. (Read about it here: The difference between files and data-driven – a tutorial (3 posts).)
The data-driven approach will be the foundation for product development and innovation in a digital enterprise. Digital enterprises can outperform traditional businesses in such a way that within five to ten years, non-digital businesses will be considered dinosaurs (if they still exist).
In particular, a digital enterprise operates in an agile, iterative way with the customer continuously in focus, where traditional enterprises often work in a more linear way, pushing their products to the market (if the market is still waiting for these commodities).
Read more about this topic here: From a linear world to fast and circular?
When and how to become a digital enterprise?
It is (almost) inevitable that your company will transform into a digital enterprise too at a particular time. Either driven by a vision to remain ahead of the competition, or as a final effort to stay in business, as competing against agile digital competitors is eroding your market share.
One characteristic of a digital enterprise is that all benefits rely on accurate data flowing through the organization and its ecosystem. And it does not matter whether the data resides in a single system/platform, as the major vendors are promoting, or is federated and consumed by the right person in the right role. I am a believer in the latter concept, and I still see current startups trying to create the momentum to achieve such an infrastructure. Have a look at my blog buddy’s company OpenBOM and Oleg’s recent article: The challenges of distributed BOM handover in data-driven manufacturing
No matter what you believe at this stage, the future is about accurate data. I recently bumped into some issues related to actual data again. Some examples:
A change in objectives is needed!
One of the companies I am working with was only focusing on individual outputs, either in their drawings (yes, the 3D model was not leading yet) and/or in their Excels (sounds familiar?). When we started implementing a PLM backbone, it became apparent during the discovery phase that we could not use any advanced search tools for quick wins by aggregating data for a better understanding of the information we discovered. Drawings and models did not contain any (file) properties. Therefore, the only way to understand the information was by knowing its (file) name and potentially its directory. Of course, the same file could be in multiple directories, and as there were almost no properties, how do you know what belongs to which item?
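The situation above can be made visible with a simple audit. The sketch below assumes the file properties have already been extracted into plain dictionaries (reading them from real CAD formats needs vendor APIs); the file names and required properties are invented examples:

```python
# Hypothetical extract of file properties, as pulled from a directory scan.
# In the case described above, most property dicts would be empty.
files = {
    "pump_housing_rev3.dwg": {"part_number": "B-200", "project": "P-17"},
    "final_version_NEW.dwg": {},            # no properties: the name is the only clue
    "impeller.dwg": {"part_number": "B-300"},
}

REQUIRED = {"part_number", "project"}

def audit(props_by_file: dict) -> dict:
    """Report which required properties are missing, per file."""
    return {
        name: sorted(REQUIRED - props.keys())
        for name, props in props_by_file.items()
        if REQUIRED - props.keys()
    }

print(audit(files))
```

Running such a report early in a discovery phase quickly shows management how much of the "data" is actually only recoverable through tribal knowledge of file names.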
When discussing the future of PLM with such companies, you always hear people (mainly engineers) say:
“we are not administrators, we need to get our job done.”
This shortsighted statement is often supported by management, and then you get stuck in the past.
It is time for management and engineers to realize their future is also based on a data-driven approach. Therefore, adding data to a drawing or CAD model, or in the case of PLM, part/process characteristics, becomes the job of an engineer. We have to redefine roles, as in a digital enterprise there is nobody downstream to fix the data. People fixing data issues are too expensive.
I do not want to go digital
Most companies at this time are not ready for a digital enterprise yet. The changing paradigm is relatively new. Switching now to a modern approach cannot be done, either because their culture is still based on the previous century or because they are just in the middle of a standard PLM process, learning to share files within their (global) organization. These companies might develop the attitude:
“I do not want to go digital”
I believe this is ostrich behavior, like saying:
“I want all information printed on paper on my desk so I can work in comfort (and keep my job).”
History shows that hanging on to the past is killing companies. The companies that did not invest in the first electronic wave are probably out of business (unless they never had competition). The same goes for digital. Potentially ten years from now, it will no longer be affordable to work in a traditional way, as labor cost and the speed of information flowing through an organization are going to be crucial KPIs to stay in business.
As Dutch, we are always seeking compromises. It helped our country become a leading trading nation, and due to the compromises, we struggle less with strikes compared to our neighboring countries. Therefore, my proposal for those who do not like digital at this stage: add just a little digital workload to your day-to-day business, preferably stimulated and motivated by your management and promoted as a company initiative. By adding as many relevant properties and as much context as possible to your work, you will be working on the digital future of your company. When the time comes to become digital, it will be much easier to connect your old legacy information to the new digital platform, speeding up the business transformation.
And of course there will be tools
If you are observing what is happening in the PLM domain, you will see more and more tools for data discovery and data cleansing appear on the market. Dick Bourke wrote an introductory article about this topic at the end of last year at Engineering.com: Is Suspect Product Data the Elephant in the Search and Discover Room? Have a read to get interested.
And there are rewards
Once you have more accurate data, you can:
- Find it (saving search time)
- Create reports through automation (saving processing time)
- Apply rules (saving validation and processing time)
- Create analytics (predict the future – priceless)
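The "apply rules" reward in particular only works on accurate data. A small sketch of what such an automated rule check could look like; the part records and the rules themselves are invented examples, not any vendor's validation engine:

```python
# Example part records; in practice these would come from the PLM backbone.
parts = [
    {"id": "B-200", "mass_kg": 4.2, "material": "steel", "released": True},
    {"id": "B-300", "mass_kg": None, "material": "steel", "released": True},
    {"id": "B-400", "mass_kg": 0.8, "material": "",      "released": False},
]

# Each rule is a (description, check) pair; a check returns True when the part passes.
rules = [
    ("mass is filled in",          lambda p: p["mass_kg"] is not None),
    ("material is filled in",      lambda p: bool(p["material"])),
    ("released parts have a mass", lambda p: not p["released"] or p["mass_kg"] is not None),
]

def validate(part: dict) -> list:
    """Return the descriptions of all rules the part violates."""
    return [desc for desc, check in rules if not check(part)]

for part in parts:
    violations = validate(part)
    if violations:
        print(part["id"], "->", violations)
```

Once the properties are there, such checks run automatically on every change, instead of someone downstream discovering the gaps by hand.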
We are in a transition phase in the way PLM is implemented. What is clear: no matter which stage you are in, accurate data is going to be crucial for the future. Use this awareness to keep your company in business.
Finally, I have time to share my PLM experiences with you in this blog. The past months have been very busy as I moved to a new house, and I wanted to do and control a lot of activities myself. Restructuring your house in an agile way is not easy. Luckily, there was a vision of how the house should look. Otherwise, the “agile” approach would have become an approach of too many fixes. Costly, and probably typical for many old construction projects.
Finally, I realized the beauty of IKEA´s modular design and experienced the variety of high-quality products from BLUM (an impressive company in Austria I worked with).
In parallel, I have been involved in some PLM discussions where, in all cases, the connection with the real C-level was an issue. And believe it or not, my blog buddy Oleg Shilovitsky just published a post: Hard to sell PLM? Because nobody gives a SH*T about PLM software. Oleg really starts from the basics, explaining that you do not sell PLM; you sell a business outcome. And in larger enterprises, I believe you sell at this time the ability to do a business transformation, as business is becoming digital, with the customer in the center. And this is the challenge I want to discuss in this post.
The value of PLM at the C-level
Believe it or not, it is easier to implement PLM (in general) than to explain to a CEO why a company needs modern PLM. A nice one-liner to close this post with; however, let me first explain what I mean by this statement and perhaps show why PLM does not seem so attractive at the C-level. I do not want to offend any particular PLM company, consultancy firm or implementer; therefore, allow me to stay at a neutral level.
The C-level time challenge
First, let´s imagine the situation at C-level. Recently I heard an excellent anecdote about people at C-level. When they were kids, they were probably the brightest, able to process and digest a lot of information, making their (school) careers a success. When they later arrived in a business environment, they were probably the ones that could make a difference in their job and for that reason climbed the career ladder fast to reach a C-level position. Then, arriving at that level, they become too busy to dive really deep into the details.
Everyone around them communicates in “elevator speeches,” and information to read must be extremely condensed and easy to understand. As if people at C-level have no brains and should be informed like small kids.
I have seen groups of people working for weeks on preparing the messages for the CEO. Every word is twisted a hundred times: would he or she understand it? I believe the best people at C-level have brains, and they would understand the importance of PLM when someone explains it. However, it requires time if it does not come from your comfort zone.
Who explains the strategic value of PLM?
There are a lot of strategic advisory companies with access to the boardroom, and we can divide them into two groups: the ones that focus on strategy independent of any particular solution, and the ones that concentrate on a strategy while guaranteeing their implementation teams are ready to deploy the solution. Let´s analyze both options and their advice:
Independent of a particular solution
When a company is looking for help from a strategic consultancy firm, you know part of the answer upfront, as every consultancy firm has a preferred sweet spot, based on their principal consultant(s). As a PLM consultant, I would probably imagine the best PLM approach for your company, not being an expert in financials or demographic trends. If the advisory company has a background in accountancy, they will focus their advice on financials. If the company has a background in IT, they will focus their advice on an infrastructure concept saving so much money.
A modern digital enterprise is now the trend, where digital allows the company to connect and interact with the customer and therefore react faster to market needs or opportunities. IoT is one of the big buzzwords here. Some companies grasp the concept of being customer-centric (the future) and adapt their delivery model to it, not realizing the entire organization, including their product definition process, should change too. You cannot push products to the market in the old linear way while expecting modern agile work processes.
Most of the independent strategic consultants will not push for a broader scope, as it is out of their comfort zone. Think for a moment: who are the best strategic advisors that can talk about the product definition process, the delivery process and products in operation and service? I would be happy if you gave me their names in the comments, with proof points.
Related to a particular solution
When you connect with a strategic advisory company with an extensive practice in XXX or YYY, you can be sure the result will be strategic advice containing XXX or YYY. The best approach with ZZZ will not come to the table, as consultancy firms have no intention of investigating in that direction for your company. They will tell you: “With XXX we have successfully transformed (many) other companies like yours, so choose this path with the lowest risk.”
And this is the part that concerns me the most at this time. Business is changing rapidly, and therefore PLM should be changing too. If not, that would be a strange situation. Read about the PLM Identity crisis here and here.
The solution is at C-level (conclusion)
I believe that in the end the future of your company will depend on your DNA, your CEO and the C-level supporting the CEO. Consultancy firms can only share their opinion from their point of view and with their own understanding in mind.
If you have a risk-averse management, you might be at risk.
Doing nothing or following the majority will not bring more competitive advantage.
The awareness that business is global and changing rapidly should be on every company’s agenda.
Change is always an opportunity to get better; still, no outsider can recommend what is best for you. Take control and leadership. For me, it is clear that the product development and delivery process should be a part of this strategy. Call it PLM or something different; I do not care. But do not focus on efficiency and ROI, focus on being able to be different from the majority. Apple makes mobile phones; Nespresso makes coffee, etc.
Think, and use extremely high elevators to talk with your C-level!
In 1999, I started my company TacIT in order to focus on knowledge management. The name TacIT came from the term tacit knowledge: the knowledge an expert has, combining knowledge from different domains and making the right decision based on his or her experience/intuition. Tacit knowledge is the opposite of explicit knowledge, which you can define in rules. In particular, large companies are always looking for ways to capture and share knowledge to raise the tacit knowledge of their employees.
When I analyzed knowledge management in 1999, many businesses thought it was just about installing an intranet. At that time, it became fashionable to have an internal website where people published their knowledge. Wikipedia had not yet been launched. Some people got excited about the intranet capabilities; however, a lot of information remained locked or hidden. What was clear to me at that time was that knowledge management as a bottom-up approach would not work in an organization, for the following reasons:
- In 1999 knowledge was power, so voluntarily sharing your knowledge was considered more or less to reduce your job security. Others might become as skilled as you. A friend of mine was trying to capture knowledge from experts in his domain, and only people close to retirement were willing to speak with him. Has this attitude changed meanwhile?
- It takes time to share your knowledge, and in particular for busy experts this is a burden. They want (or need) to go on to the next job and not spend “useless” time describing what they have learned.
My focus on knowledge management disappeared in 2000 as I got dragged into PLM, with the excuse in mind that PLM should be a kind of knowledge management too.
No knowledge management in PLM
In theory, the picture representing PLM is a circle, where through iterations organizations learn to improve their products and better understand the way their products are perceived and performing in the market. However, the reality was that PLM was used as an infrastructure to transfer and share information, mainly within engineering disciplines. Each department had its own tools and demands. Most companies have silos for PDM, ERP and services, and people have no clue about which information exists within the organization. Most of the time, they only know their own system and, even worse, they are the only ones who know where their data is stored (or hidden, when you talk to colleagues).
When PLM became more and more accepted as the backbone for product information in a company, there was more attention for a structured manner of knowledge management in the context of lessons learned. Quality systems like ISO 900x provide guidance for processes of quality improvement. Various industries have their own quality methodologies (APQP, 8D, CAPA), all to ensure quality improves in a learning organization. 8D and CAPA are examples of issue management, which is a must-do for every PLM implementation. It is the first step in sharing and discovering commonalities and trends related to your product, your processes and your customers. When issues are solved by email and phone calls, the content and lessons learned often remain hidden from the rest of the organization.
Still, storing all information in one PLM system is not what I would call knowledge management. My garbage bin (I had a huge one) also contains all my written notes and thoughts. Would anyone be able to work with my environment? No!
Knowledge Management is an attitude
When organizations really care about knowledge, it should be a top-down guided process. And knowledge is more than storing data in a static manner in a central place. Let´s have a look at how modern knowledge management could work:
In a PLM system you will mainly find structured information, i.e., bills of materials containing parts, documents/CAD models/drawings describing products, catalogs with standard parts, suppliers and, in a modern environment, perhaps even issues related to these information objects and all the change processes that have been performed on parts, products or documents.
This information already becomes valuable if companies spend time upfront on planning and creating the context of the information. This means attributes are important, as is maintaining relationships between different types of information. This is the value a PLM system can bring beyond a standard document management system or a parts database. Information in the right context brings much more value.
For example, a “where used” of a part, not only in the context of a BOM but also in the context of suppliers, all issues, all ECRs/ECOs, projects or customers where it is implemented. It could be any relation, starting from any relevant information object.
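Such a context-aware “where used” is essentially a traversal over typed relations. A minimal sketch of the idea; the relation set and identifiers are invented for illustration, not taken from any PLM system's API:

```python
from collections import deque

# Typed relations: (source, relation, target). Invented example data.
relations = [
    ("B-300", "used_in",     "A-100"),
    ("A-100", "supplied_by", "Supplier-X"),
    ("A-100", "subject_of",  "ECO-042"),
    ("B-300", "subject_of",  "Issue-7"),
]

def where_used(start: str) -> set:
    """Collect everything reachable from a part, across all relation types."""
    graph = {}
    for src, _rel, dst in relations:
        graph.setdefault(src, []).append(dst)
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(where_used("B-300")))
```

Starting from one part, the traversal surfaces the assembly it is used in, the supplier, the change order and the open issue: information in context, exactly what a document management system cannot offer.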
“Do your job as fast as possible and do only what is necessary to deliver now” is often the message from a short-sighted manager who believes that spending time on the “NOW” is more important than spending time on the “FUTURE.”
Managing information so it becomes valuable in the future is an investment that needs to be made in the world of structured data. Once done, a company will discover that this investment has improved the overall performance of the company, as time spent searching is reduced (from 20 % or more to 5 % or less) and people are enabled to reuse instead of reinventing things or, worse, re-experiencing issues.
There is more structured information out there.
Of course, companies cannot wait a few years until structured information becomes usable. Most of the time there is already a lot of information in the various systems the company is using. Emails, the ERP system, the PDM system and file directories may already contain relevant information. Here, modern search-based applications like Exalead or Conweaver (for sure there are more apps in the market; these are the two I am familiar with) will help to collect and connect information in context, coming from various systems. This allows users to see information across disciplines and across the lifecycle of a product.
Still, these capabilities are not really knowledge management, increasing the tacit knowledge of a company.
How to collect tacit knowledge?
Static information collection does not contribute to tacit knowledge; it provides some visibility into what exists and might help with explicit knowledge. Tacit knowledge can only be collected through an active process. People in an organization need to be motivated and stimulated to share their story, which is more than just sharing information. It is the reasoning why certain decisions were taken that helps others to learn. Innovation and learning come from associating information from different domains and creating opportunity and excitement to share stories. This is what modern companies like Google and Apple do, and it is somehow the same way information is shared at the coffee machine. This is the primary challenge: instead of an opportunistic approach to knowledge sharing, you want a reliable process of knowledge sharing. The process of capturing and sharing tacit knowledge could be improved by assigning knowledge agents in a company.
A knowledge agent has the responsibility to capture and translate lessons learned. For that reason, a knowledge agent should be somebody who can capitalize on information and store and publish it in a manner in which the information can be found back in various contexts. The advantage of such a process is that knowledge is obtained in a structured manner. In the modern world, a knowledge agent could be a community owner/moderator, actively sharing and publishing information. Strangely, knowledge agents are often considered overhead, as their immediate value is not directly visible (as with many PLM activities), although the job of a knowledge agent does not need to be a full-time job.
I found a helpful link related to the knowledge management agent here: 7 knowledge management tips. The information is not in the context of product development. However, it is generic enough to consider.
Many companies talk about PLM and knowledge management as equivalents. It should be clear that they are different, although partly overlapping in purpose. It is important to understand that both PLM knowledge and general knowledge management will only happen with a top-down strategy and motivation for the organization, either by assigning individual people to become knowledge agents or by having common processes for all to follow.
I am curious to learn:
- Is knowledge management on your company´s agenda?
and if yes:
- How is knowledge management implemented?
Last week I attended the PI conference in Munich, which has become a tradition since 2011. Personally, I have been busy moving houses, so blogging has not been my priority recently. However, the PI conference for me is still one of the major events happening in Europe: excellent for networking and good for understanding what is going on in the world of PLM. Approximately 200 delegates from various industries attended. Therefore, the two days were good for finding and meeting the right people.
As the conference has many parallel sessions, I will give some of the highlights here. The beauty of this conference is that all sessions are recorded. I am looking forward to catching up with the other sessions in the upcoming weeks. Here are some highlights of the sessions that I attended.
Some of the highlights
The first keynote session was from Mark Gallagher, with the title: High-Performance Innovation in Formula One. Mark took us through the past and current innovations that have taken place in F1. I was involved some years ago in a PLM discussion with one of the F1 teams.
I believe F1 is a dream for engineers and innovators. Instead of a long time to market, in F1 it is all about bringing innovation to the team as fast as possible. And it is interesting to see that IoT, direct feedback from the car during the race, is already a “commodity” in F1 (see the picture). Now we need to industrialize it.
Peter Bilello (CIMdata) took us through The Future Sustainability of PLM. One of the big challenges for PLM implementations is to make them sustainable. Currently, we see many PLM implementations reaching a state of obsolescence, no longer able to support modern business for various reasons.
Change of owner, mergers, a different type of product, the importance of software: all of these reasons can become a significant challenge when your PLM implementation has been tuned to support the past.
How to be ready for the future? Peter concluded that companies need to pro-actively manage their systems and that PLM platforms might give an answer for the future. However, these platforms need to be open and rely on standards, to avoid locking data into the platform.
Final comment: to stay competitive in the future, companies need to have an adequate strategy and vision.
Gary Knight, PLM Business Architecture Manager at Jaguar Land Rover, gave an impressive presentation about the complete approach JLR has executed. Yes, there is the technical solution. However, the required cultural change and business change to align the vision with execution on the floor are just as important: making people enthusiastic and having them take part in realizing the future.
The traditional productivity dip during a business transformation has been well mitigated by intensive change management, allowing the company to keep the performance level equal without putting its employees under big pressure. Many companies I have seen could learn from that.
PLM and ERP
In the afternoon, I moderated a focus group related to PLM and ERP integration challenges. An old-fashioned topic, you might think. However, the room was full of people (too many), all hoping to find the answers they need. Some conclusions:
- Understanding the difference between owning data and sharing data, where sharing still requires certain roles to be responsible for particular data sets.
- First define the desired process for how information should flow between roles in the organization, without thinking in tools. Once a common agreement exists, the technical realization will not be the bottleneck.
- PLM and ERP integrations vary per primary process (ETO, BTO, CTO and MTS). In each of these processes, the interaction between PLM and ERP will be different due to timing issues or the delivery model.
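The differences per primary process can even be captured as configuration rather than hard-coded integration logic. A small sketch of the idea; the handover points are my own simplified interpretation of the four delivery models, not a standard definition:

```python
# When the BOM handover from PLM to ERP typically happens, per primary process.
# A simplified, illustrative mapping - real timing depends on the company.
handover_point = {
    "ETO": "after customer-specific engineering release",
    "BTO": "at order entry, reusing released product definitions",
    "CTO": "at configuration, resolved from released options",
    "MTS": "at product release, ahead of any order",
}

def plan_handover(process: str) -> str:
    """Describe where the PLM-to-ERP handover sits for a given primary process."""
    if process not in handover_point:
        raise ValueError(f"unknown primary process: {process}")
    return f"{process}: hand over BOM to ERP {handover_point[process]}"

print(plan_handover("ETO"))
```

The point of the focus-group conclusion holds here too: agree on the process first, and the technical realization (a lookup like this, or a full integration platform) follows.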
Irene Gustafson from Volvo Cars explained the integration concept with partners/suppliers based on Eurostep´s ShareAspace. I wrote about this concept in my blog post: The weekend after PDT2015. Meanwhile, the concept of a collaboration hub instead of direct integration between an OEM and its suppliers has gained more traction.
Irene Gustafson made some interesting closing statements:
- Integration should not be built into the internal structure, as it takes away flexibility
- A large portion of collaborative data is important here and now; in the long term, only a limited part of that data will need to be saved
Eurostep announced their upcoming releases based on different collaboration scenarios: InReach, InControl and InLife. These packages allow faster and more OOTB deployment of their collaboration hub (based on the PLCS standard).
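The hub idea can be sketched in a few lines. This is a minimal, hypothetical illustration (class and method names are my own, not ShareAspace's actual API): the OEM publishes a data package once to a shared hub, and each supplier pulls from the hub, instead of maintaining N point-to-point integrations.

```python
# Hypothetical sketch of a collaboration hub: publish once, fetch many.
# Not an actual ShareAspace interface; names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class CollaborationHub:
    packages: dict = field(default_factory=dict)  # package name -> payload

    def publish(self, name: str, payload: dict) -> None:
        """OEM side: share one data set with all connected partners."""
        self.packages[name] = payload

    def fetch(self, name: str) -> dict:
        """Supplier side: pull the shared data set from the hub."""
        return self.packages[name]


hub = CollaborationHub()
hub.publish("door-assembly-rev-B", {"parts": 42, "status": "released"})
print(hub.fetch("door-assembly-rev-B")["status"])
```

The design choice it illustrates: each party integrates with the hub once, so adding a new supplier does not require a new OEM-side integration.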
Digital Transformation at Philips and GE
Anosh Thakkar, Chief Technology Officer at Philips, explained their digital business transformation from pushing products to the markets towards becoming a HealthTech company, leaving the lighting division behind. Philips used three “transformers” to guide the business change:
- From Complex to Simple: aligning businesses to 4 simplified business models (instead of 75) and one process framework supported by core IT platforms, reducing customizations and the number of applications (from 8,000 to 1,000)
- From Analog to Digital: connecting customer devices through a robust cloud-based platform, a typical example of modern digital business
- From Products to Solutions: focusing on how end-users could work in an ideal way instead of just delivering a device (the Experience economy)
Ronan Stephan, chief scientist of GE, presented the digital business transformation GE is working on. Ronan took us through the transformation models of Amazon, Apple, and Google, explaining how their platforms and the insights coming from platform information have allowed these companies to be extremely successful. GE is aiming to be the leader in the digital industry, connecting the company with all its customers (aerospace, transportation, power & healthcare) on its Predix platform.
On the second day, I presented to a relatively small audience (5 parallel sessions, all interesting) a session with the title: The PLM Identity crisis. Luckily there were still people at the conference who feel something is changing in PLM. My main message was that PLM, like everything else in the current world, suffers from rapidly changing business models (hardware products towards software-driven systems) and a lack of time to distinguish between facts and opinions: the world of one-liners. In my opinion, existing PLM concepts are no longer enough; however, the PLM market is still mainly based on classical linear thinking, as my generation (the baby boomers) is still leading the business. Have a look at the presentation here, or find a nice complementary related post from my blog buddy Oleg Shilovitsky here.
As I am in the middle of moving houses, now in no man’s land, I do not have the time or a comfortable environment to write a more extensive review this time. Perhaps I will come back with some other interesting thoughts from this conference after having seen more recordings.
My observation after the conference:
A year ago I wrote The Weekend After Product Innovation 2015 in Düsseldorf, where managing software in the context of PLM was the new topic. This year you could see the fast change, as now IoT platforms and M2M communication were the main themes. The digital revolution is coming …