
One week ago, Yoann Maingon wrote an innocent post with the question: Has FFF killed? The question related to a 2014 problem at GM, where a changed part caused fatal accidents.

The discussion was started by Yoann, and here is my short extract. Assuming this problem was a configuration management issue, Yoann indicated that the problem might be related to the fact that ERP-systems do not carry a revision on the part number – leading to an unnoticed change. Therefore, he assumes there is a disconnect between the PLM-side (where we have parts with multiple lifecycle states and revisions) and ERP (where we have an industrial lifecycle – prototype/production).

He posted his thoughts, and then LinkedIn exploded (currently 116 comments), which shows it is a topic of significant concern in our community. If you read the comments, there are different viewpoints:

  • What does FFF really imply?
  • What about revisions of parts?
  • What are the best practices?

Let’s investigate these viewpoints with some comments.

What does FFF really imply?

When we talk about FFF in engineering, we mean Form, Fit and Function – the three primary characteristics to describe a part (source: Wikipedia):

  • Form refers to such characteristics as external dimensions, weight, size, and visual appearance of a part or assembly. This is the element of FFF that is most affected by an engineer’s aesthetic choices, including enclosure, chassis, and control panel, that become the outward “face” of the product.
  • Fit refers to the ability of the part or feature to connect to, mate with, or join to another feature or part within an assembly. The “fit” allows the part to meet the required assembly tolerances to be useful.
  • Function is a criterion that is met when the part performs its stated purpose effectively and reliably. In an electronics product, for example, a function can depend on the solid-state components used, the software or firmware, and quite often on the features of the electronics enclosure selected.

One of the comments in Yoann’s post referred to Safe/Unsafe as a potential functional characteristic. I think this addition is not needed. Safety should be a requirement for the part, not a characteristic.

FFF was and still is an approach for engineers to decide if a new, improved version of the part gets a revision or needs a new part number.

I think before we dive deeper into the other viewpoints, it is crucial to define the part number a little more.

In a correct PLM data model, there are two types of part numbers. First, there is the internal part number that your company uses inside its engineering Bill of Materials to identify a part. This part number can be a meaningless identifier, as it only has to provide uniqueness inside the company.

In 2015, I wrote several posts related to best practices and data modeling for PLM.

The internal part number can specify a part that needs to be manufactured according to specification, or a part that needs to be purchased from an available supplier/manufacturer. The second type, the manufacturer part number, is most of the time a meaningful number (6 – 7 characters), as these parts need to be ordered by your company. The manufacturer part number is the SKU for the manufacturer. As you can imagine, the manufacturer’s catalog does not mention a revision. In graphics, see the image below:

Your company might sell Product MP-323121 (note: the ID is meaningful to help the customer order the product).

Internally there is a related EBOM that specifies the product. The EBOM top part is O122 (note: here, we can use a meaningless identifier as all is digitally connected).

For the manufacturing of O122, we need to resolve the EBOM according to its specifications. Therefore, for Part O124, the company needs to decide to purchase from their approved manufacturers either part ABC-21231 or XYZ-88818 (note: again, a meaningful ID as these companies are not digitally connected).

Now coming back to the FFF-discussion. For the orange parts, with a meaningful ID, no revision exists. However, as long as Assembly O122 remains 100% FFF-compatible, the Product ID MP-323121 will not change. This allows your company to optimize the EBOM and/or MBOM while keeping 100% compatibility with the outside world. (Note: the same principle applies to the two manufacturers for Part O124.)

In case Top Assembly O122 has new or changed parts – what should happen there?

At that moment, the definition has changed. The definitions, most of the time described in documents/drawings/models, are information related to the BOM. Therefore, the Top Assembly O122 should get a new identifier. There is no need to call it a revision; it is a new dataset in the PLM-system, again with a meaningless identifier, as we are digitally connected.
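
To make the part-numbering model above more tangible, below is a minimal sketch in Python. It is my own illustration of the principle, not the data model of any specific PLM-system; all class, field and function names (Product, InternalPart, replace_ebom_top, …) are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ManufacturerPart:
    """Purchased part: meaningful ID (the manufacturer's SKU), no revision."""
    sku: str                      # e.g. "ABC-21231" or "XYZ-88818"
    manufacturer: str

@dataclass
class InternalPart:
    """Internal part: meaningless ID, only unique inside the company."""
    part_id: str                  # e.g. "O124"
    approved_sources: List[ManufacturerPart] = field(default_factory=list)

@dataclass
class Assembly:
    """EBOM top part, again with a meaningless identifier."""
    part_id: str                  # e.g. "O122"
    children: List[InternalPart] = field(default_factory=list)

@dataclass
class Product:
    """The sellable product: meaningful ID, so the customer can order it."""
    product_id: str               # e.g. "MP-323121"
    ebom_top: Assembly

def replace_ebom_top(product: Product, new_top: Assembly,
                     fff_compatible: bool) -> Product:
    """Swap the internal EBOM structure underneath a product."""
    if fff_compatible:
        # FFF-compatible change: the EBOM is optimized internally,
        # but the outside world keeps ordering the same product ID.
        return Product(product.product_id, new_top)
    # Not FFF-compatible: the outside world must notice the change.
    raise ValueError("Non-FFF-compatible change: assign a new product identifier")
```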

What about revisions of parts?

Of course, the management of changes existed long before PLM-systems were introduced.

The specifications of a part were defined in drawings. The drawing contained all the information, not only the geometry definitions, but also specifications on how to manufacture the part.

For complex products, a considerable set of consistently related drawings would be released to manufacturing – a release process with physical signatures on it.

At the same time, there was no discussion: the drawing represents the part. And as there was no digital connection, part numbers/drawing numbers were meaningful, often with the format of the drawing as part of the identifier.

In case changes were needed, for example, fixing a dimension or tolerance as discovered during manufacturing, the drawing had to be revised to remain consistent. First, in the original drawing, the issue or change was marked in red (redlining). Then engineering had to create a new version of the drawing.

Depending on the impact of the change (here the FFF-principle comes in again), people decided if a new part number was needed (an FFF-change) or if the change only required an update of the drawing(s), meaning a revision. If the difference was small (for example, adding a missing annotation), it could be called a minor change, all to be reflected in the drawing number, which equals the part number in this approach. So, when we talk about revisions of parts, we are actually talking about a document change.
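
The decision rule described above can be captured in a small sketch – again my own simplification of the classic drawing-centric practice, with hypothetical helpers (new_part_number, next_revision):

```python
import itertools
import string
from enum import Enum, auto

class ChangeImpact(Enum):
    FFF_CHANGE = auto()   # form, fit or function affected
    MAJOR = auto()        # definition changed, still FFF-compatible
    MINOR = auto()        # e.g. a missing annotation added

_counter = itertools.count(1000)

def new_part_number() -> str:
    # Hypothetical generator for a fresh part number.
    return f"PN-{next(_counter)}"

def next_revision(revision: str) -> str:
    # Classic letter revisions: A -> B -> C ...
    return string.ascii_uppercase[string.ascii_uppercase.index(revision) + 1]

def classify_change(part_number: str, revision: str, impact: ChangeImpact) -> str:
    """Return the identifier after a change; the drawing number equals
    the part number in this drawing-centric approach."""
    if impact is ChangeImpact.FFF_CHANGE:
        # Interchangeability is broken: a new part number is required.
        return new_part_number()
    if impact is ChangeImpact.MAJOR:
        # Same part number, next revision of the drawing set.
        return f"{part_number} rev {next_revision(revision)}"
    # Minor change: reflected as a minor index on the same revision.
    return f"{part_number} rev {revision}.1"
```

For example, classify_change("A3-1234", "B", ChangeImpact.MAJOR) returns "A3-1234 rev C", while an FFF-change yields a brand-new part number.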

A lousy practice stemming from that approach is that manufacturing often just redlines a drawing and keeps the redlined drawing as their source, because it is too time-consuming or difficult to update the source drawing(s) through a change process. Engineering is not aware of this change, and when a later change comes through from engineering, these “fixes” might be missed as there is no traceability.

Generic example of a PLM data model and its relations

When PLM-systems were introduced, companies of course did not want to disrupt their existing ways of working. Therefore, they asked the PLM-editors to enable revisions on parts, and so the PLM-editors did (and still do).

Decoupling of parts and documents in a PLM data model

However, if you want to use the PLM-system in the best manner, you need to “decouple” the concept that part number equals drawing number, and start using meaningless identifiers, as the relations between parts and drawings are managed in the PLM-system through relational links.
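
A minimal sketch of this decoupling, with the same caveat that the names (Part, Document, PLMStore) are hypothetical: the part and the document are separate records, the revision lives on the document, and their relationship is just a link managed by the PLM-system:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Part:
    part_id: str          # meaningless identifier, e.g. "P-000482"

@dataclass(frozen=True)
class Document:
    doc_id: str           # meaningless identifier, e.g. "D-000731"
    revision: str         # the revision lives on the document, not on the part

class PLMStore:
    """Keeps part-document relations as links, instead of encoding the
    drawing number inside the part number."""

    def __init__(self) -> None:
        self.links: set[tuple[str, str]] = set()    # (part_id, doc_id) pairs

    def attach(self, part: Part, doc: Document) -> None:
        self.links.add((part.part_id, doc.doc_id))

    def documents_of(self, part: Part) -> list[str]:
        return [doc_id for part_id, doc_id in self.links
                if part_id == part.part_id]

store = PLMStore()
store.attach(Part("P-000482"), Document("D-000731", revision="B"))
```

With this decoupling, revising a drawing or adding a second document never forces a new part identifier – and vice versa.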


What are the best practices?

As some people mentioned in their comments to Yoann’s post: why do we have to answer this question when everything is already well understood and described in best practices? I agree with that statement. Best practices exist – so how do you obtain them?

First, there is the whole framework of Configuration Management, which existed long before PLM-systems were introduced. If you follow its methodology, you can be (almost) guaranteed that your information is consistent and correct. Configuration Management is crucial in areas where the impact of an error is enormous, like the GM-example Yoann referred to. Companies in the Aerospace and Defense industry are typically the ones that have strict configuration management in place.

Configuration management does not come for free. It requires an investment in skills, potentially a change in ways of working, and it introduces overhead. Manufacturing companies that create less “risky” products often focus more on optimizing (= reducing) the cost of their internal processes instead of investing in proper methodologies to manage consistency.

If you want to learn more about CM, investigate the Institute of Process Excellence (IPX), the founders of the CM2 framework for Enterprise Configuration Management, and much more. Note: their knowledge does not come for free, which I can understand. However, it also creates a barrier for a company’s further investment in CM, as this kind of strategic investment is hard to sell at the management level by individuals in a company.

In the context of CM, I advise you to follow Martijn Dullaart, who is quite active in our social community. His latest blog post related to this thread is: It’s about Interchangeability and Traceability

With the introduction of PLM-systems, these companies and the PLM-editors created the opportunity to implement configuration management in their systems.

The data inside the system would be the “single version of the truth.” Unfortunately, this was most of the time just a sales strategy, falsely giving the impression that information is now under control. Last year I wrote several posts related to the relation between PLM and CM, starting with PLM and Configuration Management – a happy marriage?

If you are interested in another resource for information related to these topics, have a look at the website of Jörg Eisenträger, who also collected his best practices for PLM and CM for sharing (thanks to Paul van der Ree for the link).

Don’t expect best practices from your PLM-vendors as their role is to sell software. It is the continuous discussion between:

  • A PLM-system that forces companies to work according to embedded methodology (hard to sell/implement but idealistically correct)

And

  • A flexible PLM-system that allows you to build and configure anything (easy to sell/challenging to implement correctly, depending on “wise” decisions)

The Future

Even though most companies are working drawing-centric, with or without a linked PLM-backbone for BOM-management, the next challenge is to evolve to model-based practices. The current CM-practices still talk about documents, although documents in that context are already electronic datasets. The future of the model-based enterprise, however, evolves around connected models – 3D models, but also simulation and software models – with different lifecycles and paces of change. For the model-based enterprise, we need to develop digital best practices that guarantee the same level of quality, however executed and/or supported by Artificial Intelligence (AI). AI is needed as human beings cannot physically analyze and understand the full impact of a change in such an environment.

Conclusion

The FFF-discussion illustrates that building a consistent framework within PLM is not an easy goal to achieve. My blog buddy Oleg Shilovitsky would claim that we consultants create the complexity. PLM-editors will never solve this complexity; it is up to your company to invest in knowledge to understand why and how to reduce it. With this post and the related links and discussions, I hope more clarity will help you to make “wise” decisions.

This post is based on a mix of interactions I had in the last two weeks in my network, mainly on LinkedIn. First, I enjoyed the discussion that started around Yoann Maingon’s post: Thoughts about PLM Business models. Yoann is quite seasoned in PLM, as you can see from his LinkedIn profile, and we have had interesting discussions in the past – recently about Ganister PLM, a new PLM-system he is developing based on a flexible graph database.

Perhaps in that context, Yoann was exploring the various business models. Do you pay for the software (and maintenance), do you pay through a subscription, and what about a modular approach versus a full license for all the functionality? All these questions made me think about the various business models that I have encountered and how hard it is for a customer to choose the optimal solution. And is there space for a new type of PLM? Is there space for free PLM? Some of my thoughts here:

PLM vendors need to be profitable

One of the most essential points to consider is that, whatever PLM solution you are aiming to buy, you should make sure your PLM vendor has a profitable business model. Once you have started with a PLM solution, it is your company’s IP that is stored in this environment, and you do not want to change your PLM system every few years. Switching PLM systems would only be affordable if PLM systems stored their data in a standard format – more on that in the PLM and standards paragraph below.

For the moment, you cannot state that PLM vendors endorse standards. None of the real PLM vendors have a standardized data model. Perhaps closest to standards is Eurostep, who have based their ShareAspace solution on top of the PLCS (ISO 10303-239) standard. However, ShareAspace is positioned more as a type of middleware, connecting OEMs/Owners/Operators and their suppliers to benefit from standardized connectivity.

Coming back to the statement: PLM vendors need to be profitable to provide a guarantee for the future of your company’s data. The major PLM vendors are now profitable, as during a consolidation phase starting 15 years ago, a lot of non-profitable PLM vendors disappeared. MatrixOne, Agile, and Eigner & Partner PLM are the best-known companies that were bought for either their technology or their market share. In that context, you might also look at Onshape.

Would they be profitable as a separate company, or would investors give up? To survive, you need to be profitable, so giving software away for free is not a good sign (see the PLM for free paragraph below), as a company needs continuity.

PLM startups

In the past 10 years, I have seen and evaluated several new PLM companies. None of them really changed the PLM paradigm; most of them were still focusing on being an engineering collaboration tool. Several of these companies have in their visionary statement that they are going to be the “Excel killer.” We all know Excel has the best user interface and capabilities to manipulate a collection of metadata.

Very popular is the BOM in Excel, extracted from the CAD-system (no need for an “expensive” PDM or PLM), or the BOM used to share with suppliers and stakeholders (ERP is too rigid, purchasing does not work with PDM).

The challenge I see here is that these startups do not bring real new value. The cost of manipulating Excels is a hidden cost, and companies relying on Excel communication are the type of companies that do not have a strategic point of view. This is typical for Small and Medium businesses where execution (“let’s do it”) gets all the attention.

PLM startups often collect investors’ money because they promise to kill Excel, but is Excel the real problem? Modern PLM is about data sharing, which is an attitude change, not necessarily a technology change from Excel tables to (cloud) shared tables. However, will one of these new “Excel killer” PLMs be disruptive? I don’t think so.

PLM disruption?

A week ago, I read an interview with Clayton Christensen (thanks, Håkan Kårdén), which I shared on LinkedIn. Clayton Christensen was the father of the Disruptive Innovation theory, and I have cited him several times in my blogs. His theory is, in my opinion, fundamental to understanding how traditional businesses can be disrupted. The interview took place shortly before he died, at the age of 67, due to complications caused by leukemia.

A favorite part of this interview is where he restates what Disruptive Innovation really is, as we often talk about disruption without understanding the context, just echoing other people:

Christensen: Disruptive innovation describes a process by which a product or service powered by a technology enabler initially takes root in simple applications at the low end of a market — typically by being less expensive and more accessible — and then relentlessly moves upmarket, eventually displacing established competitors. Disruptive innovations are not breakthrough innovations or “ambitious upstarts” that dramatically alter how business is done but, rather, consist of products and services that are simple, accessible, and affordable. These products and services often appear modest at their outset but over time have the potential to transform an industry.

Many of the PLM startups dream of and position themselves as the new disruptor. Will they succeed? I do not believe so if they only focus on replacing Excel; a different paradigm is needed. Voice control and analysis, perhaps (“Hey PLM, if I change Part XYZ, what will be affected?”).

This would be disruptive and open new options. I think PLM startups should focus here if they want my investment money.

PLM for free?

There are some voices that PLM should be free, in an analogy to software management and collaboration tools. There are so many open-source software management tools, so why not use them for PLM? I think there are two issues here:

  • PLM data is not like software data. A lot of PLM data is based on design models (3D CAD / simulation), which is different from software. Designs are often not as modular as software, for various reasons. Companies want to be modular in their products, but do they have the time and resources to reinvent their existing products? For software, these costs are much lower, as it is only a brain exercise. For hardware, the impact is significant – bringing me to the second point.
  • The cost of change for hardware is entirely different compared to software. Changing software does not have an impact on existing stock or suppliers and, therefore, can be implemented once tested for its purpose. A hardware change impacts the existing production process: do we first use up the old parts before introducing the change, or do we accept the cost of scrap? Is our supply chain, and are our production tools, ready to deliver continuity for the new version? Hardware changes are costly, and you want to avoid them. Software changes are cheap; therefore, design your products to be configurable based on software (for example, Tesla controlling the available features through software).

Now imagine that, with enough funding, you could provide a PLM for free. Because of ease of deployment, this would very likely be a cloud offering – easy and scalable. However, all your IP is in that cloud too. Let’s imagine that the cloud is safer than on-premise, so it does not matter in which country your data is hosted (does it?).

Next, the “free” PLM provider starts asking a small service fee after five years, as the promised ROI of the model hasn’t delivered enough value and the shareholders have become anxious. Of course, you do not like to pay the fee. However, where is your data, and what happens when you do not pay?

If the PLM provider switches you off, you are without your IP. If you ask the PLM provider to hand over your data, what will you get? A blob of XML-files – anything you can use?

In general, this is a challenge for all cloud solutions.

  • What if you want to stop your subscription?
  • What is the allowed Exit-strategy?

Here I believe customers should ask for clarity, and perhaps these questions will lead to a renewed understanding that we need standards.

PLM and standards

We had a vivid discussion in the blogging community in September last year. You can read more related to this topic in my post PLM and the need for standards, which describes the aspects of lock-in and the need for openness.

Finally, a remark related to the PLM-acronym. Another interesting discussion started around Joe Barkai’s post: Why I do not do PLM. Read the comments and the various viewpoints on PLM there. It is clear that the word PLM unites us all; however, the interpretations differ.

If someone in the street asks me what my profession is, I never mention I do PLM. I say: “I assist mainly manufacturing companies in redesigning their business processes using best practices and modern digital technologies.” The focus is on the business value, not on the ultimate definition of PLM.

Conclusion

There are many business aspects related to PLM to consider. Yoann Maingon’s post started the thinking process, and we ended up with the PLM-definition. It all illustrates that being involved in PLM is never a boring journey. I am curious to learn about your journey and where we meet.

At the beginning of this week, I attended the 9th edition of the PI conference in London. Where it started as a popular conference with 300 – 400 attendees at its best, we were now back to a smaller number of approximately 100 attendees.

It illustrates that PLM as a standalone topic no longer attracts a broad audience, as Marketkey (the organizer of the conference) confirms. The intention is that future conferences will focus on the broader scope of PLM, where business transformation will be one of the main streams.

In this post, I will share my highlights of the conference, knowing that other sessions might have been valuable too, but I had to make a choice.

It is about people

Armin Prommersberger, CTO from DIRAC and the chairman of the conference, made a great point: “What we will discuss in the upcoming two days, it is all about people not about technology.”

I am not sure if this opening influenced the mood of the conference, but when I look back, this was indeed the central theme: it is all about how we deal with people when explaining, implementing and justifying PLM.

AI at the Forefront of a Digital Transformation

Muhannad Alomari from R2 Data Labs, a separate unit within Rolls-Royce set up to explore and provide data innovation, started with his keynote speech sharing the AI initiatives within his team.

He talked about several projects where AI will become crucial.

For example, the EHM program related to engine behavior: how to detect anomalies, how to establish predictive maintenance, and how to maximize the time an airplane engine is in operation. Interesting to mention is that, as Muhannad explained, most simulation models are simplified and not accurate enough to discover anomalies.

Modeling in the PLM world with feedback from reality

Machine learning and feedback loops are crucial to optimize the models both for the discovery of irregularities and, of course, to improve understanding of the engine behavior and predict maintenance. Currently, maintenance is defined based on the worst-case scenario for the engine, which in reality, of course, will not be the case for most engines. There is a lot (millions) to gain here for a company.

Interesting to mention is that Muhannad gave a realistic view of the current status of Artificial Intelligence (AI). AI is currently still dumb – it is a set of algorithms that need to be adapted whenever new patterns are discovered. Deep learning is still not there – currently, we still need human beings for that.

This was in contrast with the later session from Kalypso, titled Supercharge your PLM with advanced analytics. It was a typical example of the big difference between a realistic story (R2 Data Labs) and what is sold by PLM vendors or implementers. Kalypso introduced Product Lifecycle Intelligence (PLI) – you can see the dream on the left (click on the image to enlarge).

Combine PLM with Analytics, and you have Intelligence. My main comment: knowing from the field that in most companies the first three phases lack data quality and consistency, any “intelligence” will probably be based on unreliable sources. That is not an issue if you are working in the domain of politics; however, when it comes to direct cost and quality implications, it can be a significant risk. We still have a way to go before we have a reliable PLM data backbone for analytics.

Keeping PLM Momentum after a Successful Campaign

Susanna Mäentausta from Kemira in Finland gave an exciting update on their PLM project. Where in 2019 she shared with us their PLM roadmap (see my 2019 post: The weekend after PI PLMx London 2019), this time Susanna shared how they are keeping the PLM momentum.

Often PLM implementations are started based on a hypothetical business case (I talked about this in my post The PLM ROI Myth). But then, when you implement PLM, you need to take care to provide proof points to motivate the management. And this is exactly what the PLM team at Kemira has been doing. Management often believes that after the first investment the project is done (“We bought the software – so we are done”); however, the business and process change that will deliver the value goes unreported.

Susanna shared with us how they defined measurable KPIs for two reasons. First, to show the management that there are business progress and benefits, even though it is a journey. And second, the facts are used to kill the legends that “before PLM, we were much faster or more efficient.” These types of legends are often expressed loudly by persons who consider PLM as overhead (killing their freedom) instead of a way to be more efficient in business. In the end, for a company, the business is more important than a person’s beliefs.

On the question of what she would have done better with hindsight, Susanna answered: “Communicate, communicate, communicate.” A response I fully support, as PLM teams are often so busy completing their day-to-day work that there is no spare time left for communication – crucial for achieving a business change.

My takeaway: PLM needs to be facts-based during implementation and support, combined with the understanding that we are dealing with people and their emotions too. Both need full attention.

Accelerating Digitalization at Stora Enso

Samuli Savo, Chief Digital Officer at Stora Enso, explained the principles of innovation related to digitalization at his company. Stora Enso, a Swedish/Finnish company, historically one of the largest forestry companies in the world as well as one of the most significant paper and packaging producers, is working on a transformation to become the renewable materials company. For me, he made two vital points on how Stora Enso’s digitalization journey is organized.

He pleads for experimentation funded by corporate, as in the experimental stage it does not make sense to demand a business case. First DO and then ANALYZE, where many companies have the policy to first ANALYZE and then DO, killing innovative thinking.

The second point was the active process of challenging startups to solve business challenges they foresee. Combined with a governance process for startups, this allows these companies to be supported and become embedded within member companies of the Combient Foundry, like Stora Enso. By doing this in a structured way, the outcome should lead to innovation.

I was thinking about the hybrid enterprise model that I have been explaining in the past. Great story.

Cyber-security and Future Mobility

Out of interest, I followed the session from Madeline Cheah, Cybersecurity Innovation Lead at HORIBA MIRA. She gave an excellent and well-structured overview. Madeline leads the cybersecurity research program, and part of this job is investigating ways to prevent vehicles from being attacked – in particular connected and autonomous vehicles. How to keep them secure?

She discussed the known gaps and the cybersecurity implications of future mobility, so extensive that I even doubted whether there will ever be an autonomous vehicle on the road – so much still to define and explore. She looked at it from the perspective of the Internet of Everything, where Everything is divided into Things, Data, Processes, and People. Still a lot of work to do; see the image below.

Good Times Ahead: Delay Mitigation Through a Plan for Every Part

Ian Quest, director at Quick Release, gave an overview of what their company aims to be. You could translate it as the plumbers of the automotive industry. Where in the ideal world information should flow from design to release, there are many bottlenecks, leakages and hiccups that need to be resolved, as the image shows.

Where their customers often do not have the time and expertise to fix these issues, Quick Release brings in various skillsets and common sense. For example, how to deal with the Bill of Materials, Configuration Management, and many other areas that you need to address with methodology first instead of (vendor-based) technology. I believe there is a significant need for this type of company in the PLM-domain.

The second part, presented by Nick Solly, focused on their QRonos tool and was perhaps a little too much about the capabilities of the tool. Ian Quest, in his introduction, had already made the correct statement:

The QRonos tool, which is more or less a reporting tool, illustrates again that when people care about reliable data (planning, tasks, parts, deliverables, …), you can improve your business significantly by creating visibility of delays or bottlenecks. The value lies in measurable activities and, from there, learning to predict or improve – see R2 Data Labs, Kemira and the PLI dream.

Conclusion

It is clear that a typical PLM conference is no longer a technology festival – it is about people. People are trying to change or improve their business. Trying to learn from each other, knowing that the technical concepts and technology are there.

I am looking forward to the upcoming PI events where this change will become more apparent.


In my previous post, I shared my observations from the past 10 years related to PLM. It was about globalization and digitization becoming part of our daily business. In the domain of PLM, the coordinated approach has become the most common practice.

Now let’s look at the challenges for the upcoming decade, as in my opinion, the next decade is going to be decisive for people, companies and even our current ways of living. So let’s start with the challenges, from easy to difficult.

Challenge 1: Connected PLM

Implementing an end-to-end digital strategy, including PLM, is probably, business-wise, the biggest challenge. I described the future vision for PLM to enable the digital twin – How PLM, ALM, and BIM converge thanks to the digital twin.

Initially, we will implement a digital twin for capital-intensive assets, like satellites, airplanes, turbines, buildings, plants, and even our own Earth – the most valuable asset we have. To have an efficient digital continuity of information, information needs to be stored in connected models with shared parameters. Any conversion from format A to format B will block the actual data from being used in another context – therefore, standards are crucial. When I described the connected enterprise, this is the ultimate goal to be reached in 10 (or more) years. It will be data-driven and model-based.

Getting to connected PLM will not be just the next step in an evolution. It will be disruptive for organizations to maintain and optimize the past (coordinated) and meanwhile develop and learn the future (connected). Have a look at my presentation at the PLM Roadmap PDT conference to understand the dual approach needed to maintain “old” PLM and work on the future.

Interestingly, my blog buddy Oleg Shilovitsky also looked back on the past decade (here) and forward to 2030 (here). Oleg looks at these topics from a different perspective; however, I think we agree on the future, quoting his conclusion:

PLM 2030 is a giant online environment connecting people, companies, and services together in a big network. It might sound like a super dream. But let me give you an idea of why I think it is possible. We live in a world of connected information today.


Challenge 2: Generation change

At this moment, large organizations are mostly organized and managed by hierarchical silos, e.g., the marketing department, the R&D department, Manufacturing, Service, Customer Relations, and potentially more.

Each of these silos has its P&L (Profit & Loss) targets and is optimizing itself accordingly. Depending on the size of the company, there will be various layers of middle management. Your level in the organization depends most of the time on your years of experience and visibility.

The result of this type of organization is the lack of the “horizontal flow” crucial for a connected enterprise. Besides, the top of the organization is currently full of people educated in and thinking in linear/analog ways, not fully understanding the full impact of digital transformation on their organization. So when will the change start?

In particular, in modern manufacturing organizations, the middle management needs to transform and dissolve as empowered multidisciplinary teams will do the job. I wrote about this challenge last year: The Middle Management dilemma. And as mentioned by several others – It will be: Transform or Die for traditionally managed companies.

The good news is that the old generation is retiring in the upcoming decade, creating space for digital natives. To make it a smooth transition, the experts currently working in the silos will be missed for their experience – they should start coaching the young generation now.


Challenge 3: Sustainability of the planet

The biggest challenge for the upcoming decade will be adapting our lifestyles/products to create a sustainable planet for the future. While mainly the US and Western Europe have been building a society based on unlimited growth, the effect of this lifestyle has become visible to the world. We consume with money as the only limit and create waste and landfill (plastics and more) from which the earth will not recover if we continue this way. When I say “we,” I mean the group of fortunate people that grew up in a wealthy society. If you want to discover how blessed you are (or not), just have a look at the global rich list to determine your position.

Now, thanks to globalization, other countries are developing their economies too and becoming wealthy enough to replicate the US/European lifestyle. We are overconsuming the natural resources this earth has, and we drop them as waste – preferably not in our backyard, but either in the ocean or in less wealthy countries.

We have to start thinking circular, and PLM can play a role in this. From linear to circular.

In my blog post related to PLM Roadmap/PDT Europe – day 1,  I described Graham Aid’s (Ragn-Sells) session:

Enabling the Circular Economy for Long Term Prosperity.

He mentioned several examples where traditional thinking just leads to more waste, instead of starting from the beginning with a sustainable model to bring products to the market.

Combined with our lifestyle, there is a debate on how the carbon dioxide we produce influences the climate and the atmosphere. I am not a scientist, but I believe in science and not in conspiracies. So there is a problem. In the 1970s, when scientists discovered the effect of CFCs on the ozone layer of the atmosphere, we ultimately “fixed” the issue. At that time, without social media, we still trusted scientists – read more about it here: The Ozone hole.

I believe mankind will be intelligent enough to “fix” the upcoming climate issues if we trust in science and act based on science. If we depend on politicians and lobbyists, we will see crazy measures that make no sense, for example, the concept of “biofuel.” We need to use our scientific brains to address sustainability for the future of our (single) earth.

Therefore, together with Rich McFall (the initiator), Oleg Shilovitsky, and Bjorn Fidjeland (PLM-peers), we launched the PLM Green Alliance, where we will focus on sharing ideas and discussions related to PLM and PLM-related technologies, to create a network of innovative companies and ideas. We are in the early stages of this initiative and are looking for ways to make it an active alliance. Insights, stories, and support are welcome. More to come this year (and decade).


Challenge 4: The Human brain

The biggest challenge for the upcoming decade will be the human brain. Even though we believe we are rational, it is mainly our primitive brain that drives our decisions. Thinking, Fast and Slow by Daniel Kahneman is a must-read in this area, or Predictably Irrational: The Hidden Forces That Shape Our Decisions. Note: these are “old” books from years ago; however, due to globalization and social connectivity, they have become more relevant than ever.

Our brain does not like to waste energy. If we see information that confirms our way of thinking, we do not look further. Social media like Facebook use their algorithms to help you “discover” even more information that you like. Social media do not care about facts; they care about clicks for advertisers. Of course, controversial headers or pictures get the most attention. Facts are no longer relevant, and we will probably see this phenomenon again this year in the US presidential elections.

The challenge for implementing PLM and acting against human-influenced climate change is that we have to use our “thinking slow” mode, combined with a general trust in science. I recommend reading Enlightenment Now by Steven Pinker, whom I respect for the many books of his I have read in the past. Enlightenment Now is perhaps a challenging book to complete; however, it illustrates that a lot of the pessimistic thinking of our time has no fundamental grounds. As a global society, we have made a lot of progress in the past century, and you would not want to go back to the past anymore.

Back to PLM.

PLM is not a “wonder tool/concept,” and its success mainly depends on a long-term vision, organizational change, culture, and only then the tools. It is not a surprise that it is hard for our brains to decide on a roadmap for PLM. In 2015 I wrote about the similarity between PLM and acting against climate change – read it here: PLM and Global Warming.

At the upcoming PI PLMx London conference, I will lead a Think Tank session related to getting PLM on the executive’s agenda. Getting PLM on an executive agenda is about connecting to the brain, not only about a hypothetical business case. Even at exec level, decisions are made by “gut feeling” – the way the human brain decides. See you in London, or more about this topic in a month.

Conclusion

The next decade will bring enormous challenges – more than in the past decades. These challenges are caused by our lifestyles AND the effects of digitization. Understanding and recognizing the biases caused by our brains is crucial. There is no black-and-white truth (single version of the truth) in our complex society.

I encourage you to keep the dialogue open and to avoid living in a silo.

Last week I shared the first impression from my favorite conference, the PLM Roadmap / PDT conference organized by CIMdata and Eurostep. You can read some of the highlights here: The weekend after PLM Roadmap / PDT 2019 Day 1.

Click on the logo to see the full agenda. In this post, I will focus on some of the highlights of day 2.

Chernobyl, The megaproject with the New Arch

Christophe Portenseigne from the Bouygues Construction Group shared with us his personal story about this megaproject, called Novarka. 33 years ago, in 1986, reactor #4 exploded, and within six months it was confined with an object shelter. This was done with heroic speed, and it was anticipated that the shelter would only last for 20 – 30 years. You can read about this project here.

The Novarka project was about creating a shelter for confinement of the radioactive dust and protection of the existing structure against external actions (wind, water, snow…) for the next 100 years!

Moreover, inside the arch there would be a plant where people could work safely on decommissioning the existing contaminated structures. You can read about the full project here on the Novarka website.

What impressed me the most were the personal stories of Christophe, taking us through some of the massive challenges that needed to be solved with innovative thinking. A project of high complexity, with a vast number of requirements and many parties and stakeholders involved, closed in June 2019. As Christophe mentioned, this was a project to be proud of, as it creates a kind of optimism that no matter how big the challenges are, with human ingenuity and effort we can solve them.

A Model Factory for the Efficient Development of High Performing Vehicles

Eric Landel, expert leader for Numerical Modeling and Simulation at Renault, gave us an interesting insight into an aspect of digitalization that has become very valuable: the connection between design and simulation to develop products – in this case, the Renault CLIO V – as much as possible in the virtual world. You need excellent simulation models to match future reality (and tests). The target of simulation was to get the highest safety test results in the Euro NCAP rating – 5 stars.

The Renault modeling factory implemented a digital loop (below) to ensure that at the end of the design/simulation cycle, a robust design would exist. Eric mentioned that for the Clio, they did not build a prototype anymore; the first physical tests were done on cars coming from the plant. Despite the investment in simulation software, there was a considerable saving in crash-part costs before TGA (Tooling Go Ahead).

Combined with the savings, the process has been much faster than before. From 10 weeks for a simulation loop towards 4 weeks. The next target is to reduce this time to 1 week. A real example of digitization and a connected model-based approach.

From virtual prototype to hybrid twin

ESI’s sponsor session, Evolving from Virtual Prototype Testing to Hybrid Twin: Challenges & Benefits, was an excellent complement to the presentation from Renault.

PLM, MBSE and Supply chain – challenges and opportunities

Nigel Shaw’s presentation was one of my favorites, as Nigel addressed the same topics that I have been discussing in the past years. His focus was on collaboration between the OEM and supplier, with the various aspects of requirements management, configuration management, simulation and the different speeds of PLM (focus on mechanical) and ALM (focus on software).

How can such activities work in a digitally connected environment instead of a document-based approach? Nigel looked into the various aspects of existing standards in their domains and their future. There is a direction towards MBE (Model-Based Everything), but there are still topics to consider. See below:

I agree with Nigel – the future is model-based; the question of when will be the issue for the market leaders.

The ISO AP239 ed3 Project and the Through Life Cycle Interoperability Challenge

Yves Baudier is from AFNET, a reference association in France regarding industry digitization, digital threads, and digital processes for the extended enterprise/supply chain – all about a digital future. Yves’ presentation was about the interoperability challenge, mentioning three of my favorite points to consider:

  • Data is becoming more and more a strategic asset – with the digitalization of industry and services, new services are enabled by data analytics
  • All engineering domains (from concept design to system end-of-life) need to develop a data-centric approach (not only model-centric) – an opportunity for PLM to cover the full lifecycle
  • Effectiveness and efficiency of data interoperability through the lifecycle is now an essential industry requirement – e.g., the “virtual product” and “digital twin” concepts

All the points are crucial for the domain of PLM.

In that context, Yves discussed the evolution of the ISO 10303-239 standard, also known as PLCS. The target with ISO AP239 ed3 is to become the standard for Aerospace and Defense for the full product lifecycle and, through this convergence, to be able to push IT/PLM vendors to comply – crucial for a digital enterprise.

Time for the construction / civil industry

Christophe Castaing, director of digital engineering at Egis, shared with us their solution framework to manage large infrastructure projects by focusing on both the asset information (BIM-based) and the collaborative processes between the stakeholders, all based on standards. It was a broad and in-depth presentation – too much to share in a blog post. To conclude (see also Christophe’s slide below): in the construction industry, there is more and more the desire to have a digital twin of a given asset (building/construction), creating the need for standard information models.

Pierre Benning, IT director at Bouygues Public Works, gave us an update on the MINnD project. MINnD stands for Modeling INteroperable INformation for sustainable INfrastructures in xD, a French research project dedicated to the deployment of BIM and digital engineering in the infrastructure sector. Where BIM started from the construction industry, there is a need for a similar digital modeling approach for civil infrastructure. In 2014, Christophe Castaing already reported on the activities of the MINnD project – see The weekend after PDT 2014. Now Pierre updated us on the activities for MINnD Season 2 – see below:

As you can see, again the interest in digital twins for operations and maintenance. Perhaps here the civil infrastructure industry will be faster than traditional industries because of the enormous value. BIM and GIS reconciliation is a relevant topic, as many civil infrastructures have a GIS aspect – road/rail infrastructure, for example. The third bullet is evident to me: with digitization and the integration of contractors and suppliers, BIM and PLM will become more and more conceptually alike. The big difference at this moment: BIM has one standard framework, whereas PLM-standards are still not in a consolidation stage.

Digital Transformation for PLM is not an evolution

If you have been following my blog in the past two years, you may have noticed that I am exploring ways to solve the transition from traditional, coordinated PLM processes towards future, connected PLM. In this session, I shared with the audience that digital transformation is disruptive for PLM and requires thinking in two modes.

Thinking in two modes is not what people like; however, organizations can run in two modes. I also shared some examples from digital transformation stories that illustrate there was no transformation – either failure or smoke and mirrors. You can download my presentation via SlideShare here.

Fireplace discussion: Bringing all the Trends Together, What’s next

We closed the day and the conference with a fireplace chat moderated by Dr. Ken Versprille from CIMdata, where we discussed, among other things, the increasing complexity of products and products as a service. We saw during the sessions from BAE Systems Maritime and Bouygues Construction Group that we can do complex projects; however, when there is competition and time-to-deliver pressure, we do not so much manage the project as try to contain the potential risk. It was an interactive fireplace chat, giving us enough thoughts for next year.

Conclusion

Nothing to add to Håkan Kårdén’s closing tweet – I hope to see you next year.


For me, the joint conference from CIMdata and Eurostep is always a conference to look forward to. The conference is not as massive as PLM-vendor conferences (slick presentations and happy faces); it is more a collection of PLM-practitioners (this time 100+) with the intent to discuss and share their understanding and challenges, independent from specific vendor capabilities or features. And because of its size, it is a great place to network with everyone.

Day 1 took more of a business/methodology view on PLM, and Day 2 went more in-depth, focusing on standards and BIM. In this post, the highlights from the first day.

The State of PLM


Peter Bilello, CIMdata’s president, kicked off with a review of the current state of the PLM industry. Peter mentioned that the PLM-market grew by 9.4 % to $47.8 billion (more than the expected 7 %). Good for the PLM vendors and implementers.

However, Peter also mentioned that despite higher spending, PLM is still considered a solution for engineering, often implemented as PDM/CAD data management. Traditional organizational structures – marketing, engineering, manufacturing, quality – were defined in the previous century and are still measured as such.

This traditional approach blocks the roll-out of PLM across these disciplines. Who is the owner of PLM, and where does the responsibility for a certain dataset lie, are questions to solve. PLM needs to transform to deliver end-to-end support instead of remaining an engineering silo. Are we still talking about PLM in the future? See Peter’s takeaways below:

We do not want to open the discussion on whether the name PLM should change – too many debates already – although, unfortunately, the name has created too much framing in the past too.

The Multi View BOM


Fred Feru from Airbus presented a status update on what the Aerospace & Defense PLM Action Group is working on: how to improve and standardize a PLM solution for multi-view BOM management – in particular, the interaction between the EBOM and MBOM. See below:


You might think this is a topic already solved when you speak with your PLM-vendor. However, all existing solutions at the participants’ implementations rely on customizations and vary per company. The target is to come up with common requirements that need to be addressed in a standard methodology. Initial alignment on terminology was already a first required step: before you standardize, you need to have a common dictionary – a typical situation in EVERY PLM implementation.


An initial version was shared with the PLM editors for feedback, with the aim, after iterations and agreement, of coming up with a solution that can be implemented without customization. If you are interested in the details, you can read the current status here, with Appendix A and Appendix B.
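
To illustrate in data terms what multi-view BOM management is about, here is a minimal sketch – my own simplification, not the action group’s data model: one set of part masters, with the EBOM and the MBOM as two different structures over (almost) the same parts:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class BOMView:
    """One structural view (EBOM or MBOM) over shared part masters."""
    name: str
    # parent part ID -> list of (child part ID, quantity)
    structure: Dict[str, List[Tuple[str, int]]] = field(default_factory=dict)

    def add(self, parent: str, child: str, qty: int = 1) -> None:
        self.structure.setdefault(parent, []).append((child, qty))

# The engineering view groups parts by function...
ebom = BOMView("EBOM")
ebom.add("O122", "O123")          # O123 is a hypothetical extra child
ebom.add("O122", "O124")

# ...while the manufacturing view regroups the same parts by assembly
# step and adds items engineering does not define (packaging, glue, ...).
mbom = BOMView("MBOM")
mbom.add("O122", "SUBASSY-1")
mbom.add("SUBASSY-1", "O123")
mbom.add("SUBASSY-1", "O124")
mbom.add("O122", "PACKAGING")
```

The standardization challenge lies exactly in the rules that keep these two views consistent when either one changes.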


Enabling the Circular Economy for Long Term Prosperity

Graham Aid gave a fascinating presentation related to the potential and flaws of creating a circular economy – although Graham was not a PLM-expert (at least until he left this conference), as he is the Strategy and Innovation Coordinator for the Ragn-Sells Group, which performs environmental services and recycling across Sweden, Norway, Denmark, and Estonia. Have a look at their website here.


Graham shared with us the fact that despite the logical arguments for a circular economy – it is more profitable in the end – our short-term thinking and biases block us from doing the right things for future generations.

Look at the missing link for a closed resource-lifecycle view below.

Graham shared weird examples where materials that will be scarce in the future are currently getting cheaper, and therefore there is no desire to recycle them. A sound barrier made of rubble could contain more copper than the copper ore in a mine.

In the PLM-domain, there is also an opportunity for supporting and working on more sustainable products and services. It is a mindset and can be a profitable business model. At the PDT 2014 conference, there was a session on circular product development, with Xerox as the best example. Circular product development, but also Product as a Service, can be activities that contribute to a more sustainable world. Graham’s presentation was inspiring for our PLM community and hopefully planted a few seeds for the future. As it is all about thinking long-term.


With the PLM Green Alliance, I hope we will be able to create a larger audience and participation for a sustainable future. More about the PLM Green Alliance next week.


The Fundamental Role of PLM in Data-driven Product Portfolio Management


Hannu Hannila (Polar) presented his study related to data-driven product portfolio management and why it should be connected to PLM. For many companies, it is a challenge to understand which products are performing well and where to invest. These choices are often supported by “Data Damagement,” as Hannu called it.

An example below:

The result of this fragmented approach is that organizations make their decisions on subjective data and emotions. Where the assumption is that 20 % of the products a company sells relate to 80 % of the revenue, Hannu found in his research companies where only 10 % of the products contributed to the revenue. PPM (Product Portfolio Management) is often based on big emotions – a who-shouts-the-loudest mentality, influenced by the company’s pet products and by the HiPPO (Highest Paid Person in the Office). So how do you get a better rationale?


Hannu explained a data-driven framework that would provide the right analytics at management level, depending on overall data governance across all disciplines and systems. See below:

I liked Hannu’s conclusions, as they align with my findings:

  • To be data-driven, you need Master Data Management and Data Governance
  • Product Portfolio Management is the driving discipline for PLM, and in a modern digital enterprise, it should be connected.

Sponsor sessions

Sponsors are always needed to keep a conference affordable for the attendees. The sponsor sessions on day 1 were of good quality. Here is a quick overview, with a link if you want to investigate further:

  • Configit – explaining the value of a configurator that connects marketing, technical and sales, introducing CLM (Configuration Lifecycle Management) – a new TLA
  • Aras – explaining their view on what we consider the digital thread
  • Variantum – explaining their CPQ solution as part of a larger suite of cloud offerings
  • Quick Release – bringing common sense to PLM implementations, similar to what I am doing as PLM coach – focusing on the flow of information
  • SAP – explaining the change in focus when a company moves toward a product-as-a-service model
  • SharePLM – a unique company addressing the importance of PLM training delivered through eLearning

Conclusion

The first day was an easy-to-digest conference with good-quality presentations. I only shared 50 % of the sessions, as we have already reached 1000+ words. In the evening, I enjoyed the joint dinner, being able to network and discuss in depth with participants, and finished with a social networking event organized by SharePLM. Next week, part 2.

Three weeks ago, I closed my PLM-twisted mind for a short holiday. Meanwhile, some interesting posts appeared about the PLM journey.

  • Is it a journey?
  • Should the journey be measurable?
  • And what kind of journey could you imagine?

Together, these posts formed the basis for a decent discussion amongst the readers. I like these discussions. For me, the purpose of blogging is not the same as tweeting. It is not about just making noise so others will chime in or react (tweeting); it is about sharing an opinion, and if more people are interested, the discussion can start. And a discussion is not about right or wrong, as many conversations nowadays happen to be; it is about learning.

Let’s start with the relevant posts.

How to measure PLM?

The initial discussion started with Oleg Shilovitsky’s post about the need to measure the value of PLM. As Oleg mentions in his comments:

“During the last decades, I learned that every company that measured what they do was winning the business and succeeded (let’s count Google, Amazon, etc ..)”

This is an interesting statement: just measure! The motto people use for digital businesses, in particular the fast-moving software business. Sounds great, so let’s measure PLM. What can we measure with PLM? Oleg suggests as an example:

“Let’s say before PLM implemented a specific process, sales needed 2 days to get a quote. After PLM process implementation, it is 15 min.”
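
Before commenting, let’s make the arithmetic behind this example explicit – a back-of-the-envelope calculation, assuming an 8-hour working day (my assumption, not part of Oleg’s example):

```python
minutes_before = 2 * 8 * 60    # 2 working days of 8 hours = 960 minutes
minutes_after = 15             # quote time after the PLM implementation
print(minutes_before / minutes_after)   # 64.0 -> "64 times more quotes"
```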

So what does this result tell us? Your sales can produce 64 times more quotes. Do we need fewer salespeople now? We do not know from this KPI what the real value for the company is, because there are so many other dependencies related to this process – and that makes PLM different from, for example, ERP. We are not talking about optimizing a single process, as Oleg might suggest below:

“Some of my PLM friends like to say – PLM is a journey and not some kind of software. Well, I’m not sure to agree about “journey,” but I can take PLM as a process. A process, which includes all stages of product development, manufacturing, support, and maintenance.”

Note: I do not want to be picky on Oleg, as he provokes us all many times with his thoughts. Moreover, several of them are good points for discussion. So please dive into his LinkedIn posts and follow the conversations.

In Oleg's follow-up post on measuring the value, he continued with Can we measure the PLM-journey, which summarizes the comments from the previous post with a kind of awkward conclusion:

What is my conclusion? It is a time for PLM get out of old fashion guessing and strategizing and move into digital form of thinking – calculating everything. Modern digital businesses are strongly focused on the calculation and measurement of everything. Performance of websites, metrics of application usage, user experience, efficiency, AB testing of everything. Measurement of PLM related activity sounds like no brainier decision to me. Just my thoughts…

I think all of us agree that there needs to be some kind of indicative measurement in place to justify investments. There must be expected benefits that solve current business problems or bottlenecks.
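To make the arithmetic in Oleg's example concrete, here is a minimal Python sketch of the quote-turnaround KPI. The 8-hour working day and the helper function are my own illustrative assumptions; only the 2-day/15-minute figures come from the quoted example. The point is how easy the number is to compute, and how little it says by itself:

```python
# Purely illustrative sketch of the quote-turnaround KPI discussed above.
# The 2-day/15-minute figures come from the quoted example; the 8-hour
# working day and the helper function are assumptions for illustration.

WORKING_MINUTES_PER_DAY = 8 * 60  # assumed 8-hour working day

def speedup(before_minutes: float, after_minutes: float) -> float:
    """How many times faster the process runs after the change."""
    return before_minutes / after_minutes

before = 2 * WORKING_MINUTES_PER_DAY   # 2 working days = 960 minutes
after = 15                             # 15 minutes per quote

print(f"Quote turnaround improved {speedup(before, after):.0f}x")
# -> Quote turnaround improved 64x
# The ratio measures process speed only: it says nothing about quote
# quality, win rate, or the organizational change behind the number.
```

Computing the 64x is trivial; the points below are about everything the number does not capture.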

My points that I want to share with you are:

  • It is hard to measure non-comparable ways of working – how do you measure collaboration?
  • Do you know what to measure? – engineering/innovation is not an ERP process
  • People and culture have so much impact on the results – how do you measure your company’s capability to adapt to new ways of working?

Meanwhile, we continue our journey…

Is PLM a never-ending journey?

In the context of the discussion related to the PLM journey, I assume, Chad Jackson from Lifecycle Insights added his three minutes of thoughts. You can watch the video here:

Vlogging seems to be becoming more prevalent in the US. The issue for me is that vlogs only touch the surface and are hard to scan for interesting, reusable content – something you miss when you are an experienced speed-reader. I prefer written content, as it is easier to pick and share relevant pieces, as I am doing now in this post.

Chad states that as long as PLM delivers quantified value, PLM could keep expanding. This sounds like a journey, and I can align here. The only thought I would like to add is that it is not necessarily about expanding all the time; it is also about continuous change in the world and therefore in your organization. So instead of expanding, there might be a need to do things differently: Have you noticed PLM is changing?

Next, Chad mentions organizational fatigue. I understand the point – our society and businesses are currently changing extremely fast, which makes people long for the past. A typical behavior I observe everywhere: in the past, everything was better. However, if companies went back and operated as in the past, they would be out of business. We moved from the paper drawing board to 3D CAD, managing it through PDM and PLM, to remain significant. So there is always a journey.

Fatigue comes from choosing the wrong directions and having a reactive culture – instead of people being inspired and motivated to reach the next stage, the current stage already causes so much stress. Due to the reactive culture, people cannot imagine a better future – they are too busy. I believe it is culture and inspiration that make companies successful – not just measuring. To see the danger of avoiding change, think about the boiling frog metaphor, and you will understand what I mean.

Upgrading to PLM when PDM falls short

At the same time, Jim Brown from Tech-Clarity published a PTC-sponsored eBook, Upgrading to PLM when PDM falls short, in which he states:

This eBook explains how to recognize that you’ve outgrown PDM and offers several options to find the data and process management capabilities your company needs, whether it’s time to find a more capable PDM or upgrade to PLM. It also provides practical advice on what to look for in a PLM solution, to ensure a successful implementation, and in a software partner.

Jim mentions various business drivers that can drive this upgrade path. I challenge all the believers in measurable digital results to imagine which KPIs they would use and how these can be related to pure PLM.

Here the upgrade process aims at replacing PDM with PLM, something PLM vendors like: immediately a significant number of licenses for the same basic PDM functionality – hard to justify for your company, as there is no additional value yet.

In many situations, I have seen this type of PDM upgrade project become an advanced PDM project – not PLM. The new PLM system was introduced in the engineering department and became an even bigger silo than before, as other disciplines/departments were not willing to work with this new "monster" and preferred their own systems. Such companies believe that PLM is a system to be purchased and implemented, which is killing for a real PLM strategy.

Therefore I liked Oleg Shilovitsky's post: 3 Reasons for Not Growing Existing PDM Into the Full PLM System. Where Oleg's points were probably more technology-driven, the value of this post was extended in the discussion. It became a discussion with various people and different opinions – the kind I would like to have in real-time. The way LinkedIn filters/prioritizes comments makes it hard to get a chronological view of the discussion.

Still, if you are interested and have time for a puzzle, follow this discussion and add your thoughts.

Conclusion

During my holidays, there was a vivid discussion related to the PLM value and journey. Looking back, it is clear we are all part of a PLM journey. Some do not take part in the journey and keep hanging on to the past; those who understand the journey all see different Points of Interest – the characteristic of a journey.

After my previous post about the PLM migration dilemma, I had several discussions with peers in the field about why this PLM bad news creates so much debate. For every PLM vendor, I could publish a failure story if I wanted. However, the reality is that the majority of PLM implementations do not fail.

Yes, they can cause discomfort or friction in an organization, as implementing the tools often forces people to work differently. And often, working differently is not anticipated by (middle) management and therefore causes a mismatch in the people, process & tools paradigm.

We love bad news in real life. We talk about terrorism while, meanwhile, far more people die from guns, cars, and even the biggest killer, mosquitos. Fear stories sell better than success stories, and in particular in the world of PLM vendors, every failure of the competition is enlarged. However, there are more actors involved in a PLM implementation, and if PLM systems were that bad, they would not exist anymore and be replaced by ………?

Who to blame – the vendor?

Of course, the easiest way is to blame the vendor, as their marketing promises to solve all problems. However, when you look from a distance at the traditional PLM vendor community, you see they are in a rat race to deliver the latest and greatest technology ahead of their competition, often driven by some significant customers.

Their customers buy the vision and expect it to be ready and industrialized, which is not the case – look at the digital twin hype or AI (Artificial Intelligence). Released PLM software is not at the same maturity level as office applications. Office applications do not innovate as much, have thousands of users during a beta cycle, and have no dependency on processes.

Most PLM vendors are happy when a few customers jump on their latest release, combined with the fact that implementing the most recent version is not yet a push of a button. This might change in the long term if PLM vendors can deliver cloud-based solutions.

PLM implementations within the same industry might look the same but often vary a lot due to existing practices, which will not change because of the tool – so there is a need for customization or configuration.

PLM systems with strong business rules inside their core might develop more and more towards configuration, whereas PLM toolkit-like systems might focus on ease of customization. Both approaches have their pros and cons (a topic for another blog post, perhaps).

Another reason to blame the vendor is a lack of openness. You hear it in many discussions: if vendor X were open, they would not lock the data – a typical marketing slogan. But if PLM vendors were completely open, to which standards should they adhere? Every PLM vendor has its preferred collection of tools – if you stay within their portfolio, you have a minimum of compatibility or interface issues.

This logic started already with SAP in the previous century. For PLM vendors, there is no business model for openness. For example, the SmarTeam APIs for connecting and extracting data were available free of charge, leading to no revenue for the vendor and significant revenue for service providers: without any license costs, they could build any type of interface/solution. In the end, when a PLM vendor has no sustainable revenue, the vendor will disappear, as we saw between 2000 and 2010, when several stand-alone PLM systems disappeared.

So yes, we can blame PLM vendors for setting impossible expectations – coming to realistic expectations related to capabilities and openness is probably the biggest challenge.

Who to blame – the implementer?

The second partner in a PLM implementation is the implementation partner, often a specialized company related to the PLM vendor. There are two types of implementation partners – the strategic partners and the system integrators.

Let’s see where we can blame them.

Strategic partners, the consultancy firms, often have a good relationship with the management; they help the company shape its future strategy, including PLM. You can blame this type of company for their lack of connection to the actual business: what is the impact on the organization of implementing a specific strategy, and what does this mean for current or future PLM?

Strategic partners should be the ones supporting business change management, as they are likely to have experience with other companies. Unfortunately, these companies often do not have significant PLM skills, as the PLM domain is just a small subset of the whole business strategy.

You can blame them for being useful in building a vision/strategy but failing to create a consistent connection to the field.

Implementation partners, the system integrators, are most of the time specialized in one or two PLM vendors' software suites – although the smaller the implementation partner, the narrower their implementation skills. These implementation partners sometimes have built their own PLM best practices for a specific vendor and use them as a sales argument. Others just follow blindly what the vendor promotes or what the customer asks for.

They will do anything you request, as long as they get paid for it. The larger ones have loads of resources for offshore delivery – the challenge here is that it might look cheap; however, it becomes expensive if there is no apparent convergence of the deliverables.

As I mentioned before, they will never say no to a customer and claim to fill all the "gaps" there are in the PLM environment.

You can blame implementation partners for focusing on making money from services. And they are right: to remain in business, a company needs to be profitable. It is like lawyers: they invoice you based on their efforts. And the less you take on your own plate, the more they will do for you.

The challenge for both consultancy partners and system integrators is to find a balance between experienced people, who really make it happen, and educating juniors to become experts too. Often the customer pays for the education of these juniors.

Who to blame – your company?

If your company is implementing PLM, then probably the perception is that you made all the effort to make it successful. You followed the advice of the strategic consultants, you selected the best PLM vendor and system integrator, and you created a budget – so what could go wrong?

This all depends on your company’s ambition and scope for PLM.

Implementing the as-is processes

If your PLM implementation is just there to automate existing practices and store data in a central location, this might work out – and this is where PLM implementations are most often successful. You know what to expect, and your system integrator knows what to expect.

This type of project can run close to budget, and some system integrators might be tempted to offer a fixed price. I am not a fan of fixed-price projects, as you never know exactly what needs to be done. The system integrator might raise the target price by 20–40% to cover their risk, or you as a company might select the cheapest bid – another guarantee for failure. A PLM implementation is not a one-time project; it is an ongoing journey. Therefore your choice needs to be sustainable.

My experience with this type of implementation is that it is easy to blame the companies here too. Often the implementation becomes an IT project, as business people are too busy running their day-to-day jobs and therefore only incidentally support the PLM project. The result is that at a certain moment, the users confronted with the system do not feel connected to it – it was better in the past. In particular, configuration management and change processes can become so watertight that they leave no freedom for the users. Then the blaming starts – first the software, then the implementer.

But what if you have an ambitious PLM project as part of a business transformation?

In that case, the PLM platform is just one of the elements to consider. It will be the enabler for new ways of working: customer-centric processes, multi-discipline collaboration, and more – all related to a digital transformation of the enterprise. This is why I say PLM platform instead of PLM system. Future enterprises run on data through connected platforms. The better you can connect your disciplines, the more efficiently and faster your company will operate. This is the opposite of the coordinated approach, which I have addressed several times in the past.

A business transformation requires an end-to-end understanding of what to change – from the management vision down to the execution in the field. And as there is no out-of-the-box template for business transformation, it is crucial that a company experiments, evaluates, and, when successful, scales up new habits.

Therefore, it is hard to define upfront all the effort for the PLM platform and the implementation resources. What is sure is that your company is responsible for that, not an external party. So if it fails, your company is to blame.

Is everyone to blame?

You might have the feeling that everyone is to blame when a PLM implementation fails. I believe that is indeed the case. If you know in advance where all players have their strengths and weaknesses, a PLM implementation should not fail but be balanced with the right resources. Whatever the scope of your PLM implementation – a consolidation or a transformation – you should take care that all stakeholders participate in the anti-blame game.

The anti-blame game is an exercise where you make sure that the other parties in the game cannot blame you.

  • If you are a vendor – do not overcommit
  • If you are a consultant or system integrator – learn to say NO
  • If you are the customer – make sure enough resources are assigned – you own the project. It is your project/transformation.

This has been my job several times in the past, when I was asked to mediate in a stalling PLM implementation. Most of the time, it had already become a blame game, missing the target of finding a solution that makes sense. Here, coaching from experienced PLM consultants makes sense.

Conclusion

Most of the time, PLM implementations are successful if the scope is well understood and not transformative. You will not hear a lot about these projects in the news, as we like bad news.

To avoid bad news, challenging PLM implementations should make sure that all parties involved challenge each other to remain realistic and to invest enough. The role of an experienced external coach can help here.

Unfortunately, one more time an old post, with some new comments in green, as I am not yet able to type at regular speed. I promise this will be the last reprise, as I am sure that one week from now I will be double-handed again. The reason I chose this six-year-old post is that the topic is still current; however, at that time, digital transformation was not yet in fashion for PLM.

If you look at the comments to the article at that time (Feb 2013), you will see some well-known names and behaviors. What I can state for the moment: there are still people doubting there is a need for PLM, there are still people blaming technology for the lousy perception of PLM, and there is a large group of silent companies out there that have implemented the basics of PLM – perhaps not as advanced as vendors/consultants have suggested – and they are reaping the benefits.

The main question for upcoming blog posts: "Is this enough?" Happy rereading!

How come PLM is boring? – Feb 2013

PLM is a popular discussion topic in various blogs, LinkedIn discussion groups, PLM vendor websites, and at the upcoming Product Innovation Congress in Berlin. I look forward to the event, to meet attendees and discuss their experiences and struggles in improving their businesses using PLM. (Meanwhile, PI PLMx London has passed – for a review, look here: The weekend after PI PLMx London 2019)

From the other side, talking about pure PLM becomes boring. Sometimes it looks like PLM is a monotheistic topic:

  • “What is the right definition of PLM ?” (I will give you the right one)
  • “We are the leading PLM vendor” (and they all are)
  • A PLM system should be using technology XYZ (etc., etc.)
  • Digital Transformation and IoT have come into the picture now

Some meetings with customers in the past three weeks and two different blog posts I read recently made me aware of this ambiguity between boring and fun.

PLM dictating Business is boring

Oleg Shilovitsky's sequence of posts (and comments) starting with A single bill of materials in 6 steps was an example of the boring part. (Sorry Oleg – as you publish so many posts, there are many that I like and some I can use as an example.) When reading the BOM-related posts, I noticed they are a typical example of an IT or academic view on PLM, in particular on the BOM topic.

Will these posts help you after reading them? Do they apply to your business? Or do you feel more confused, as a prolific PLM blogger makes you aware of all the different options and makes you think you should use a single bill of materials?

From my customers, and from coaching and mediating hundreds of PLM implementations, I learned that the single-BOM discussion is one of the most confusing and complicated topics – for sure when you address it from the IT perspective.

The customer might say:
“Our BOM is already in ERP – so if it is a single BOM, you know where it is – goodbye!”

A different approach is to start looking for the optimal process for this customer, addressing the bottlenecks and pains they currently face. It will be no surprise that PLM best practices and technology are often the building blocks for the considered solution. Whether it will be a single BOM or a collection of structures evolving through time depends on the situation, not on the ultimate PLM system.

Note: meanwhile, Oleg has further materialized his thinking through OpenBOM, and he has not lost his speed of publishing.

Business dictating PLM is fun

Therefore I was happy to read Stephen Porter's opinion and comments in The PLM state: Penny-wise Pound Foolish Pricing and PLM (unfortunately, this post has disappeared), where he conveys a message similar to mine from a different starting point: the pricing models of PLM vendors. My favorite part is in his conclusion:

A PLM decision is typically a long term choice so make sure the vendor and partners have the staying power to grow with your company. Also make sure you are identifying the value drivers that are necessary for your company’s success and do not allow yourself to be swayed by the trendy short term technology

Management in companies can become confused and start to think they just need PLM because they hear from analysts that it improves business. They first need to think about solving their business challenges and changing the way they currently work to improve. Only then should they look for the way to implement this change.

Note: Stephen wrote at that time an interesting series of posts and promised a revival. However, I haven't seen new posts. Did any of my readers see new material that I missed?

Changing the way of working is the problem, not PLM.

It is not the friendly user interface of PLM system XYZ or the advanced technical capabilities of PLM system ABC that will make a PLM implementation easier. Nothing is solved in the cloud or by using a mobile device. If nothing changes when implementing PLM, why implement it and build a system to lock yourself in even more?

This is what Thomas Schmidt (VP Head of Operational Excellence and IS at ABB's Power Products Division) said last year at PLM Innovation 2012 in Munich. He was one of the keynote speakers and surprised the audience by stating he did not need PLM!

He explained this by describing the business challenges ABB has to solve: being a global company while acting around the world as a local company. He needed product simplification, part reduction across product lines around the world, compliance, and more.

Note: Thomas Schmidt has meanwhile moved forward in his career, identifying himself as an experienced "Change Leader", digital transformation mentor and coach.

Another customer, in a whole different industry, mentioned they were looking to improve global, instant collaboration, as their current information exchange is too slow and error-prone. Besides, they want to capitalize on the work done and make it accessible and reusable in the future, independent of the authoring tool. However, they do not call it PLM, as in their business nobody uses PLM!

Both cases should make a PLM reseller's mouth water (watertanden in Dutch), as these companies are looking for critical capabilities available in most PLM systems. However, neither of these companies asked for a single BOM or a service-oriented architecture. They wanted to solve their business issues. Moreover, for sure, it will lead to implementing PLM capabilities when business and IT people together define and decide on the right balance.

Unfortunately, here we still see a function-feature approach – if it is not there, we will build it.

Management take responsibility

Combining PLM and new business needs is the responsibility of the management in these companies. It is crucial that a business issue (or a new strategy) is the driving force for a PLM implementation. PLM is not about automating what we already have.

In too many situations, the management decides that a new strategy is required. One or more bright business leaders decide they need PLM (note: the strategy has now changed into buying and implementing a system). Together with IT, and after an extensive selection process, the selected PLM system (disconnected from the strategy) is implemented.

I believe we read something about such a case recently.

Moreover, this is the place where all PLM discussions come together:

  • why PLM projects are difficult
  • why it is unclear what PLM does.

PLM vendors and implementers are no longer connected to the strategy or business at this stage. They implement technology and do what the customer's project team tells them to do (or what they think is best for their business model).

Successful implementations are those where the business and management are actively involved during the whole process and the change. Involvement requires a significant contribution from their side, which is often delegated to business and change consultants.

PLM implementations usually lead to a crisis at some moment in time when the business is not leading and the focus is on IT and user acceptance. In the optimal situation, business drives IT. However, in most cases, due to lack of time and priorities on the business side, people delegate this activity to IT and the implementation team. So here it is a matter of luck whether they will be successful: how experienced is the team?

Will they implement a new business strategy or just automate and implement the way the customer worked before, but now in a digital manner? Do we blame the software when people do not change?

Some notes here: I believe the disconnect between management/PLM vendors on the one side and people in the business on the other has meanwhile become more prominent, due to the digital transformation hype. The hype is moving faster than the organization. Second point: I will no longer talk about people change – organizations can change; people can adapt within a specific range. It is up to the organization where to push the limits.

Back to fun

I would not be so passionate about PLM if it were boring. However, looking back, the fun and enthusiasm do not come from PLM. The fun comes from a proactive business approach, knowing that motivating the people and preparing the change come first, before implementing PLM practices.

I believe the future success of PLM technologies will come when we learn to speak about and address real business value, and only then use (PLM) technologies to deliver it.

PLM then becomes a logical result, not the start. And don't underestimate it: change is required. What do you think – is it a dream?


Due to some physical inconvenience in the upcoming weeks, I will not be able to write a full blog post at this time. Typing with one finger is not productive.
A video post could be an alternative; however, for me, the disadvantage of a video message is that it requires the audience to follow all the information at a fixed speed – no fast or selective reading possible – and it is hard to archive and store in the context of other information. Putting pieces of information in a relevant context is a PLM mission.

So this time, my post from December 2008, where I predicted the future for 2050. I think the predictions were not too bad – you will recognize some trends and challenges still ahead. Some newer comments are in italic green. I am curious to learn what you think after reading this post. Enjoy, and I am looking forward to your feedback.

PLM in 2050

As the year ends (December 2008), I decided to take my crystal ball to see what would happen with PLM in the future.

It felt like a virtual experience and this is what I saw:

  • Data is not replicated anymore – every piece of information that exists has a Universally Unique ID; some people might call it the UUID. In 2020 this initiative became mature, thanks to the merger of some big PLM and ERP vendors, who brought it to reality. This initiative reduced the exchange costs in supply chains dramatically and led to bankruptcy for many companies providing translators and exchange software. (still the dream of a digital enterprise – a small sketch of the idea follows this list)
  • Companies store their data in "the cloud", based on the previous concept. Only some old-fashioned companies still have their own data storage and exchange issues, as they are afraid someone will touch their data. Analysts compare this behavior with the situation in the year 1950, when people kept their money under a mattress, not trusting banks (and they were not always wrong). (we are getting there – still some years to go)
  • After 3D, an entire virtual world, based on holography, became the next step for product development and the understanding of products. Thanks to the revolutionary quantum-3D technology, this concept could even be applied to life sciences. Before ordering a product, customers could first experience and describe their needs in a virtual environment (to be replaced by virtual twin / VR / AR)
  • Finally, the cumbersome keyboard and mouse were replaced by voice and eye recognition. Initially, voice recognition (Siri, Alexa, please come to the PLM domain – http://www.youtube.com/watch?v=2Y_Jp6PxsSQ) and eye tracking (some time to go still) were cumbersome. Information was captured by talking to the system and capturing eye movement when analyzing holograms. This made the life of engineers so much easier, as while analyzing and talking, their knowledge was stored and tagged for reuse. No need for designers to send old-fashioned emails or type their design decisions for future reuse (now moving towards AI)

  • Due to the hologram technology, the world became greener. People did not need to travel around the world, and virtual meetings with global teams became the standard (airlines discontinued business class). Even holidays could be experienced in the virtual world, thanks to a Dutch initiative based on the experience with coffee (not sure why I selected this movie. Sorry ….): http://www.youtube.com/watch?v=HUqWaOi8lYQ
    The whole IT infrastructure was powered by efficient solar energy, reducing the amount of carbon dioxide dramatically
  • Then, with a shock, I noticed PLM no longer existed. Companies were focusing on their core business processes. Systems/terms like PLM, ERP, and CRM no longer existed. Some older people still remembered the battle between these systems to own the data, and the political discomfort this gave inside companies (so true …)
  • As people were working so efficiently, there was no need to work all week. There were community time slots, when everyone was active, but 50 percent of the time people had time to recreate (to re-create or recreate, that was the question). Some older French and German designers remembered the days when they had only 10 weeks of holiday per year – unimaginable nowadays. (the dream remains)
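A present-day note on the first bullet: technically, the UUID part of that dream is trivial today – the missing piece is getting all systems to agree on using it. Here is a minimal Python sketch; the part record and its fields are hypothetical, purely for illustration:

```python
import uuid

# Sketch of the first prediction: every piece of information gets a
# Universally Unique IDentifier, so it never has to be re-keyed or
# replicated between PLM, ERP, and supplier systems.
# The record fields below are hypothetical, for illustration only.

part = {
    "uuid": str(uuid.uuid4()),       # globally unique, system-independent
    "internal_number": "P-000123",   # meaningless internal part number
    "description": "Mounting bracket",
    "revision": "B",
}

print(part["uuid"])  # e.g. '9f1c2e4a-...' - the same ID in every connected system
```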

As we still have more than 40 years to reach this future, I wish you all a successful and excellent 2009.

I am looking forward to being part of the green future next year.
