
People, wherever you are, we are in a kind of lockdown, some countries more restricted than others. Still, the challenge for most of us will be how to survive two, perhaps three, months of being locked in our homes and make the best of it. As I am not a virus expert, I will not give you any recommendations on that topic. As a PLM geek, I want to share with you the opportunities I see for the upcoming months.

A crisis is an opportunity

Most of us can count ourselves lucky that we are not in the same situation as twenty years ago. At that time, internet connectivity was expensive and slow, so working from home would have meant isolation from the rest of the world. The positive point now is that we can be connected virtually without travel and without face-to-face meetings – and we are pushed to do so. This external push is an interesting point for me.

The traditional attitude in my PLM engagements was that face-to-face meetings are crucial for creating a human connection and trust. Now I ask myself: is this a behavior of the past that should become obsolete in the future? We probably cannot afford this approach any longer if we take sustainability and the environment into consideration. We now live in a globally connected world, but should we still act in the old way?

Perhaps not. Let's look at some examples that show it is time to shift behaviors.

In the Western world, we might think we know it all due to our dominance over the past hundred years. However, when you study history, you will see civilizations come to power and, after hundreds of years, lose it again because they destroy themselves internally. Apparently a typical human trait that will not disappear – still interesting to analyze when considering a globally connected world. Where is the center of gravity today?

Interestingly, the ancient Chinese already knew that a crisis is an opportunity, as I am told. The Chinese characters for crisis mean danger and opportunity, respectively, according to Wikipedia – see the image above. Joe Barkai was one of the first in my network to explain that instead of focusing on the loss of what is happening now, we should take the opportunity to be better prepared for the future. You can read his post here: The Coronavirus and your company's brand. These kinds of messages are popping up more frequently now. Let's stay safe while thinking about and preparing for the future.

Now a PLM related example.

Remember what the FFF is happening?

Two to three weeks ago, we had a vivid discussion in our PLM and CM community based on the famous FFF mnemonic. What the FFF is happening was a post sharing my point of view, and there were a lot of reactions from different people.

The purpose of my post was to explain that the whole discussion is based on the paradigm that drawings define the part. Because of that, we have a methodology to decide whether YES or NO we need a new part number or a revision. To me, this practice should no longer be a discussion.

A part has a unique identifier, and a document has a unique identifier. In PLM-systems, information is managed through relations, no longer through identifiers – who knows the exact unique identifier anyway? In a PLM-system, information is connected, and the attributes of the part and the document tell you the details of the type of information. "Intelligent or meaningful" identifiers are no longer relevant in such an environment. Think about that…

In the comments of my post, Jesse Leal was confirming this statement:

This in contrast to Joe Brouwer who, as you might have noticed, keeps broadcasting his opinion that the good old days of the draftsman are gone, that Boeing made a tremendous mistake and that PLM is fake – all combined with hyperlinks to his products and opinions. The comment below says it all:

Two points to observe in this response:

Hey, Bob, send me the new digital identifier”.

This statement assumes that if a person needs to retrieve information from someone else, they need to contact this person (Bob).

Bob then needs to drop his current work, answer the request and send the latest version of the drawing? This is old school. In a PLM-system, information should be connected, and if Bob has released his latest drawing (no matter whether it is FFF-compatible), any user can find the latest approved version, not even having to look at the identifier (which could be meaningless) but by following the relations between products, parts, and documents.

This is PLM!

One of the benefits: Bob does not get disturbed during the day by these kinds of questions and can focus on his critical work as an expert.

Second, if you need to sit with a designer to understand PLM, then you are probably talking with the wrong person. Designers work in the context of PDM. When we speak about PLM, we are talking about a broader scope beyond engineering and design.

This is a common mistake in a lot of marketing stories. Companies that focus only on the design space, with some EBOM-integrations with CAD-systems, are most of the time focusing on PDM. When Agile PLM (later Oracle E9) and later Aras came out without CAD-integrations, these companies were focusing on the flow of information inside the company, not necessarily driven by CAD. Of course, the traditional PLM companies combine CAD integration with other capabilities. Dassault Systèmes, Siemens, and PTC all have a strong relationship with their native CAD-systems. However, their offerings go way beyond CAD-integrations, e.g., end-to-end governance, change processes and an item-centric backbone.

The diagram above explains the basics for the future. In the push-mode, the person in the middle has the responsibility to distribute information and ensure it remains accurate for all stakeholders. This makes this person crucial (good job security) but extremely inefficient compared to people working in the pull-mode, where everyone is responsible for getting the accurate data themselves. It may be clear that the pull-mode is the model of a digital enterprise.
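To make the pull-mode concrete, below is a minimal sketch in Python (hypothetical class and attribute names, not any specific PLM-system's API) of how any user could retrieve the latest approved drawing for a part by following relations, instead of asking Bob.

```python
# Minimal, hypothetical data model: parts and documents are separate objects,
# connected by relations; their identifiers can be meaningless.
from dataclasses import dataclass, field

@dataclass
class Document:
    doc_id: str      # meaningless unique identifier
    version: int
    status: str      # e.g. "In Work" or "Released"

@dataclass
class Part:
    part_id: str                                    # meaningless unique identifier
    documents: list = field(default_factory=list)   # relations to the defining documents

def latest_released_document(part):
    """Pull-mode: follow the part-to-document relations and pick the
    latest released version - no need to contact the author."""
    released = [d for d in part.documents if d.status == "Released"]
    return max(released, key=lambda d: d.version, default=None)

# Example usage
bracket = Part("P-000124")
bracket.documents += [
    Document("D-778", version=1, status="Released"),
    Document("D-778", version=2, status="Released"),
    Document("D-912", version=1, status="In Work"),
]
print(latest_released_document(bracket))   # -> the released document D-778, version 2
```

The point is not the code itself but the principle: the relations carry the knowledge, so the consumer of the information pulls it whenever needed.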

So if you have the time now, use it to rethink how ready your company is for a digital future. Companies that currently rely on Bob are in trouble, as Bob is currently sitting at home. Companies that have learned to shift from the push-mode to the pull-mode can continue working as planned, as they do not need Bob. And don't worry about your job: if you are in Bob's position, you will lose that role over time anyway. However, when you keep on evolving, learning and adding value to your company, you will always be needed – don't lock yourself in.

If you want to be inspired more in this area, read Jan Bosch's post: This is not the end. Here, Jan mentions the opportunity to move to digital practices (and more) – get out of our traditional patterns.

 

What can you do?

Even though COVID-19 has, and will have, a dramatic impact on our society, this is also the moment to rewire some of our processes, because in the running business there was never time to think and act. It reminds me of the financial crisis in 2008, when the market for PLM vendors was terrible: no significant sales, as companies could not invest.

However, for me, 2008 was an extremely busy year, thanks to all kinds of government regulations. There was time and budget to support employees in raising their skills, and PLM was one of these domains. That year I conducted many workshops. It was also the year that I started my blog virtualdutchman.com.

Now we are in a similar situation, probably worse, as we are locked in our homes. However, we are also better connected. Imagine this situation without the internet; now we can learn even better.

So let's benefit from this connectivity and use the lockdown time to learn, think, and discuss with peers. Challenge and involve the management of your company on how they see and lead towards the future.


In that context, I am happy to spend on average one day per week on free conference calls if you need clarification or support for your PLM-related ideas.

Contact me through a personal message on LinkedIn, and we will find a way to connect.

 

Conclusion

This decade will be decisive for many of us. At the beginning of this year, I wrote PLM 2020 – The next decade (4 challenges). With my narrow PLM-mind, I overlooked viruses. Bill Gates did not, as you can see from his 2015 TED talk: The next outbreak? We're not ready. Bill also explains that our traditional thinking patterns should change in a globally connected world.

I wish you all the time to think and educate yourself and prepare for a changed future. Stay safe inside, stay healthy, knowing for some of you it will be a big challenge.

One week ago, Yoann Maingon wrote an innocent post with the question: Has FFF killed? The question was raised in relation to a 2014 problem at GM, where a changed part caused fatal accidents.

The discussion was started by Yoann, and here is my short extract. Assuming this problem was a configuration management issue, Yoann indicated that the problem might be related to the fact that ERP-systems do not carry a revision on the part number – leading to an unnoticed change. Therefore, he assumes there is a disconnect between the PLM-side (where we have parts with multiple lifecycle states and revisions) and the ERP-side (where we have an industrial lifecycle – prototype/production).

He posted his thoughts, and then LinkedIn exploded (currently 116 comments), which means it is a topic that is of significant concern in our community. Next, if you read the comments, there are different viewpoints:

  • What does FFF really imply?
  • What about revisions of parts?
  • What are the best practices?

Let's investigate these viewpoints with some comments.

What does FFF really imply?

When we talk about FFF in engineering, we mean Form, Fit and Function – the three primary characteristics to describe a part  (source Wikipedia)

  • Form refers to such characteristics as external dimensions, weight, size, and visual appearance of a part or assembly. This is the element of FFF that is most affected by an engineer’s aesthetic choices, including enclosure, chassis, and control panel, that become the outward “face” of the product.
  • Fit refers to the ability of the part or feature to connect to, mate with, or join to another feature or part within an assembly. The “fit” allows the part to meet the required assembly tolerances to be useful.
  • Function is a criterion that is met when the part performs its stated purpose effectively and reliably. In an electronics product, for example, a function can depend on the solid-state components used, the software or firmware, and quite often on the features of the electronics enclosure selected.

One of the comments in Yoann’s post referred to Safe/Unsafe as a potential functional characteristic. I think this addition is not needed. Safety should be a requirement for the part, not a characteristic.

FFF was and still is an approach for engineers to decide whether a new, improved version of the part gets a revision or needs a new part number.

I think before we dive deeper into the other viewpoints, it is crucial to define the part number a little more.

In a correct PLM data model, there are two types of part numbers. First, there is the internal part number that your company uses inside its engineering Bill of Materials to identify a part. This part number can be a meaningless number, only there to provide uniqueness inside the company.

In 2015 I wrote several posts related to best practices and data modeling for PLM. The most relevant posts to this discussion are here:

The internal part number can specify a part that needs to be manufactured according to specification, or a part that needs to be purchased from an available supplier/manufacturer. The second type is the manufacturer part number, which is, most of the time, a meaningful number (6 – 7 characters), as these parts need to be ordered by your company. The manufacturer part number is the SKU for the manufacturer. As you can imagine, in the manufacturer's catalog there is no revision mentioned. In graphics, see the image below:

Your company might sell Product MP-323121 (note: the ID is meaningful to help the customer to order the product).

Internally there is a related EBOM that specifies the product. The EBOM top part is O122 (note: here, we can use a meaningless identifier as all is digitally connected).

For the manufacturing of O122, we need to resolve the EBOM according to its specifications. Therefore, for Part O124, the company needs to decide to purchase from their approved manufacturers either part ABC-21231 or XYZ-88818 (note: again, a meaningful ID as these companies are not digitally connected).

Now coming back to the FFF-discussion. For the orange parts, with a meaningful ID, no revision exists. However, as long as Assembly O122 remains 100% FFF-compatible, the Product ID MP-323121 will not change. This allows your company to optimize the EBOM and/or MBOM while keeping 100% compatibility with the outside world (note: the same principle applies to the two manufacturers for Part O124).

In case Top Assembly O122 has new or changed parts – what should happen there?

At that moment, the definition has changed. The definitions, most of the time described in documents/drawings/models, are information related to the BOM. Therefore, Top Assembly O122 should get a new identifier. There is no need to call it a revision; it is a new data set in the PLM-system, again with a meaningless identifier, as we are connected digitally.
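To make the example tangible, here is a minimal sketch in Python (purely illustrative, not any vendor's data model) of the relations described above and of what happens when the top assembly changes in an FFF-compatible way.

```python
# Illustrative only: the sellable product points to an internal EBOM top part;
# internal identifiers are meaningless, customer/supplier-facing ones are meaningful.
product = {
    "id": "MP-323121",      # meaningful, customer-facing product ID
    "ebom_top": "O122",     # relation to the internal EBOM top part
}

parts = {
    "O122": {"children": ["O124"]},                                   # meaningless internal IDs
    "O124": {"approved_manufacturers": ["ABC-21231", "XYZ-88818"]},   # meaningful supplier SKUs
}

def change_top_assembly(old_id, new_id, fff_compatible):
    """A changed definition means a new data set with a new (meaningless)
    identifier. If the change is 100% FFF-compatible, the customer-facing
    product ID stays the same; otherwise a new sellable ID would be needed."""
    parts[new_id] = parts.pop(old_id)
    product["ebom_top"] = new_id
    if not fff_compatible:
        product["id"] = "MP-NEW"   # placeholder: a new sellable product ID

change_top_assembly("O122", "O123", fff_compatible=True)
print(product)   # {'id': 'MP-323121', 'ebom_top': 'O123'}
```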

What about revisions of parts?

Of course, the management of changes existed long before PLM-systems were introduced.

The specifications of a part were defined in drawings. The drawing contained all the information, not only the geometry definitions, but also specifications on how to manufacture the part.

For complex products, a considerable set of consistently related drawings would be released to manufacturing – a release process with physical signatures.

At the same time, there was no discussion: the drawing represents the part. And as there was no digital connection, part numbers/drawing numbers were meaningful, often with the format of the drawing as part of the identifier.

In case changes were needed, for example, fixing a dimension or tolerance as discovered during manufacturing, the drawing had to be revised to remain consistent. First, in the original drawing, the issue or change was marked in red (redlining). Then engineering had to create a new version of the drawing.

Depending on the impact of the change (here the FFF-principle comes in), people decided whether a new part number was needed (an FFF-change) or whether the change only required an update of the drawing(s), meaning a revision. If the difference was small (for example, adding a missing annotation), it could be called a minor change, all to be reflected in the drawing number, which equals the part number in this approach. So, when we talk about revisions of parts, we are actually talking about a document change.
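Expressed as a minimal sketch (Python, purely illustrative), the traditional drawing-centric decision rule looks like this:

```python
def classify_change(fff_affected, minor):
    """Traditional drawing-centric rule of thumb, as described above:
    - FFF is affected          -> new part number (= new drawing number)
    - FFF not affected         -> a revision of the existing drawing(s)
    - only a small correction  -> a minor revision
    In this approach the drawing number equals the part number."""
    if fff_affected:
        return "new part number"
    return "minor revision" if minor else "new revision"

print(classify_change(fff_affected=False, minor=True))   # -> minor revision
```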

A lousy practice from that approach is also that often manufacturing just redlines a drawing and keeps the redlined drawing as their source. It is too time-consuming or difficult to update the source drawing(s) through a change process. Engineering is not aware of this change, and when a later change comes through from engineering, these “fixes” might be missed as there is no traceability.

Generic example of a PLM data model and its relations

When PLM-systems were introduced, of course, companies did not want to disrupt their existing ways of working. Therefore, they asked the PLM-editors to enable revisions on parts, and so the PLM-editors did (and still do).

Decoupling of parts and documents in a PLM data model

However, if you want to use the PLM-system in the best manner, you need to "decouple" the concept that the part number equals the drawing number, combined with the possibility to start using meaningless identifiers, as the relations between parts and drawings are managed in the PLM-system through relational links.

Relevant posts related to the PLM data model are:

What are the best practices?

As some people mentioned in their comments on Yoann's post: why do we have to answer this question, as everything is already well understood and described in best practices? I agree with that statement: best practices exist – so how do we obtain them?

First, there is the whole framework of Configuration Management, which existed long before PLM-systems were introduced. If you follow its methodology, you can be (almost) guaranteed that your information is consistent and correct. Configuration Management is crucial in areas where the impact of an error is enormous, like the GM-example Yoann referred to. Companies in the Aerospace and Defense industry, in particular, have strict configuration management in place.

Configuration management does not come for free. It requires an investment in skills, potentially a change in ways of working, and requires an overhead. Manufacturing companies that are creating less “risky” products often focus more on optimizing (= reducing) the cost of their internal processes instead of investing in proper methodologies to manage consistency.

If you want to learn more about CM, investigate the Institute of Process Excellence (IPX), the founders of the CM2 framework for Enterprise Configuration Management, and much more. Note: their knowledge does not come for free, which I can understand. However, it also creates a barrier to a company's further investment in CM, as these kinds of strategic investments are hard to sell to the management level by individuals within a company.

In the context of CM, I advise you to follow Martijn Dullaart, who is quite active in our social community. His latest blog post related to this thread is: It’s about Interchangeability and Traceability

With the introduction of PLM-systems, these companies and the PLM-editors created the opportunity to implement configuration management in their systems.

The data inside the system would be the "single version of the truth." Unfortunately, this was, most of the time, just a sales strategy, falsely giving the impression that information is now under control. Last year I wrote several posts related to the relation between PLM and CM, starting with PLM and Configuration Management – a happy marriage?

If you are interested in another resource for information related to these topics, have a look at the website of Jörg Eisenträger, who also shares his collected best practices for PLM and CM (thanks to Paul van der Ree for the link).

Don’t expect best practices from your PLM-vendors as their role is to sell software. It is the continuous discussion between:

  • A PLM-system that forces companies to work according to embedded methodology (hard to sell/implement but idealistically correct)

And

  • A flexible PLM-system that allows you to build and configure anything (easy to sell/challenging to implement correctly, depending on “wise” decisions)

The Future

Even though most companies work drawing-centric, with or without a linked PLM-backbone for BOM-management, the next challenge is to evolve to model-based practices. The current CM-practices still talk about documents, although in that context documents are already electronic datasets. The future in a model-based enterprise, however, revolves around connected models – 3D models, but also simulation and software models – with different lifecycles and paces of change. For the model-based enterprise, we need to develop digital best practices that guarantee the same level of quality, executed and/or supported by Artificial Intelligence (AI). AI is needed because human beings cannot physically analyze and understand the full impact of a change in such an environment.

Conclusion

The FFF-discussion illustrates that building a consistent framework within PLM is not an easy goal to achieve. My blog buddy Oleg Shilovitsky would claim that we consultants create the complexity. PLM-editors will never solve this complexity; it is up to your company to invest in the knowledge to understand why and how to reduce it. With this post and the related links and discussions, I hope the added clarity will help you to make "wise" decisions.

At the beginning of this week, I attended the 9th edition of the PI conference in London. Where it started as a popular conference with 300 – 400 attendees at its peak, we were now back to a smaller number of approximately 100 attendees.

It illustrates that PLM as a standalone topic no longer attracts a broad audience, as Marketkey (the organizer of the conference) confirms. The intention is that future conferences will focus on the broader scope of PLM, where business transformation will be one of the main streams.

In this post, I will share my highlights of the conference, knowing that other sessions might have been valuable too, but I had to make a choice.

It is about people

Armin Prommersberger, CTO from DIRAC and the chairman of the conference, made a great point: “What we will discuss in the upcoming two days, it is all about people not about technology.”

I am not sure if this opening influenced the mood of the conference but, looking back, it was the central theme: it is all about how we deal with people when explaining, implementing and justifying PLM.

AI at the Forefront of a Digital Transformation

Muhannad Alomari from R2 Data Labs, a separate unit within Rolls-Royce set up to explore and provide data innovation, started with his keynote speech, sharing the AI initiatives within his team.

He talked about several projects where AI will become crucial.

For example, the EHM program related to engine behavior: how to detect anomalies, how to establish predictive maintenance, and how to maximize the time an airplane engine is in operation. Interesting to mention is that, as Muhannad explained, most simulation models are simplified models, not accurate enough to discover anomalies.

Modeling in the PLM world with feedback from reality

Machine learning and feedback loops are crucial to optimize the models both for the discovery of irregularities and, of course, to improve understanding of the engine behavior and predict maintenance. Currently, maintenance is defined based on the worst-case scenario for the engine, which in reality, of course, will not be the case for most engines. There is a lot (millions) to gain here for a company.

Interesting to mention is that Muhannad gave a realistic view of the current status of Artificial Intelligence (AI). AI is currently still dumb – it is a set of algorithms that need to be adapted whenever new patterns are discovered. Deep learning is still not there – currently, we still need human beings for that.

This was in contrast with a later session from Kalypso, titled Supercharge your PLM with advanced analytics. It was a typical example of how a realistic story (R2 Data Labs) differs so much from what is sold by PLM vendors or implementers. Kalypso introduced Product Lifecycle Intelligence (PLI) – you can see the dream on the left.

Combine PLM with Analytics, and you have Intelligence. My main comment: knowing from the field that in most companies the first three phases suffer from a lack of data quality and consistency, any "Intelligence" will probably be based on unreliable sources. Not an issue if you are working in the domain of politics; however, when it comes to direct cost and quality implications, it can be a significant risk. We still have a way to go before we have a reliable PLM data backbone for analytics.

 

Keeping PLM Momentum after a Successful Campaign

Susanna Mäentausta from Kemira in Finland gave an exciting update on their PLM project. Where in 2019 she shared their PLM roadmap with us (see my 2019 post: The weekend after PI PLMx London 2019), this time Susanna shared how they are keeping the PLM momentum.

Often, PLM implementations are started based on a hypothetical business case (I talked about this in my post The PLM ROI Myth). But then, when you implement PLM, you need to take care that you provide proof points to motivate the management. And this is exactly what the PLM team at Kemira has been doing. Often, management believes that after the first investment the project is done ("We bought the software – so we are done"); however, the business and process change that will deliver the value is not reported.

Susanna shared with us how they defined measurable KPIs for two reasons. First, to show the management that there is business progress and that there are benefits – however, it is a journey. And second, the facts are used to kill the legends that "before PLM we were much faster or more efficient." These types of legends are often expressed loudly by persons who consider PLM as overhead (killing their freedom) instead of a way to be more efficient in business. In the end, for a company, the business is more important than a person's belief.

On the question of what she would have done better with hindsight, Susanna answered: "Communicate, communicate, communicate." A response I fully support, as PLM teams are often so busy completing their day-to-day work that there is no spare time for communication – crucial to achieving a business change.

I agree: PLM needs to be fact-based during implementation and support, combined with the understanding that we are dealing with people and their emotions too. Both need full attention.

Accelerating Digitalization at Stora Enso

Samuli Savo, Chief Digital Officer at Stora Enso, explained the principles of innovation related to digitalization at his company. Stora Enso, a Swedish/Finnish company, historically one of the largest forestry companies in the world as well as one of the most significant paper and packaging producers, is working on a transformation to become the renewable materials company. For me, he made two vital points on how Stora Enso's digitalization journey is organized.

He pleads for experimentation funded by corporate, as in the experimental stage it does not make sense to have a business case. First DO and then ANALYZE, where many companies have the policy to first ANALYZE and then DO, killing innovative thinking.

The second point was the active process of challenging startups to solve business challenges they foresee. Combined with a governance process for startups, this allows these companies to be supported and become embedded within member companies of the Combient Foundry, like Stora Enso. By doing this in a structured way, the outcome should lead to innovation.

I was thinking about the hybrid enterprise model that I have been explaining in the past. Great story.

Cyber-security and Future Mobility

Out of interest, I followed the session from Madeline Cheah, Cybersecurity Innovation Lead at HORIBA MIRA. She gave an excellent and well-structured overview. Madeline leads the cybersecurity research program. Part of this job is investigating ways to prevent vehicles from being attacked, in particular when it comes to connected and autonomous vehicles: how to keep them secure.

She discussed the known gaps, and the cybersecurity implications of future mobility are so extensive that I even doubted whether there will ever be an autonomous vehicle on the road. So much to define and explore. She looked at it from the perspective of the Internet of Everything, where Everything is divided into Things, Data, Processes, and People. Still a lot of work to do – see the image below.

Good Times Ahead: Delay Mitigation Through a Plan for Every Part

Ian Quest, director at Quick Release, gave an overview of what their company aims to be. You could describe them as the plumbers of the automotive industry. Where in an ideal world information should flow from design to release, there are many bottlenecks, leakages and hiccups that need to be resolved, as the image shows.

Where their customers often do not have the time and expertise to fix these issues, Quick Release brings in various skillsets and common sense. For example, how to deal with the Bill of Materials, Configuration Management, and many other areas that you need to address with methodology first instead of (vendor-based) technology. I believe there is a significant need for this type of company in the PLM-domain.

The second part, presented by Nick Solly and focused on their QRonos tool, was perhaps a little too much about the capabilities of the tool. Ian Quest, in his introduction, had already made the correct statement:

The QRonos tool, which is more or less a reporting tool, illustrates again that when people care about reliable data (planning, tasks, parts, deliverables, …), you can improve your business significantly by creating visibility of delays or bottlenecks. The value lies in measurable activities and, from there, learning to predict or enhance – see R2 Data Labs, Kemira and the PLI dream.

Conclusion

It is clear that a typical PLM conference is no longer a technology festival – it is about people. People are trying to change or improve their business. Trying to learn from each other, knowing that the technical concepts and technology are there.

I am looking forward to the upcoming PI events where this change will become more apparent.

 

Last week I shared my thoughts related to my observation that the ROI of PLM is not directly visible or measurable, and I explained why. Also, I explained that the alignment of an organization requires a myth to make it happen. A majority of readers agreed with these observations. Some others either misinterpreted the headlines or twisted the story in favor of their opinion.

A few came from Oleg Shilovitsky, and as Oleg is quite open in his discussions, it allows me to follow up on his statements. Other people might share similar thoughts, but they haven't had the time or opportunity to be vocal. Feel free to share your thoughts/experiences too.

Some misinterpretations from Oleg’s post: PLM circa 2020 – How to stop selling Myths

  • The title “How to stop selling Myths” is the first misinterpretation.
    We are not selling myths – more below.
  • “Jos Voskuil’s recommendation is to create a myth. In his PLM ROI Myths article, he suggests that you should not work on a business case, value, or even technology” is the second misinterpretation, you still need a business case, you need value and you need technology.

And I got some feedback from Lionel Grealou, whose post was a catalyst for me to write the PLM ROI Myth post. I agree I took some shortcuts based on his blog post. You can read his comments here. The misinterpretation is:

  • “Good luck getting your CFO approve the business change or PLM investment based on some “myth” propaganda :-)” as it is the opposite, make your plan, support your plan with a business case and then use the myth to align

I am glad about these statements as they allow me to be more precise, avoiding misperceptions/myth-perceptions.

A Myth is bad

Some people might think that a myth is bad, as a myth is most of the time abstract. I think these people do not realize that there are a lot of myths that they are following; it is typical social human behavior to respond to myths. Some myths:

  • How can you be religious without believing in myths?
  • In this country/world, you can become anything if you want?
  • In the past, life was better
  • I make this country great again

The reason human beings need myths is that without them, it is impossible to align people around abstract themes. Try for each of the myths above to create an end-to-end logical story based on factual and concrete information. Impossible!

Read Yuval Harari’s book Sapiens about the power of myths. Read Steven Pinker’s book Enlightenment Now to understand that statistics show a lot of current myths are false. However, this does not mean a myth is bad. Human beings are driven by social influences and myths – it is our brain.

Only if you have no social interaction at all might you be immune to myths. Which brings me to quoting Oleg one more time:

“A long time ago when I was too naive and too technical, I thought that the best product (or technology) always wins. Well… I was wrong. “

I went through the same experience; having studied physics and mathematics makes you think extremely logically – something I enjoyed while developing software. Later, when I started my journey as the virtualdutchman, mediating in PLM implementations, I discovered that logic alone does not work in business. The majority of decisions are made based on "gut feelings", still presented as reasonable cases.

Unless you have an audience of Vulcans, like Mr. Spock, you need to deal with the human brain. Consider the myth as the envelope to pass the PLM-project to the management. The C-level acts on myths, as so far I haven't seen C-level management spending serious time on understanding PLM. I will end with a quote from Paul Empringham:

I sometimes wish companies would spend 6 months+ to educate themselves on what it takes to deliver incremental PLM success BEFORE engaging with software providers

You don’t need a business case

Lionel is also skeptical about some "myth propaganda", and I agree with him. The myth is the envelope; inside there needs to be something valuable: the strategy, the plan, and the business case. Here I want to stress one more time that most business cases for PLM focus on tool and collaboration efficiency, and from there project benefits. However, how well can we predict the future?

If you improve a process – let's assume you replace BOM-collaboration done with Excel by BOM-collaboration based on an Excel-on-the-cloud-like solution – you can measure the differences, assuming you can measure people's efficiency. I guess this is what Oleg means when he explains that OpenBOM has a real business case.

However, if you want people to work differently, for example, to consult your supplier or manufacturing earlier in the design process, you touch human behavior. Why should I consult someone before I finish my job, when I am measured on output, not on collaboration or proactive response? Here lies the real ROI challenge.

I have participated in dozens of business cases and at the end, they all look like the graph below:

The ROI is fantastic – after a little more than 2 years, we have a positive ROI, and the ROI only gets bigger. So if you trust the numbers, you would be a fool not to approve this project. Right?

And here comes the C-level gut-feeling. If I have a positive feeling (I follow the myth), then I will approve. If I do not like it, I will say I do not trust the numbers.

Needless to say, if there were a business case without ROI, we would not need to meet the C-level. Unless – and it happens occasionally – at C-level there was already a decision that we need PLM from Vendor X because we played golf together, we are condemned to each other, or we believe the same myths.

In reality, the old Gartner graph from realized benefits says it all. The impact of culture, processes, and people can make or break a plan.

You do not need an abstract story for PLM

Some people believe PLM on its own is a myth: you just need the right technology, and people will start using it, spread it and see how the business has improved. Sometimes email is used as an example. Email is popular because, with limited effort, you can collaborate with people no matter where they are. Now, twenty years later, companies are complaining about the lack of traceability and the lack of knowledge and understanding related to their products and processes.

PLM will always have the complexity of supporting traceability combined with real-time collaboration. If you focus only on traceability, people will complain that they are not a counter clerk. If you focus solely on collaboration, you miss the knowledge build-up and traceability.

That’s why PLM is a mix of governance, optimized processes to guarantee quality and collaboration, combined with a methodology to tune the existing processes implemented in tools that allow people to be confident and efficient. You cannot translate a business strategy into a function-feature list for a tool.

Conclusion

Myths are part of the social alignment of large groups of people. Whether a myth is true or false, I will not judge. You can use the myth as an envelope to package your business case. The business case should always be a combination of new ways of working (organizational change), optimized processes and, finally, the best tools. A PLM tool-only business case is, in my opinion, far from realistic.

 

Now preparing for PI PLMx London on 3-4 February – discussing Myths, Single BOMs and the PLM Green Alliance

In my previous post, I shared my observations from the past 10 years related to PLM. It was about globalization and digitization becoming part of our daily business. In the domain of PLM, the coordinated approach has become the most common practice.

Now let's look at the challenges for the upcoming decade as, in my opinion, the next decade is going to be decisive for people, companies and even our current ways of living. So let's start with the challenges, from easy to difficult.

Challenge 1: Connected PLM

Implementing an end-to-end digital strategy, including PLM, is probably business-wise the biggest challenge. I described the future vision for PLM to enable the digital twin –How PLM, ALM, and BIM converge thanks to the digital twin.

Initially, we will implement a digital twin for capital-intensive assets, like satellites, airplanes, turbines, buildings, plants, and even our own Earth – the most valuable asset we have. To have an efficient digital continuity of information, information needs to be stored in connected models with shared parameters. Any conversion from format A to format B will block the actual data from being used in another context – therefore, standards are crucial. When I described the connected enterprise, this was the ultimate goal to be reached in 10 (or more) years. It will be data-driven and model-based.

Getting to connected PLM will not be the next step in evolution. It will be disruptive for organizations to maintain and optimize the past (coordinated) and meanwhile develop and learn the future (connected). Have a look at my presentation at PLM Roadmap PDT conference to understand the dual approach needed to maintain “old” PLM and work on the future.

Interestingly, my blog buddy Oleg Shilovitsky also looked back on the past decade (here) and looked forward to 2030 (here). Oleg looks at these topics from a different perspective; however, I think we agree on the future, quoting his conclusion:

PLM 2030 is a giant online environment connecting people, companies, and services together in a big network. It might sound like a super dream. But let me give you an idea of why I think it is possible. We live in a world of connected information today.

 

Challenge 2: Generation change

At this moment, large organizations are mostly organized and managed by hierarchical silos, e.g., the marketing department, the R&D department, Manufacturing, Service, Customer Relations, and potentially more.

Each of these silos has its P&L (Profit & Loss) targets and is optimizing itself accordingly. Depending on the size of the company, there will be various layers of middle management. Your level in the organization depends most of the time on your years of experience and visibility.

The result of this type of organization is the lack of the "horizontal flow" crucial for a connected enterprise. Besides, the top of the organization is currently full of people educated in and thinking linear/analog, not fully understanding the impact of digital transformation on their organization. So when will the change start?

In particular, in modern manufacturing organizations, the middle management needs to transform and dissolve as empowered multidisciplinary teams will do the job. I wrote about this challenge last year: The Middle Management dilemma. And as mentioned by several others – It will be: Transform or Die for traditionally managed companies.

The good news is that the old generation is retiring in the upcoming decade, creating space for digital natives. To make it a smooth transition, the experts currently working in the silos will be missed for their experience – they should start coaching the young generation now.

 

Challenge 3: Sustainability of the planet.

The biggest challenge for the upcoming decade will be adapting our lifestyles/products to create a sustainable planet for the future. While mainly the US and Western Europe have been building a society based on unlimited growth, the effect of this lifestyle has become visible to the world. We consume with money as the only limit and create waste and landfill (plastics and more) from which the earth will not recover if we continue in this way. When I say "we," I mean the group of fortunate people that grew up in a wealthy society. If you want to discover how blessed you are (or not), just have a look at the global rich list to determine your position.

Now, thanks to globalization, other countries are starting to develop their economies too and become wealthy enough to replicate the US/European lifestyle. We are overconsuming the natural resources this earth has, and we drop them as waste – preferably not in our backyard, but either in the ocean or in less wealthy countries.

We have to start thinking circular and PLM can play a role in this. From linear to circular.

In my blog post related to PLM Roadmap/PDT Europe – day 1,  I described Graham Aid’s (Ragn-Sells) session:

Enabling the Circular Economy for Long Term Prosperity.

He mentioned several examples where traditional thinking just leads to more waste, instead of starting from the beginning with a sustainable model to bring products to the market.

Combined with our lifestyle, there is a debate on how the carbon dioxide we produce influences the climate and the atmosphere. I am not a scientist, but I believe in science and not in conspiracies. So there is a problem. In the 1970s, when scientists discovered the effect of CFCs on the ozone layer of the atmosphere, we ultimately "fixed" the issue. At that time, without social media, we still trusted scientists – read more about it here: The Ozone hole.

I believe mankind will be intelligent enough to “fix” the upcoming climate issues if we trust in science and act based on science. If we depend on politicians and lobbyists, we will see crazy measures that make no sense, for example, the concept of “biofuel.” We need to use our scientific brains to address sustainability for the future of our (single) earth.

Therefore, together with Rich McFall (the initiator), Oleg Shilovitsky, and Bjorn Fidjeland (PLM-peers), we launched the PLM Green Alliance, where we will focus on sharing ideas and discussions related to PLM and PLM-related technologies to create a network of innovative companies/ideas. We are in the early stages of this initiative and are looking for ways to make it an active alliance. Insights, stories, and support are welcome. More to come this year (and decade).

 

Challenge 4: The Human brain

The biggest challenge for the upcoming decade will be the human brain. Even though we believe we are rational, it is mainly our primitive brain that drives our decisions. Thinking, Fast and Slow by Daniel Kahneman is a must-read in this area, or Predictably Irrational: The Hidden Forces That Shape Our Decisions. Note: these are "old" books from years ago. However, due to globalization and social connectivity, they have become more relevant than ever.

Our brain does not like to waste energy. If we see the information that confirms our way of thinking, we do not look further. Social media like Facebook are using their algorithms to help you to “discover” even more information that you like. Social media do not care about facts; they care about clicks for advertisers. Of course, controversial headers or pictures get the right attention. Facts are no longer relevant, and we will see this phenomenon probably this year again in the US presidential elections.

The challenge for implementing PLM and acting against human-influenced Climate Change is that we have to use our “thinking slow” mode combined with a general trust in science. I recommend reading Enlightenment now from Steven Pinker. I respect Steven Pinker for the many books I have read from him in the past. Enlightenment Now is perhaps a challenging book to complete. However, it illustrates that a lot of the pessimistic thinking of our time has no fundamental grounds. As a global society, we have been making a lot of progress in the past century. You would not go back to the past anymore.

Back to PLM.

PLM is not a "wonder tool/concept," and its success mainly depends on a long-term vision, organizational change, culture, and then the tools. It is not a surprise that it is hard for our brains to decide on a roadmap for PLM. In 2015, I wrote about the similarity between PLM and acting against Climate Change – read it here: PLM and Global Warming.

At the upcoming PI PLMx London conference, I will lead a Think Tank session related to Getting PLM on the Executive's agenda. Getting PLM on an executive agenda is about connecting to the brain and not only about a hypothetical business case. Even at exec level, decisions are made by "gut feeling" – the way the human brain decides. See you in London, or read more about this topic in a month.

Conclusion

The next decade will have enormous challenges – more than in the past decades. These challenges are caused by our lifestyles AND the effects of digitization. Understanding and realizing our biases caused by our brains is crucial.  There is no black and white truth (single version of the truth) in our complex society.

I encourage you to keep the dialogue open and to avoid living in a silo.

In recent years, more and more PLM customers approached me with questions related to the usage of product information for downstream publishing. To be fair, this is not my area of expertise for the moment. However, with the mindset of a connected enterprise, this topic will come up.

For that reason, I have a strategic partnership with Squadra, a Dutch-based company, providing the same coaching model as TacIT; however, they have their roots in PIM and MDM.

Together we believe we can deliver a meaningful answer on the question: What are the complementary roles of PLM and PIM? In this post, our first joint introduction.

Note: The topic is not new. Already in 2005, Jim Brown from Tech-Clarity published a white-paper: The Complementary Roles of PIM and PLM. This all before digitization and connectivity became massive.

Let’s start with the abbreviations, the TLAs (Three-Letter-Acronyms) and their related domains

PLM – level 1
(Product Lifecycle Management – push)

For PLM, I want to stay close to the current definitions. It is the strategic approach of providing a governance infrastructure to deliver a product to the market, starting from an early concept phase until manufacturing and, in its extended definition, also during its operational phase.
The focus of PLM is to reduce time to market while ensuring quality, cost, and delivery through an increasingly virtual product definition, therefore being able to decide upfront on the best design choices and the manufacturing options with the lowest cost. In the retail world, own-brand products are creating a need for PLM.

The above image is nicely summarizing the expected benefits of a traditional PLM implementation.

 

MDM (Master Data Management)

When product data is shared in an enterprise among multiple systems, there is a need for Master Data Management (MDM). Master Data Management focuses on a governance approach ensuring that information stored in various systems has the same meaning and shares values where relevant.

MDM guards and streamlines the way master data is entered, processed, and changed within the company, resulting in a single version of the truth and enabling different departments and systems to stay in sync regarding their crucial data.

Interestingly, in the not-so-digital world of PLM, you do not see PLM vendors working on an MDM-approach. They do not care about an end-to-end connected strategy yet. I wrote about this topic in 2017 here: Master Data Management and PLM.

PIM (Product Information Management)

The need for PIM starts to become evident when selling products through various business channels. If you are a specialized machine manufacturer, your product information for potential customers might be very basic and based on a few highlights.

However, due to digitization and global connectivity, it now becomes crucial that product information is available in real-time, wherever your customers are in the world.

In a competitive world, with an omnichannel strategy, you cannot survive without having your PIM streamlined and managed.

 

Product Innovation Platforms (PLM – Level 2 – Pull)

With the introduction of Product Innovation Platforms as described by CIMdata and Gartner, the borders of PLM, PIM, and MDM might become vague, as they might be all part of the same platform, therefore reducing the immediate need for an MDM-environment.  For example, companies like Propel, Stibo, and Oracle are building a joint PLM-PIM portfolio.

Let's dive deeper into the two scenarios that we meet most in business: PLM driving PIM (my comfort zone) and PIM driving the need for PLM (Squadra's area of expertise).

PLM driving PIM

Traditionally PLM (Product Lifecycle Management) has been focusing on several aspects of the product lifecycle. Here is an excellent definition for traditional PLM:

PLM is a collection of best practices, dependent per industry to increase product revenue, reduce product-related costs and maximize the value of the product portfolio  (source 2PLM)

This definition shows that PLM is a business strategy, not necessarily a system, but an infrastructure/approach to:

  • ensure shorter time to market with the right quality (increasing product revenue)
  • work efficiently (reduce product-related costs – resources and scrap)
  • deliver products that bring the best market revenue (maximize the value of the product portfolio)

The information handled by traditional PLM consists mostly of design data, i.e., specifications, manufacturing drawings, 3D Models, and Bill of Materials (physical part definitions) combined with version and revision management. In elaborate environments combined with processes supporting configuration management.

PLM data is more focused on internal processes and quality than on targeting the company’s customers. Sometimes the 3D Design data is used as a base to create lightweight 3D graphics for quotations and catalogs, combining it with relevant sales data. Traditional marketing was representing the voice of the customer.

PLM implementations are more and more providing an enterprise backbone for product data. As a result of this expansion, there is a wish to support sales and catalogs more efficiently, sharing master data from creation until publishing and combining the product portfolio with sales and service information in a digital way.

In particular, due to globalization, there was a need to make information globally available in different languages without a significant overhead of resources to manage the data or manage the disconnect from the real product data.

Companies that have realized the need for connected data understand that Product Master Data Management is more than only the engineering/manufacturing view; it is also relevant to the sales and services view. Historically, this was done by companies as a customized extension of their PLM-system; now they more and more interface with specialized PIM-systems. Proprietary PLM-PIM interfaces exist. Hopefully, with digital transformation, a more standardized approach will appear.
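As an illustration of what such an interface could look like, here is a minimal sketch in Python of mapping a released PLM item to a publishing record for PIM. The field names are hypothetical and not taken from any specific PLM or PIM product.

```python
# Hypothetical mapping of a released PLM item to a PIM publishing record.
def plm_item_to_pim_record(item, language="en"):
    """Push only released product data downstream; the PIM side can then
    enrich it with channel-specific and localized content."""
    if item["status"] != "Released":
        raise ValueError("Only released items are published to PIM")
    return {
        "sku": item["sellable_id"],                   # customer-facing identifier
        "name": item["descriptions"].get(language, item["descriptions"]["en"]),
        "attributes": item["technical_attributes"],   # weight, dimensions, compliance, ...
        "assets": item["published_3d_views"],         # lightweight graphics derived from CAD
    }

item = {
    "status": "Released",
    "sellable_id": "MP-323121",
    "descriptions": {"en": "Pump unit", "de": "Pumpeneinheit"},
    "technical_attributes": {"weight_kg": 4.2},
    "published_3d_views": ["mp-323121-iso.png"],
}
print(plm_item_to_pim_record(item, "de"))
```

A standardized version of such a mapping is precisely where industry data standards would help.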

 

PIM driving the need for PLM

Because of changes in the retail market, the need for information in the publishing processes is also changing. Retailers also need to comply with new rules and legislation. The source of the required product information is often in the design process of the product.

In parallel, there is an ongoing market trend to have more and more private label products in the (wholesale and retail) assortments. This means a growing number of retailers and wholesalers will become producers and will have their own Ideation and innovation process.

A good example is ingredients and recipe information in the food retail sector. This information needs to be provided now by suppliers or by their own brand department that owns the design process of the product. Similar to RoHS or REACH compliance in the industry.

Retail and Wholesale can tackle own brands reasonably well with their PIM systems (or Excels), making use of workflows and product statuses. However, over the years, the information demands have increased, and a need for more sophisticated lifecycle management has emerged and, therefore the need for PLM (in this case, PLM also stands for Private Label Management).

The image below illustrates a PLM layer and a PIM layer, all leading towards rich product information for the end-users (either B2B or B2C).

In the fast-moving consumer goods (FMCG) world, most innovative products are coming from manufacturers. They have pipelines with lots of ideas resulting in a limited number of sellable products. In the Wholesale and Retail business, the Private Label development process usually has a smaller funnel but a high pressure on time to market, therefore, a higher need for efficiency in the product data chain.

Technological changes, like 3D Printing, also change the information requirements in the retail and wholesale sectors. 3D printing can be used for creating spare parts on-demand, therefore changing the information flow in processes dramatically. Technical drawings and models that were created in the design process, used for mass production, are now needed in the retail process closer to the end customer.

These examples make it clear that more and more information is needed for publication in the sales process and therefore needs to be present in PIM systems. This information needs to be collected and made available during the PLM release process. A seamless connection between the product release and sales processes will support the changing requirements and will reduce errors and rework on data.

PLM and PIM are two practices that need to go hand in hand like a relay baton in athletics. Companies that are using both tools must also organize themselves in a way that processes are integrated, and data governance is in place to keep things running smoothly.

 

Conclusion

Market changes and digital transformation force us to work in value streams along the whole product lifecycle, ensuring quality and time to market. PLM and PIM will be connected domains in the future, enabling a smooth product go-to-market. Important is the use of data standards (PLM and PIM should speak a common language) – preferably based on industry standards so that cross-company communication on product data is possible.

What do you think? Do you see PLM and PIM getting together too, in your business?

Please share in the comments.

 

 

 

 

 

The usage of standards has been a recurring topic over the past 10 months; it probably came back to the surface at PI PLMx Chicago during the PLM Leaders panel discussion. If you want to refresh the debate, Oleg Shilovitsky posted an overview: What vendors are thinking about PLM standards – Aras, Dassault Systemes, Onshape, Oracle PLM, Propel PLM, SAP, Siemens PLM.

It is clear for vendors that when they actively support standards, they reduce their competitive advantage; after all, you are opening your systems to connect to other vendor solutions, reducing the chance to sell adjacent functionality. We call it vendor lock-in. If you think this approach only applies to PLM, I would suggest you open your Apple (iPhone) and think about vendor lock-in for a moment.

Vendors will only adhere to standards when pushed by their customers, and that is why we have a wide variety of standards in the engineering domain.

Take the example of JT as a standard viewing format, heavily pushed by Siemens for the German automotive industry to be able to work downstream with CATIA and NX models. There was a JT-version (v9.5) that reached ISO 14306 alignment, but after that, Siemens changed JT (v10) again to optimize their own exchange scenarios, and the standard was lost.

And as customers did not complain (too much), the divergence continued. So it is clear that vendors will not maintain standards out of charity, as your business does not work for charity either (or does it?). So I do not blame them if there is no push from their customers to maintain standards.

What about standards?

The discussion related to standards flared up around the IpX ConX19 conference and a debate between Oleg & Hakan Kardan (EuroSTEP) where Hakan suggested that PLCS could be a standard data model for the digital thread – you can read Oleg’s view here: Do we need a standard like PLCS to build a digital thread.

Oleg's opening sentence made me immediately stop reading, as I am more and more tired of this type of framing when you want to have a serious discussion based on arguments. In politics, in particular, we see the bad examples of framing.

Standards are like toothbrushes, a good idea, but no one wants to use anyone else’s. The history of engineering and manufacturing software is full of stories about standards.

This opening sentence says it all about the mindset related to standards – it is a one-liner, not a fact. It could have been a tweet in this society of experts.

Still, later I read the blog post and learned Oleg has no arguments to dismiss PLCS; however, as he does not know the details, he will probably not use it. The main challenge of standards: you need to spend time to understand them, adhere to them, and agree on following them. Otherwise, you get the same divergence as with JT, or similar examples.

However, I might have been wrong in my conclusion, as Oleg did some thinking on a Sunday and came up with an excellent post: What would happen if PLM Vendors agree about data standards. Here Oleg makes the comparison with a standard in the digital world, established by Google, Microsoft, Yahoo, and Yandex: Schema.org: Evolution of Structured Data on the Web.
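To give a feeling for what such a shared vocabulary looks like in practice, here is a minimal schema.org Product description, expressed as JSON-LD and built with a few lines of Python. The vocabulary terms (Product, sku, brand, weight, QuantitativeValue) come from schema.org; the product values themselves are made up for illustration.

```
# A minimal schema.org "Product" description in JSON-LD, built as a Python dict.
# The vocabulary is from schema.org; the values are invented for this example.
import json

product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Stainless steel water bottle 0.75L",
    "sku": "P-100245",
    "weight": {"@type": "QuantitativeValue", "value": 0.310, "unitCode": "KGM"},
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
}

print(json.dumps(product_jsonld, indent=2))
```

Search engines can interpret this description without knowing anything about the publishing system, simply because everyone agreed on the meaning of the terms – exactly the kind of high-level semantic alignment PLM data exchange is missing.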

There is a need for semantic mapping and understanding in the day-to-day world, and this understanding makes you realize the same is needed for PLM. That was one of the reasons why I wrote in the past (2015) a series of posts related to the importance of a PLM data model.

All these posts were aimed at helping companies and implementers make the right choices for an item-centric PLM implementation. At that time – 2015 – item-centric was the current PLM best practice. I learned from my engagements in the past 15 years, in particular when you have a flexible modeling tool like SmarTeam or nowadays Aras, that making the right data model decisions is crucial for future growth.

Who needs standards?

First of all, as long as you stay in your controlled environment, you do not need standards. In particular, in the Aerospace and Automotive industry, the OEMs defined the software versions to be used, and the supply chain had to adhere to their chosen formats. Even this narrow definition was not complete enough as a 3D CAD model needed to be exported for simulation or manufacturing purposes. There was not a single vendor working on a single CAD model definition at that time. So the need for standards emerged as there was a need to exchange data.

Data exchange is the driving force behind standards.

In a second stage, neutral-format data storage also became an important point – how do you save an aircraft definition for 75 years?

Oil & Gas / Building – Construction

These two industries both had the need for standards. The Oil & Gas industry relies on EPC (Engineering / Procurement / Construction) companies that build plants or platforms. Then the owner/operator takes over the operation and needs a hand-over of all the relevant information. However, if this information were delivered in the application-specific formats the EPC companies have used, the owner/operator would need various software environments and skills just to have access to the data.

Therefore, if the data is delivered in a standard format (ISO 15926) and the exchange follows CFIHOS (Capital Facilities Information Hand Over Specification), this exchange can be automated to a much larger degree between the EPC and owner/operator environments, leading to a lower overall cost of delivering and maintaining the information, combined with higher quality. For that reason, the Oil & Gas industry has invested in standards for a long time already, as their plants/platforms have a long lifecycle.
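As a rough illustration of the principle – explicitly not the actual CFIHOS or ISO 15926 data model – the sketch below shows why a shared reference to agreed class definitions makes a hand-over record usable without the EPC's authoring tools. All identifiers, class names and attributes here are hypothetical.

```
# Illustrative only: not the actual CFIHOS or ISO 15926 schema.
# The idea: both EPC and owner/operator point to the same shared class definitions
# (a reference data library), so a hand-over record is unambiguous on its own.
handover_record = {
    "tag_number": "P-4711-A",
    "class_reference": "RDL:CENTRIFUGAL_PUMP",   # hypothetical reference-data identifier
    "attributes": {
        "design_pressure_bar": 16,
        "design_temperature_c": 120,
    },
    "documents": ["DS-P-4711-A.pdf"],
}

def validate(record: dict, required: tuple = ("tag_number", "class_reference")) -> bool:
    """The receiving owner/operator can check incoming records against the agreed minimum."""
    return all(record.get(field) for field in required)

print(validate(handover_record))  # True
```

The value is not in the few lines of code but in the agreement behind them: both parties validate against the same definitions, so the hand-over can be automated instead of re-interpreted.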

And the same is happening in the construction industry. Initially, Autodesk and Bentley were fighting to become the vendor standard, and ultimately the IFC standard has taken a lot from the Autodesk world but has become a neutral standard for all parties involved in a construction project to share and exchange data. In particular for the construction industry, the cloud has been an accelerator for collaboration.

So standards are needed where companies/people exchange information

For the same reason in most global companies, English became the standard language. If you needed to learn all the languages spoken in a worldwide organization, you would not have time for business. Therefore everyone making some effort to communicate in one standard language is the best way to operate.

And this is the same for a future data-driven environment – we cannot afford to convert every exchange to the native format of the receiver or the source – common neutral (or winning) standards will ultimately also emerge in the world of manufacturing data exchange and IoT.

Companies need to push

This is probably the blocking issue for standards. Developing and using standards requires an effort without immediate ROI. So why not use vendor formats/models and create custom point-to-point interfaces, as we only need one or two interfaces? Companies delivering products with a long lifecycle know that the current data formats are not guaranteed for the future, so they push for standards (aerospace/defense/oil & gas/construction/infrastructure).
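The "we only need one or two interfaces" argument breaks down as soon as the ecosystem grows. The small sketch below simply counts: custom point-to-point integration needs an interface per pair of systems, while a neutral standard needs one mapping per system. The numbers are the only point being made.

```
# Why "we only need one or two interfaces" does not scale: with n systems exchanging data,
# point-to-point integration needs an interface per pair; a neutral standard needs one per system.
def point_to_point(n: int) -> int:
    return n * (n - 1) // 2   # one bidirectional interface per pair of systems

def via_standard(n: int) -> int:
    return n                  # each system maps once to the neutral format

for n in (2, 5, 10):
    print(f"{n} systems: {point_to_point(n)} point-to-point interfaces "
          f"vs {via_standard(n)} standard mappings")
# 2 systems: 1 vs 2  -> point-to-point indeed looks cheaper at first
# 10 systems: 45 vs 10 -> the custom-interface approach has become unmanageable
```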

(image: 3D PDF model)

Other companies are looking for short-term results, and standards are slowing them down. However, as soon as they need to exchange data with their ecosystem (suppliers/customers), an existing standard will make their business more scalable. The lack of standards is one of the inhibitors for Model-Based Definition or the Model-Based Enterprise – see also my post on this topic: Model-Based – Connecting Engineering and Manufacturing.

When we imagine the Digital Enterprise of the future, information will be connected through data streams and models. In a digital enterprise, file conversions and proprietary formats will impede the flow of data and create non-value-added work. For example, if we look at current "Digital Twin" concepts, the 3D representation of the twin is recreated again instead of building on neutral 3D-model continuity. This is because companies currently work in a coordinated manner. In perhaps 10 years from now, we will reach the maturity of a model-based enterprise, which can only exist based on standards. Whether the standards will be based on one dominating platform or on a merger of standards remains the question.

To discuss this question and how to bridge from the past to the future, I am looking forward to meeting you at the upcoming PLM Roadmap & PDT 2019 EMEA conference on 13-14 November in Paris, France. Download the program here: PLM for Professionals – Product Lifecycle Innovation.

Conclusion

I believe PLM standards will emerge when building and optimizing a digital enterprise. We need to keep on pushing and actively working for meaningful standards, as they are crucial to avoid a lock-in of your data, which potentially creates dead-ends and massive inefficiencies. The future is about connected ecosystems, and the leanest companies will survive. Standards do not need to be extraordinarily well-defined and can start from a high-level alignment, as we saw from schema.org. Keep on investing and contributing to standards and the related discussion to create a shared learning path.

Thanks, Oleg Shilovitsky, for keeping the topic alive.

p.s. I have not had time yet to read and process your PLM Data Commoditization post.

 

Last week I read Verdi Ogewell's article PTC puts the Needle to the Digital Thread on Engineering.com, where Verdi raised the question (and concluded) who is the most visionary PLM CEO – Bernard Charlès from Dassault Systemes or Jim Heppelmann from PTC. Unfortunately, it is again an advertorial creating more haziness around modern PLM than adding value.

People need education and Engineering.com is/was a respected site for me, as they state in their Engineering.com/about statement:

Valuable Content for Busy Engineers. Engineering.com was founded on the simple mission to help engineers be better.

Unfortunately, this is no longer the case in the PLM domain. In June, we saw an article related to the failing PLM migration at Ericsson – see The PLM migration dilemma. Besides the fact that a big-bang migration had failed at Ericsson, the majority of the article was based on rumors and suggestions, putting the sponsor of this article in a better perspective.

Of course, Engineering.com needs sponsoring to host its content, and vendors are willing to spend marketing money on that. However, it would be fairer to mention in a footnote who sponsored the article – although per article you can guess. Some more sincere editors or bloggers mention the sponsoring that might have influenced their opinion.

Now, why did the article PTC puts the Needle to the Digital Thread make me react?

Does a visionary CEO pay off?

It can be great to have a visionary CEO; however, do they make the company and their products/services more successful? For every successful visionary CEO, there are perhaps ten failing visionary CEOs, because the stock market or their customers did not catch the vision.

There is no lack of PLM vision as Peter Bilello mapped in 2014 when imagining the gaps between vision, available technology, and implementations at companies (leaders and followers). See below:

The tremendous gap between vision and implementations is the topic that concerns me the most. Modern PLM is about making data available across the enterprise or even across the company's ecosystem. It is about data democratization that allows information to flow and to be presented in context, without the need to recreate this information.

And here the marketing starts. Verdi writes:

PTC’s Internet of Things (IoT), Industrial Internet of Things (IIoT), digital twin and augmented reality (AR) investments, as well as the collaboration with Rockwell Automation in the factory automation arena, have definitely placed the company in a leading position in digital product realization, distribution and aftermarket services

With this marketing sentence, we are eager to learn why

“With AR, for example, we can improve the quality control of the engines,” added Volvo Group’s Bertrand Felix, during an on-stage interview by Jim Heppelmann. Heppelmann then went down to a Volvo truck with the engine lifted out of its compartment. Using a tablet, he was able to show how the software identified the individual engine, the parts that were included, and he could also pick up the 3D models of each component and at the same time check that everything was included and in the right place.

Impressive – is it real?

The point is that this is the whole chain for digital product realization–development and manufacturing–that the Volvo Group has chosen to focus on. Sub-components have been set up that will build the chain, much is still in the pilot stage, and a lot remains to be done. But there is a plan, and the steps forward are imminent.

OK, so it is a pilot, and a lot remains to be done – but there is a plan. I am curious about the details of that plan, as a little later, we learn from the CAD story:

The Pro/ENGINEER “inheritor” Creo (engine, chassis) is mainly used for CAD and creation of digital twins, but as previously noted, Dassault Systémes’ CATIA is also still used. Just as in many other large industrial organizations, Autodesk’s AutoCAD is also represented for simpler design solutions.

There goes the efficient digital dream. Design data coming from CATIA needs to be recreated in Creo for digital twin support. Data conversion or recreation is an expensive exercise and needs to be reliable and affordable as the value of the digital twin is gone once the data is incorrect.

In a digital enterprise, you do not want silos to work with their own formats, you want a digital thread based on (neutral) models that share metadata/parameters from design to service.

So I dropped the article and noticed Oleg had already commented faster than me in his post: Does PLM industry need a visionary pageant? Oleg also refers to CIMdata, as they confirmed in 2018 that the concept of a platform for product innovation (PIP), or "beyond PLM," is far from reality in companies. Most of the time, a PLM implementation is mainly a "beyond PDM" environment, not really delivering product data downstream.

I am wholly aligned with Oleg's technical conclusion:

What is my(Oleg’s) conclusion? PLM industry doesn’t need another round of visionary pageants. I’d call democratization, downstream usage and openness as biggest challenges and opportunities in PLM applications. Recent decades of platform development demonstrated the important role network platforms played in the development of global systems and services. PLM paradigm change from isolated vertical platforms to open network services required to bring PLM to the next level. Just my thoughts..

My comments to Oleg’s post:

(Jos) I fully agree we do not need more visionary PLM pageants. It is not about technology and therefore I have to disagree with your point about Aras. You call it democratization and openness of data a crucial point – and here I agree – be it that we probably disagree about how to reach this – through standards or through more technology. My main point to be made (this post ) is that we need visionary companies that implement and rethink their processes and are willing to invest resources in that effort. Most digital transformation projects related to PLM fail because the existing status quo/ middle management has no incentive to change. More thoughts to come

And this is the central part of my argumentation – it is not about technology (only).

Organizational structures are blocking digital transformation

Since 2014, I have been following several larger manufacturing companies on their path from pushing products to the market in a linear mode towards becoming a customer-driven, more agile, fast-responding enterprise. As this is done by taking advantage of digital technologies, we call this process digital transformation.

(image depicting GE’s digital thread)

What I have learned from these larger enterprises, with both Volvo Trucks and GE as examples, is that there is a vision for an end result. For GE, it is the virtual twin of their engines, monitored and improved by their Predix platform. For Volvo Trucks, we saw the vision in the quote from Verdi's article above.

However, these companies are failing in creating a horizontal mindset inside their own organizations. Data can only be used efficiently downstream if there is a willingness to work on collecting the relevant data upstream and delivering this information in an accessible format, preferably data-driven.

The Middle Management Dilemma

And this leads to my reference to middle management. Middle managers learn about the C-level vision and are pushed to make this vision happen. However, they are measured and driven to solve these demands mainly within their own division or discipline. Yes, they might create goodwill for others, but when it comes to money spent or changing people's responsibilities, the status quo will remain.

I wrote about this challenge in The Middle Management dilemma. Digital transformation, of course, is enabled by digital technologies, but that does not mean the technology creates the transformation. The crucial point lies in making companies more flexible in their operations while establishing better and new contacts with customers.

It is interesting to see that the future of businesses is looking into agile, multidisciplinary teams that can deliver incremental innovations to the company's portfolio – somehow going back to the startup culture inside a larger enterprise. Having worked with several startups, you see the outcome-focus as a whole in the beginning – everyone contributes. Then, when the size of the company grows, middle management is introduced, and most likely silos are created as the middle managers get their own profit & loss targets.

Digital Transformation myths debunked

This week Helmut Romer (thanks, Helmut) pointed me to the following HBR article: Digital does not need to be disruptive, where the following myths are debunked:

  1. Myth: Digital requires radical disruption of the value proposition.
    Reality: It usually means using digital tools to better serve the known customer need.
  2. Myth: Digital will replace physical.
    Reality: It is a "both/and."
  3. Myth: Digital involves buying start-ups.
    Reality: It involves protecting start-ups.
  4. Myth: Digital is about technology.
    Reality: It's about the customer.
  5. Myth: Digital requires overhauling legacy systems.
    Reality: It's more often about incremental bridging.

If you want to understand these five debunked myths, take your time to read the full article; it is very much aligned with my argumentation, albeit that my focus is more on the PLM domain.

Conclusions

Vendor sponsoring at Engineering.com has not improved the quality of their PLM articles and creates misleading messages, especially as the sponsor is not mentioned and the sponsor is selling technology – the gap between vision and reality is too big to compete on vision alone.

Transforming companies to take benefit of new technologies requires an end-to-end vision and mindset based on achievable, incremental learning steps. The way your middle management is managed and measured needs to be reworked as the focus is on horizontal flow and understanding of customer/market-oriented processes.

 

This is the moment of the year when, at least in my region, most people take some time off to disconnect from their day-to-day business. For me, it is never a full disconnect, as PLM has become my passion, and you should never switch off your passion.

On August 1st, 1999, I started my company TacIT, the same year the acronym PLM was born. I wanted to focus on knowledge management, hence the name TacIT. Being dragged into the SmarTeam world, with a unique position interfacing between R&D, implementers, and customers, I found a unique sweet spot that helped me see all aspects of PLM – the vendor position, the implementer's view, the customer's end-user view, and the management view.

It has been, and still is, 20 years of learning, and I have been sharing most of it in the past ten years through my blog. What I have learned is that the more you know, the more you understand that situations are not black and white. See one of my favorite blog pictures below.

So there is enough to overthink during the holidays. Some of my upcoming points:

From coordinated to connected

Instead of using the over-hyped term: Digital Transformation, I believe companies should learn to work in a connected mode, which has become the standard in our daily life. Connected means that information needs to be stored in databases somewhere, combined with openness and standards to make data accessible. For more transactional environments, like CRM, MES, and ERP, the connected mode is not new.

In the domain of product development and sales, we still have a long learning path to go, as the majority of organizations rely on documents, be it Excel files, drawings (PDF), or reports. The fact that they are stored in electronic file formats does not mean that they are accessible. Manpower is still needed to create these artifacts or to extract the required information from them.
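A small, purely illustrative sketch of the difference: once the same information is stored as data (a plain list of records stands in for a database here, with invented part numbers and values), a question is answered by a query instead of by someone opening and re-reading documents.

```
# "Stored electronically" is not the same as "accessible":
# when part data lives as data, a question becomes a query.
parts = [
    {"part_id": "P-001", "mass_kg": 1.2, "status": "Released"},
    {"part_id": "P-002", "mass_kg": 0.4, "status": "In Work"},
    {"part_id": "P-003", "mass_kg": 2.9, "status": "Released"},
]

released_mass = sum(p["mass_kg"] for p in parts if p["status"] == "Released")
print(f"Total mass of released parts: {released_mass:.1f} kg")
# With the same information locked in PDFs and Excel reports, answering this
# would mean opening each document and re-typing the values by hand.
```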

The challenge for modern PLM is to establish new best practices around a model-based approach for systems engineering (MBSE), for engineering to manufacturing (MBD/MBE), and for operations (Digital Twins). All these best practices should ultimately be generic and connected. I wrote about these topics in the past – have a look at my earlier posts.

PLM vendors are showing pieces of the puzzle, but it is up to the implementers to assemble the puzzle, without knowing in detail what the end result will be. This is the same journey as Columbus's: he had a boat and a target towards the unknown. He discovered a country with a small population, nowadays a country full of immigrants who call themselves natives.

However, the result was an impressive transformation.

Reading about transformation

Last year I read several books to get more insight into what motivates us, and how we can motivate people to change. In one way, it is disappointing to learn that we civilized human beings, most of the time, do not make rational decisions but act based on our prehistoric brain.

 

Thinking, Fast and Slow by Daniel Kahneman was one of the first books in that direction, a must-read to understand our personal thinking and decision processes.


I read Idiot Brain: What Your Head Is Really Up To by Dean Burnett, where he explains how our brain appears to be sabotaging our life, and what on earth it is really up to. Interesting to read, but it could be a little more comprehensive.

 

I got more excited by Dan Ariely's book Predictably Irrational: The Hidden Forces That Shape Our Decisions, as it is structured around topics where we act completely irrationally but predictably. And this predictability is used by people (sales/politicians/management) to drive your actions. Useful to realize when you recognize the situation.

 

These three books also illustrate the flaws of our modern time – we communicate fast (preferably through tweets), we decide fast based on our gut feelings – so you realize what kind of world we are heading towards. Going through a transformation should be considered a slow learning process. Like reading a book, it takes time to digest.

Once you are aiming at a business transformation for your company or supporting a company in its transformation, the following books were insightful:

Leading Digital: Turning Technology into Business Transformation by George Westerman, Didier Bonnet, and Andrew McAfee is maybe not the most inspiring book; however, as it stays close to what we experience in our day-to-day life, it is certainly a book to read to get a foundational understanding of business transformation.

 

The book I liked most recently was Leading Transformation: How to Take Charge of Your Company's Future by Nathan Furr, Kyle Nel, and Thomas Zoega Ramsoy, as it gives examples of transformations addressing parts of the irrational brain to build a transformation story. I believe in storytelling instead of business cases for transformation. I wrote about it in my blog post PLM Measurable or a myth, referring to Yuval Noah Harari's book Sapiens.

Note: I am starting my holidays now with a small basket of e-books. If you have any recommendations for books that I must read, please write them in the comments of this blog.

Discussing transformation

After the summer holidays, I plan to have fruitful discussions around topics close to PLM. I am working on a post to start a conversation related to PLM, PIM, and Master Data Management. The borders between these domains are perhaps getting vaguer in a digital enterprise.

Further, I am looking forward to a discussion around the value of PLM assisting companies in developing sustainable products. A sustainable and probably circular economy is required to keep this earth a place to live for everybody. The whole discussion around climate change, however, is worrying as we should be Thinking – not fast and slow – but balanced.

A circular economy has been several times a topic during the joint CIMdata PLM Roadmap and PDT conferences, which bring me to the final point.

On 13th and 14th November this year I will participate again in the upcoming PLM Roadmap and PDT conference. This time in La Defense, Paris, France. I will share my experiences from working with companies trying to understand and implement pieces of a digital transformation related to PLM.

There will be inspiring presentations from other speakers, all working on aspects of moving towards a connected enterprise. It is not a marketing event; it is done by professionals, serving professionals. Therefore, I hope that if you are passionate about the new aspects of PLM, no matter how you label them, you come and join, discuss and, most of all, learn.

Conclusion

 

Modern life is about continuous learning  – make it a habit. Even a holiday is again a way to learn to disconnect.

How disconnected I was you will see after the holidays.


I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the "dark days," and the celebration of new life.

Of course, my PLM-twisted brain never stops associating and looking for an analogy. Last week I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:

"Digital Transformation – it used to be called PLM," said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.

Was this my Easter egg surprise? I thought we were in the middle of the PLM Renaissance, as some other vendors and consultants call this era. Have a look at a recent Engineering.com TV report: Turning PLM on its head.

All jokes aside, the speech from Peter Schroer contained some interesting statements and I want to elaborate on them in this post as the space to comment in LinkedIn is not designed for a long answer.

PLM is Digital Transformation?

In the past few years, there has been a discussion whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber's 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC's vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.

Also during the PDT conferences, we discussed if PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.

For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at this moment is the one from caterpillar to butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question is even: will there be a full metamorphosis in the end, or will we keep on working in two different modes of operation?

However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future is that PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditional PLM-mapped capabilities.

You can see these differences already by taking an in-depth look at how Oracle, SAP or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM Vendors. So Digital transformation is not the replacement of PLM.

Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation

Myth #1: DX leverages disruptive tech

Peter Schroer:

 It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers

Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing accessible data along the whole lifecycle with the right quality.

This means you need to have a management team that understands the need for data governance and data quality, and understands the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy, and execution of the strategy – marketing does not make it happen.

 

Myth #2: DX results in increased market share, revenue, and profit

Peter Schroer:

Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.

Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and therefore increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven’t transformed their PLM-domain, but other parts of their businesses.

Also, it is interesting to read a 2017 McKinsey post: The case for digital reinvention, where you will get the confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. Interesting to see where companies focus their digital strategies – picture below:

While only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners: connecting ecosystems (in real time).

Myth #3: Market leaders are the most successful at DX

Peter Schroer:

If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continuing to shrink below 20 years.

How to Attain Sustainable Digital Transformation

– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.

– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.

– Align people & processes.  Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.

Here I agree with Peter on the business side, be it that some of the current market leaders are already digital – look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.

Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them. Learning to work in a different mode.

And this is extremely difficult for a market leader – as a market leader, the fear of a changing outside world is often not felt. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.

I wrote about the middle management challenge in my last blog post: The Middle Management dilemma. Almost in the same week there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
Conclusion: It is not (only) about technology as some of the tech geeks may think.

Conclusion

Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing – probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insights into all aspects. I hope this post gave you some more ideas, and please read the connected articles – they are quite relevant if you want to discover what's below the surface.
