In the last two weeks, three events led to this post.

First, I read John Stark's recent book Products2019, a must-read for anyone who wants to understand the full reach of product lifecycle-related activities. See my recent post: Products2019, a must-read if you are new to PLM

Afterwards, I talked with John, discussing the lack of knowledge and teaching of PLM – not to be confused with PLM capabilities and features.

Second, I participated in an exciting PI DX USA 2020 event. Some of the sessions and most of the roundtables provided insights to me and, hopefully, many other participants. You can get an impression in the post: The Weekend after PI DX 2020 USA.

As I wrote, a small disappointment in that event was the closing session with six vendors. It is evident that when you put a group of vendors in the arena, it will be about scoring points instead of finding alignment. Still, having criticism does not mean blaming, and I am always open to a dialogue. For that reason, I am grateful for their sponsorship and contribution.

Oleg Shilovitsky cleverly pointed out that this statement is a contradiction:

“How can you accuse PLM vendors of having a limited view on PLM and thanking them for their contribution?”

I hope the above explanation says it all, combined with the fact that I grew up in a Dutch culture of not hiding friction while remaining respectful to others.

We cannot reduce PLM to just a better tool or technology, or to 3D for everybody. A real conference should cover the many more people and processes involved in product lifecycle management; however, many of them will not sponsor events.

It is well illustrated in John Stark’s book. Many disciplines are involved in the product lifecycle. Therefore, if you only focus on what you can do with your tool, it will lead to an incomplete understanding.

If your tool is a hammer, you hope to see nails everywhere around you to demonstrate your value.

The third event was a LinkedIn post from John Stark – 16 groups needing Product Lifecycle Knowledge – which for me was a logical follow-up on the previous two events. I promised John to go through these 16 groups and provide my thoughts.

Please read his post first, as I will not rewrite what John has already said.

CEOs and CTOs

John suggested that they should read his book, which might take more than eight hours. CEOs and CTOs, most of the time, do not read this type of detailed book, so that is probably mission impossible.

They want to keep up with the significant trends and need to think about future business (model).

New digital and technical capabilities allow companies to move from a linear, coordinated business towards a resilient, connected business. This requires exploring future business models and working methods by experimenting in real life, not in Proofs of Concept. Creating a learning culture and allowing experiments to fail is crucial, as you only learn by failing.

CDOs, CIOs and Digital Transformation Executives

They are the crucial people to help the business imagine what digital technologies can do. They should educate the board and the business teams about the power of having reliable, real-time data available for everyone connected. Instead of standardizing on systems and optimizing the silos, they should assist and lead in building a new infrastructure for connected services and end-to-end flows, delivered on connected platforms.

These concepts won’t be realized soon. However, doing nothing is a big risk, as the traditional business will decline in a competitive environment. Time to act.

Departmental Managers

These are the people who should worry about their job in the long term. Their current mission might be to optimize their department within its own Profit & Loss budget. The future is about optimizing the information flow for the whole value chain, including suppliers and customers.

I wrote about it in "The Middle Management Dilemma." Departmental managers should become more like team leaders, inspiring and supporting their team members instead of controlling the numbers.

Product Managers

This is a crucial role for the future, assuming a product manager is not only responsible for the marketing or development side of the product but also gets responsibility for understanding what happens with the product during production and sales performance. Understanding the full lifecycle performance and cost should be their mission, supported by a digital infrastructure.

Product Developers

They should read the book Products2019 to become aware of how much is related to their work. From this understanding, a product developer should ask the question:

“What can I do better to serve my internal and external customers?”

This question will not arise in a hierarchical organization where people are controlled by managers who have a mission to optimize their silo. Product developers should be trained and coached to operate in a broader context, which should be part of your company's mission. Too many people complain about the usability of their authoring and data management systems without a holistic understanding of why change processes and configuration management are needed.

Product Lifecycle Management (PLM) deployers

Here I have a slight challenge, as this might be read as PLM-system users. However, it should be clear that we mean here people using product data at any moment along the product lifecycle, not necessarily in a single system.

This is again related to your company’s management culture. In the ideal world, people work with a purpose and get informed on how their contribution fits the company’s strategy and execution.

Unfortunately, in most hierarchical organizations, the strategy and total overview get lost, and people become measured resources.

New Hires and others

John continues with five other groups within the organization. I will not comment on them, as the answers are similar to the ones above – it is about organization and culture.

Educators and Students

This topic is very close to my heart, and one of the reasons I continue blogging about PLM practices. There is not enough attention for product development methodology or processes. Engineers can get many years of education in specific domains, like product design principles, available tools and technologies, and performing physical and logical simulations.

Far less time is spent on current best practices and business models for product lifecycle management.

Check in your country how many vendor-independent, methodology-oriented trainings you can find. Perhaps the only consistent organization I know is CIMdata, where the challenge is that they deliver training to companies, after students have graduated. It would be great if educational institutes would embed serious time for product lifecycle management topics in their curriculum. The challenge, of course, is the time and budget needed to create materials and, next, to prioritize this topic on the overall agenda.

I am happy to participate in a Specialized Master program, Products and Buildings Digital Engineering Manager (INGENUM). This program, organized by Arts et Métiers in France, helps create the overview needed to understand PLM and BIM – delivered in French; before COVID-19, it was an on-site training course in Paris.

Hopefully, there are more institutes offering PLM education – feel free to add them in the comments of this post.

Consultants, Integrators and Software Company Employees

Of course, it would be nice if everyone in these groups understood the total flow and processes within an organization and how they relate to each other. Too often, I have seen experts in a specific domain, for example a 3D CAD-system, having no clue about revisioning, the relation of CAD to the BOM, or the fundamentals of configuration management.

Consultants, integrators, and software company employees have their own challenges, as their business model often relies on specialized skills they can sell to their clients; broader, more general knowledge only comes from experience on the job.

And if you work three years full-time on a single project, or perhaps on three projects, your broader knowledge does not grow fast. You might become the hammer that sees nails everywhere.

For that reason, I recommend everyone in my ecosystem to invest personal time in reading about related topics of interest. Read LinkedIn posts from others and learn to differentiate between marketing messages and people willing to share experiences. Don't waste your time on the marketing messages; react and participate in the other discussions. A "Like" is not enough – ask questions or add your insights.

In the context of my personal learning, I mentioned that I participated in the Digital Twin conference in the Netherlands this week. Unfortunately, due to the partial lockdown, it was mainly a virtual event.

I got several new insights that I will share with you soon. The event illustrated that while Digital Twin as a buzzword might be hype, several participants showed examples of where they applied or plan to apply Digital Twin concepts. A great touch with reality.

Another upcoming conference, starting next week, is the PLM Roadmap 2020 – PDT conference. The theme, Digital Thread—the PLM Professionals' Path to Delivering Innovation, Efficiency, and Quality, is not a marketing theme, as you can learn from the agenda. Step by step, we are learning here from each other.


Conclusion

John Stark started with the question of who needs Product Lifecycle Knowledge. In general, knowledge is power, and it does not come for free – it comes through consultancy, reading, or training. Related to Product Lifecycle Management, everyone must understand the bigger picture: executives, because they need to steer the company in the right direction; everyone else, to streamline the company and enjoy working in a profitable environment where you contribute and can even inspire others.

An organization is like a human body; you cannot have individual cells or organs that optimize themselves only – we have a name for that disease. Want to learn more? Read this poem: Who should be the boss?


After the series about "Learning from the past," it is time to start looking towards the future. I learned from several discussions that I am probably working most of the time with advanced companies. I hope this will motivate companies that lag behind to look into the future even more.

If you look into the future for your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin. A company might want to reduce its time to market or improve collaboration between all stakeholders. These objectives can be realized through different ways of working and an IT-infrastructure that allows these processes to become digital and connected.

That is the "game". Coming back to the future of PLM: we do not need a discussion about definitions; I leave this to the academics and vendors. We will see the same applies to the concept of a Digital Twin.

My statement: the digital twin is not new. Everybody can have their own digital twin, as long as they interpret the definition in their own way. Does this sound like the PLM definition?

The definition

I like to follow the Gartner definition:

A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.

As you see, not a narrow definition. Now we will look at the different types of interpretations.

Single-purpose siloed Digital Twins

  1. Simple – data only

One of the most straightforward applications of a digital twin is, for example, my Garmin Connect environment. When cycling, my device registers performance parameters (speed, cadence, power, heartbeat, location). After every trip, I can analyze my performance: I can see changes in my overall performance and compare my performance with others in my category (weight, age, sex).

Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.
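
To make "data only" concrete, here is a minimal sketch in Python – with invented field names and numbers, not the actual Garmin Connect API – showing that such a twin is essentially recorded parameters plus some analysis:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Ride:
    """One recorded trip - the raw data mirrored from the physical world."""
    distance_km: float
    avg_speed_kmh: float
    avg_power_w: float
    avg_heart_rate: int

# The "twin" is just the collection of recorded trips
rides = [
    Ride(42.0, 28.5, 185, 142),
    Ride(55.0, 29.1, 192, 145),
    Ride(38.0, 30.2, 198, 141),
]

# Analysis: watts per heartbeat as a simple fitness indicator
efficiency = [r.avg_power_w / r.avg_heart_rate for r in rides]
print(f"Average efficiency: {mean(efficiency):.2f} W/bpm")
print("Trend improving" if efficiency[-1] > efficiency[0] else "Trend declining")
```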

On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel's presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at Eindhoven University of Technology, in the "smart city" of the Netherlands.

  2. Medium – data and operating models

Many connected devices in the world use the same principle: an airplane engine, an industrial robot, a wind turbine, a medical device, a train carriage – they all track their performance through this connection between physical and virtual, based on some sort of digital connectivity.

The business case here is also monitoring performance, predicting maintenance, and upgrading the product when needed.

This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintenance and overhaul has to be found. Repairs are disruptive and can be extremely costly; a manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
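
As an illustration of the "data and operating model" principle, here is a minimal sketch, with an invented model and thresholds: a simple model of expected behavior turns raw readings from the connected physical asset into a maintenance decision:

```python
# Operating model: expected bearing temperature as a function of load.
# Readings that deviate too far from the model indicate wear.
def expected_temperature(load_pct: float) -> float:
    return 40.0 + 0.45 * load_pct  # simplified, invented model

def needs_maintenance(measured_temp: float, load_pct: float,
                      tolerance: float = 8.0) -> bool:
    """Flag the asset when reality drifts away from the virtual model."""
    return measured_temp - expected_temperature(load_pct) > tolerance

# Real-time readings coming from the connected physical asset
readings = [(60.0, 68.0), (75.0, 74.5), (80.0, 88.0)]  # (load %, temp in C)
for load, temp in readings:
    if needs_maintenance(temp, load):
        print(f"Load {load}%: temp {temp}C exceeds model - schedule maintenance")
```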

  3. Advanced – data and connected 3D model

The digital twin we see most in marketing videos is a virtual twin, using a 3D-representation for understanding and navigation. The 3D-representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation takes place.

Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.

The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying the physical airplane – the behavior model behind the simulator should match the real behavior as closely as possible. However, as it is a model, it will never be 100 % reality and will require updates when new findings or product changes appear.

A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs) when, in the physical world, there is no place or time to conduct such training. Here the complexity might be lower: there is a 3D Model; however, serious updates can only be expected after a major maintenance or overhaul activity.

These practices are not new either and are used in places where the physical training cannot be done.

More challenging is the Augmented Reality (AR) use case. Here the virtual model – most of the time a lightweight 3D Model – connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR-environment might project actual data from the machine, indicating service points and service procedures.

The business case for such an opportunity is clear: it ensures service engineers always work with the right information in a real-time context. The main obstacles for implementing AR in reality are access to the data, the presentation of the data, and keeping the data in the AR-environment matching reality.

And although there are 3D Models in use, they are, to my knowledge, always created in silos, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.

Several of the cases mentioned above will be discussed there. The conference's target is to share real cases, concluded by Q&A sessions – crucial for a virtual event.

Connected Virtual Twins along the product lifecycle

So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle: the manufacturing twin and the development twin. This image from Siemens illustrates the concept:

On slides, a completely integrated framework is imagined – the future vision. Let us first zoom in on the individual connected twins.

The digital production twin

This is the area of virtual manufacturing: creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic; DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.

At that time, the business case was based on the fact that defining a manufacturing plant and process virtually allows you to optimize the plant before investing in physical assets.

This saves money, as there is no costly prototype phase needed to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and for many companies, still the current situation).

With the need to be more flexible in manufacturing to address individual customer orders without increasing the overhead of delivering these customer-specific solutions, there is a need for a configurable plant that can produce these individual products (batch size 1).

This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.

This is partly what Industry 4.0 is about: using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory into a dynamic "smart" factory.
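
A highly simplified sketch of this idea, with invented stations and operations: the virtual plant model proposes a routing for each individual order instead of relying on a fixed line:

```python
# Virtual plant model: which stations can perform which operations
station_capabilities = {
    "cell_A": {"milling", "drilling"},
    "cell_B": {"drilling", "coating"},
    "cell_C": {"assembly", "testing"},
}

def plan_routing(required_ops: list[str]) -> list[tuple[str, str]]:
    """Configure a production route for one customer-specific order."""
    routing = []
    for op in required_ops:
        station = next((s for s, caps in station_capabilities.items()
                        if op in caps), None)
        if station is None:
            raise ValueError(f"No station can perform: {op}")
        routing.append((op, station))
    return routing

# Batch size 1: each order may need a different route through the plant
print(plan_routing(["milling", "coating", "assembly", "testing"]))
```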

In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner, therefore requiring manual rework. This is the area of Model-Based Definition, and I have written about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing

The business case for this type of digital twin, of course, is to be able to deliver customer-specific products with extremely competitive speed and reduced cost compared to standard approaches. It could be your company's survival strategy. Even though the future is hard to predict, as we see from COVID-19, it is still crucial to anticipate instead of waiting.

The digital development twin

Before a product gets manufactured, there is a product development process. In the past, this was purely mechanical, with some electronic components. Nowadays, many companies are actually manufacturing systems, as the software controlling the product plays a significant role. In this context, model-based systems engineering is the upcoming approach for defining and testing a system virtually before committing to the physical world.

Model-Based Systems Engineering can define a single complex product and perform all kinds of analysis on the system, even before there is a physical system in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that having a model-based systems engineering environment combined with modularity (not to be confused with model-based) is a solid foundation for dealing with unique custom products. Solutions can be configured and validated against their requirements already during the engineering phase.
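
To illustrate the principle – not any specific MBSE tool – here is a minimal sketch with invented modules and requirements: a modular system is configured and validated against its requirements before anything physical exists:

```python
# Modules with their virtual properties (invented values)
modules = {
    "battery_S": {"capacity_kwh": 40, "mass_kg": 250},
    "battery_L": {"capacity_kwh": 60, "mass_kg": 380},
    "motor_std": {"power_kw": 110, "mass_kg": 90},
}

# Requirements the configured system must satisfy
requirements = {
    "min_capacity_kwh": lambda s: s["capacity_kwh"] >= 50,
    "max_mass_kg":      lambda s: s["mass_kg"] <= 500,
}

def validate(selection: list[str]) -> dict[str, bool]:
    """Aggregate module properties and check each requirement virtually."""
    system = {"capacity_kwh": 0, "mass_kg": 0, "power_kw": 0}
    for name in selection:
        for prop, value in modules[name].items():
            system[prop] += value
    return {req: check(system) for req, check in requirements.items()}

print(validate(["battery_L", "motor_std"]))
# {'min_capacity_kwh': True, 'max_mass_kg': True}
```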

The business case for the digital development twin is easy to make: shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results, you will certainly need to change your ways of working and the tools you are using. So it won't be that easy!

For those interested in Industry 4.0 and the Model-Based Systems Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, there is a lot of attention for the Digital Twin and Model-Based approaches.

Three digital half-days with hopefully a lot to learn while keeping our feet on the ground. In particular, I am looking forward to Marc Halpern's keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True

Conclusion

It has been very noisy on the internet related to product features and technologies, probably due to COVID-19 and the disrupted interactions between all of us – vendors, implementers, and companies trying to adjust their future. The Digital Twin is an excellent framing for concepts that everyone can relate to. Choose your business case and then look for the best matching twin.

On March 22 this year, I wrote Time to Think (and act differently) in the middle of a changing world. We were entering a lockdown in the Netherlands due to the COVID-19 virus. As it was such a disruptive change, it was an opportunity for companies to adapt their current ways of working.

The reason for that post was my experience when discussing PLM-initiatives with companies. Often they have no time to sit down, discuss and plan their PLM targets as needed. Crucial people are too busy, leading to an implementation of a system that, in the best case, creates (some) benefits.

The well-known cartoon says it all. We are often too busy doing business as usual, which makes us feel comfortable. Only when it is too late are people forced to act. As the second COVID-19 wave seems to be starting in the Netherlands, I want to look back on what has happened so far in my ecosystem.

Virtual Conferences

As people could not travel anymore, traditional PLM-conferences could no longer be organized. What was going to be the new future for conferences? TECHNIA, apparently clairvoyant, organized their virtual PLM Innovation Forum as one of the first, at the end of April.

A more sustainable type of PLM-conference was already part of their plans, given the carbon footprint a traditional conference induces. With over 1000 participants, the event showed that being prepared for a virtual conference pays off during a pandemic.

Being first does not always mean being the best, as we had to learn. While preparing my session for the conference, I felt the same excitement as for a traditional conference. You can read about my initial experience here: The weekend after the PLM Innovation Forum.

Some weeks later, having attended some other virtual conferences, I realized that some points should be addressed/solved:

  • Video conferencing is a must – without seeing people talking, it becomes a podcast.
  • Do not plan long conference days. It is hard to sit behind a screen for a full day. A condensed program makes it easier to attend.
  • Virtual conferences mean that they can be attended live from almost all around the globe. Therefore, finding the right timeslots is crucial for the audience – combined with the previous point – shorter programs.
  • Playing prerecorded sessions without a Q&A session should be avoided. It does not add value.
  • A conference is about networking and discussion – I have not seen a solution for this yet. Fifty percent of the conference value for me comes from face-to-face discussions and coffee meetings. A virtual conference needs to have private chat opportunities between attendees.

In the last quarter of this year, I will present at several, mostly local, conferences, sometimes mixed with a limited number of "live" attendees, if that will be allowed.

And then there is the upcoming PLM Road Map & PDT Fall 2020 (virtual) conference on 17-18-19 November.

This conference has always been my favorite conference thanks to its continued focus on sharing experiences, most of the time based on industry standards. We discuss topics and learn from each other. See my previous posts: The weekend after 2019 Day 1, 2019 Day 2, 2018 Day 1, 2018 Day 2, 2017 Day 1, 2017 Day 2, etc.

The theme Digital Thread—the PLM Professionals' Path to Delivering Innovation, Efficiency, and Quality has nothing to do with marketing. You can have a look at the full schedule here. Although there is a lot of buzz around the Digital Thread, presenters discuss the reality and their plans.

Later in this post, in the paragraph Digital Thread is not a BOM, I will elaborate on this theme.

Getting tired?

I discovered I am getting tired, as I am missing face-to-face interaction with people. Working from home and having video calls is probably a very sustainable way of working. However, unplanned social interaction – meeting each other at the coffee machine or during the breaks at a conference or workshop – is also crucial for informal interaction.

Apparently, several others in my ecosystem are struggling too. I noticed a tsunami of webinars and blog posts, many of them an attempt to be noticed – probably for the same reason: traditional business has stalled. And it is all about Digital Transformation and SaaS at this moment. Meaningless if there is no interaction.

In this context, I liked Jan Bosch’s statement in his article: Does data-driven decision-making make you boring? An article not directly addressing the PLM-market; however, there is a lot of overlap related to people’s reluctance to imagine a different future.

My favorite quote:

 I still meet people that continue to express beliefs about the world, their industry, their customers or their own performance that simply aren’t true. Although some, like Steve Jobs, were known for their “reality distortion field,” for virtually all of us, just wishing for something to be true doesn’t make it so. As William Edwards Deming famously said: in God we trust; all others must bring data.

I fully concur with this statement and always get suspicious when someone claims the truth.

Still, there are some diamonds.

I enjoyed all episodes of Minerva PLM TV – Jennifer Moore started this series in the early COVID-19 days (coincidence?). She has been able to collect interviews with known and less-known people in the PLM-domain. As most of them are vendor-independent, these episodes are a great resource to get educated.

The last episode, with Angela Ippisch, illustrates how often PLM in companies depends on a few enthusiastic people who have the energy to educate themselves. Angela mentions there is a lot of information on the internet; the challenge is to separate the useful information from the marketing.

For the past five months, I have been publishing a series of posts under the joint theme learning from the past to understand the future. In these posts, I explained the evolution from PDM to PLM, resulting in the current item-centric approach with an EBOM, MBOM, and SBOM.

On purpose, one post every two weeks – to avoid information overflow. Looking back, it took more posts than expected, and they illustrate the many different angles there are in the PLM domain – not a single truth.

Digital Thread is not a BOM

I want to address this point because I realized that in the blogging world, there appear to be two camps when discussing PLM terminology. Oleg Shilovitsky, CEO@OpenBOM, claims that Digital Thread and Digital Twin are just fancy marketing terms. I was even more surprised to read his post: 3 Reasons Why You Should Avoid Using The Word “Model” In PLM. Read the comments and discussions in these posts (if LinkedIn allows you to navigate).

Oleg's posts almost always give me something to discuss. I would be happier if other people with different backgrounds participated in these discussions too – a "Like" is not a discussion. The risk in a virtual world is that it becomes a person-to-person debate, and we have seen the damage such debates can do to an entire community.

In the discussion we had related to the Digital Thread and the BOM, I realized that when we talk about traditional products, the BOM and the Digital Thread might be the same. This is how we historically released products to the market: once produced, there were no more changes. In these situations, you could state that a PLM-backbone based on BOM-structures/views – the EBOM, MBOM, and SBOM – provides a Digital Thread.

The different interpretation comes when talking about products whose behavior is defined by software. Take a computer: the operating system can be updated on the fly, while the mechanical system remains the same. To specify and certify the behavior of the computer, we cannot rely on the BOM anymore.

Having software in the BOM and revising the BOM every time there is a software change is mission impossible – a mistake suggested ten years ago, when we started to realize the different release cycles of hardware and software. Still, it is all about the traceability of all information related to a product along its whole lifecycle.

In a connected environment, we need to manage the relationships between the BOM and other artifacts. Managing these relations in a connected environment is what I would call the Digital Thread – a layer above PLM. While writing this post, I saw Matthias Ahrens' post stating the same (click on the image to see the post).
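
A minimal sketch of what managing relations above the BOM could look like as a data structure – typed links between artifacts that live in different systems (all identifiers are hypothetical); following the links is following the thread:

```python
# Each artifact lives in its own system; the thread only stores typed relations
relations = [
    ("EBOM-ITEM-4711",  "is_defined_by",    "CAD-MODEL-0815"),
    ("EBOM-ITEM-4711",  "satisfies",        "REQ-0042"),
    ("EBOM-ITEM-4711",  "is_controlled_by", "SW-BASELINE-2.3"),
    ("SW-BASELINE-2.3", "verified_by",      "TEST-RUN-981"),
]

def trace(artifact: str, depth: int = 0) -> None:
    """Follow the thread: print everything reachable from one artifact."""
    for source, rel, target in relations:
        if source == artifact:
            print("  " * depth + f"{source} --{rel}--> {target}")
            trace(target, depth + 1)

trace("EBOM-ITEM-4711")
```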

When we discuss managing all the relations, we touch the domain of Configuration Management. Martijn Dullaart/Martin Haket's picture shares the same mindset – here, CM is the overlapping layer.

However, in their diagram, it is not a system picture; the different systems do not need to be connected. Configuration Management is the discipline that maintains the correct definition of every product – CM maintains the Thread. When it becomes connected, it is a Digital Thread.

As I have reached my 1500 words, I will not zoom in on the PLM and Model discussion – form your own opinion. We have to realize that the word Model always requires a context. Perhaps many of us coming from the traditional PDM/PLM world (managing CAD data) think about CAD models. As I studied physics before even touching CAD, I grew up with a different connotation.

Lars Taxén's comment in this discussion perhaps says it all (click on the image to read it). If you want to learn and discuss more about the Digital Thread and Models, register for the PLM Roadmap & PDT2020 event, as many of the sessions are in this context (and not about 3D CAD).

Conclusion

I noticed I am getting tired of all the information streams crying for my attention and look forward to real social discussions, not broadcasts. Time to think differently requires such discussions, so feel free to contact me if you want to reflect on your thoughts. My next action will be a new series named Painting the future (as we now understand the past) – to stay motivated.

This time a short post (for me), as I am in the middle of the series "Learning from the past to understand the future" and currently collecting information for next week's post. However, recently Rob Ferrone, the original Digital Plumber, pointed me to an interesting post from Scott Taylor, the Data Whisperer.

In code: The Virtual Dutchman discovered the Data Whisperer thanks to the original Digital Plumber.

Scott’s article with the title: “Data Management Hasn’t Failed, but Data Management Storytelling Has” matches precisely the discussion we have in the PLM community.

Please read his article, and just replace the words Data Management with PLM – it could have been written for our community. In a way, PLM is a specific application of data management, so not a real surprise.

Scott’s conclusions give food for thought in the PLM community:

To win over business stakeholders, Data Management leadership must craft a compelling narrative that builds urgency, reinvigorates enthusiasm, and evangelizes WHY their programs enable the strategic intentions of their enterprise. If the business leaders whose support and engagement you seek do not understand and accept the WHY, they will not care about the HOW. When communicating to executive leadership, skip the technical details, the feature functionality, and the reference architecture and focus on:

  • Establishing an accessible vocabulary
  • Harmonizing to a common voice
  • Illuminating the business vision

When you tell your Data Management story with that perspective, it can end happily ever after.

It all resonates well with what I described in the PLM ROI Myth – it is clear that when people hear the word Myth, they give it a negative connotation; the same, by the way, goes for PLM.

The fact that we still need to learn storytelling is because most of us are so focused on technology, and sometimes on discovering the new name for PLM for the future.

Last week I pointed to a survey from the PLMIG (PLM Interest Group) and XLifcycle, inviting you to help define the future of PLM.

You are still welcome here: Towards a digital future: the evolving role of PLM in the future digital world.

Also, I saw a great interview with Martin Eigner on Minerva PLM TV, interviewed by Jennifer Moore. Martin is well known in the PLM world and has done foundational work for our community. According to Jennifer, he is considered The Godfather of PLM. This title fits nicely in today's post. Those who have seen his presentations in recent years will remember Martin talking about SysLM (System Lifecycle Management) as the future of PLM.

It is an interesting recording to watch – click on the image above to see it. Martin explains nicely why we often do not get positive feedback from PLM implementations – starting at minute 13, for those who cannot wait.

In the interview, you will discover that we often talk too much about our discipline capabilities, where the real discussion should be about business. Strategy and objectives are discussed and decided at the management level of a company. By using storytelling, we can connect to these business objectives.

The result will more likely be that a company understands why to invest significantly in PLM, as PLM is then part of its competitiveness and future continuity.

Conclusion

I shared links to two interesting posts from the last weeks. Studying them will help you to create a broader view. We have to learn to tell the right story. People do not want PLM – they have personal objectives. Companies have business objectives, and they might lead to the need for a new and changing PLM. Connecting to the management in an organization, therefore, is crucial.

Next week, again more about learning from the past to understand the future.

It's the beginning of the year. Companies are starting new initiatives, and one of them is potentially the next PLM-project. There is a common understanding that implementing PLM requires a business case with ROI and measurable results. Let me explain why this understanding is a myth – and why PLM requires a myth.

I was triggered by a re-post from Lionel Grealou, titled: Defining the PLM Business Case. Knowing Lionel is quite active in PLM and digital transformation, I was a little surprised by the content of the post. Then I noticed the post was from January 2015, already five years old. Clearly, the world has changed (perhaps the leadership has not).

So I took this post as a starting point to make my case.

In 2015, we were in the early days of digital transformation. Many PLM-projects were considered traditional linear projects: there is the AS-IS situation, and there is the TO-BE situation. Next, we know the (linear) path to the solution, and we can describe the project and its expected benefits.

It works if you understand and measure exactly the AS-IS situation and know almost entirely the TO-BE situation (misperception #1).

However,  implementing PLM is not about installing a new transactional system. PLM implementations deal with changing ways-of-working and therefore implementing PLM takes time as it is not just a switch of systems. Lionel was addressing this point:

“The inherent risks associated with any long term business benefit driven projects include the capability of the organization to maintain a valid business case with a benefit realization forecast that remains above the initial baseline. The more rework is required or if the program delivery slips, the more the business case gets eroded and the longer the payback period.”

Interesting here is the mention of "the business case gets eroded" – this is most of the time the case. Lionel proposes to track business benefits. He also mentions that the justification of the PLM-project could be done by considering PLM as a business transformation tool (misperception #2) or as a way to mitigate risks due to unsupported IT-solutions (misperception #3).

Let's dive into these misperceptions.

#1 Compare the TO-BE and the AS-IS situation

Two points here.

  1. Does your company measure the AS-IS situation? Do you know how your company performs when it comes to PLM-related processes? The percentage of time engineers spend searching for data has been investigated – however, PLM goes beyond engineering. What about product management, marketing, manufacturing, and service? Typical performance indicators mentioned are:
    • Time To Market (can you measure?)
    • Developing the right product – better market responsiveness (can you measure?)
    • Multidisciplinary collaboration (can you measure?)
  2. Do you know the exact TO-BE situation? In particular, when you implement PLM, it is likely to be in the scope of a digital transformation. If you implement to automate and consolidate existing processes, you might be able to calculate the expected benefits. However, you do not want to freeze your organization’s processes. You need to implement a reliable product data infrastructure that allows you to enhance, change, or add new processes when required. In particular, for PLM, digital transformation does not have a clear target picture and scope yet. We are all learning.

#2 PLM is a business transformation tool

Imagine you install the best product innovation platform relevant for your business and selected by your favorite consultancy firm. It might be a serious investment; however, we are talking about the future of the company, and the future is in digital platforms. So nothing can go wrong now.

Does this read like a joke? Yes, it does; however, this is how many companies have justified their PLM investment. First, they select the best tool (according to their criteria, their perception), and then the business transformation can start. Later, when the implementation turns out not so successful, the vendor and/or implementer will be blamed. Read: The PLM blame game

When you go to PLM conferences, you will often hear the same mantras: have a vision, have C-level sponsoring/involvement, no Big Bang, it is a business project, not an IT-project, and more. And vendor-sponsored sessions always talk about amazingly fast implementations (or did they mean installing the POC?).

However, most of the time, C-level approves the budget without understanding the full implications (expecting the tool will do the work); the business is too busy or does not get enough allocated time to support the implementation (expecting the tool will do the work). So often, the PLM-project becomes an IT-project, executed mainly by the cheapest implementation partner (expecting the tool will do the work). Again, this is not a joke!

A business transformation can only be successful if you agree on a vision and a learning path. The learning path will expose the fact that future value streams require horizontal thinking and reallocation of responsibilities – breaking the silos, creating streams.

Small teams can demonstrate these benefits without disrupting the current organization. However, over time, the new ways of working should become the standard, requiring different types of skills (people), different ways of working (different KPIs and P&L for departments), and ultimately different tools.

As mentioned before, many PLM-projects start from the tools – a guarantee for discomfort and/or failure.

#3 Mitigate risks due to unsupported IT-solutions

Often PLM-projects are started because the legacy environment becomes outdated. Either because the hardware infrastructure is no longer supported/affordable or the software code dependencies on the latest operating systems are no longer guaranteed.

A typical approach to solve this is a big-bang project: the new PLM system needs to contain all the old data, and meanwhile, to justify the project, it needs to bring additional business value. The latter is usually not difficult to identify, as traditional PLM implementations were in reality mostly cPDM environments with a focus on engineering only.

However, the legacy migration can have such a significant impact on the new PLM-system that it destroys the potential for the future. I wrote about this issue in The PLM Migration Dilemma

How to approach PLM ROI?

A PLM-project will never get a budget or approval from the board when there is no financial business case. Building the right financial business case for PLM is a skill that is often overlooked. During the upcoming PI PLMx London conference (3 – 4 February), I will moderate a Focus Group where we will discuss how to get PLM on the Execs' agenda.

Two of my main experiences:

  • Connect your PLM-project to the business strategy. As mentioned before, isolated PLM fails most of the time because business transformation, organizational change, and the targeted outcome are not included. If PLM is not linked to an actual business strategy, it will be considered a costly IT-project with all its bad connotations. Have a look at my older post: PLM, ROI and disappearing jobs
  • Create a Myth. Perhaps the word Myth is exaggerated – it is about an understandable vision. Myth connects nicely to the observation from behavioral experts that our brain does not decide on numbers but on emotion. Big decisions and big themes in the world or in a company need a myth: "Make our company great again" could be the tagline. In such a case, people get aligned without a deep understanding of the impact or business case; the myth will do the work – no need for a detailed business case. Typical human behavior; see also my post: PLM as a myth.

Conclusion

There should never be a business case uniquely for PLM – it should always be in the context of a business strategy requiring new ways of working and new tools. In business, we believe that having a solid business case is the foundation for success, and sometimes an overwhelming set of details and numbers can give the impression that a business case is solid. Consultancy firms are experts in this area; they know how to build a business case based on emotion, combining numbers with a myth. Therefore, look at their approach – don't be too technical / too financial. Whether the myth holds depends, in the end, on the people and the organization, not on the investments in tools and services.


For me, the joint conference from CIMdata and Eurostep is always a conference to look forward to. The conference is not as massive as PLM-vendor conferences (slick presentations and happy faces); it is more a collection of PLM-practitioners (this time 100+) with the intent to discuss and share their understanding and challenges, independent from specific vendor capabilities or features. And because of its size, it is a great place to network with everyone.

Day 1 offered more of a business/methodology view on PLM, and Day 2 was more in-depth, focusing on standards and BIM. In this post, the highlights from the first day.

The State of PLM

Peter Bilello, CIMdata's president, kicked off with a review of the current state of the PLM industry. Peter mentioned the PLM-market grew by 9.4 % to $47.8 billion (more than the expected 7 %). Good for the PLM vendors and implementers.

However, Peter also mentioned that despite higher spending, PLM is still considered a solution for engineering, often implemented as PDM/CAD data management. Traditional organizational structures – marketing, engineering, manufacturing, quality – were defined in the previous century and are measured as such.

This traditional approach blocks the roll-out of PLM across these disciplines. Who owns PLM, or where the responsibility for a certain dataset lies, are questions to solve. PLM needs to transform to deliver end-to-end support instead of remaining an engineering silo. Will we still be talking about PLM in the future? See Peter's takeaways below:


We do not want to open the discussion of whether the name PLM should change – too many debates – although, unfortunately, there has been too much framing in the past, too.

The Multi View BOM

Fred Feru from Airbus presented a status update on what the Aerospace & Defense PLM Action Group is working on: how to improve and standardize a PLM solution for multi-view BOM management, in particular the interaction between the EBOM and MBOM. See below:

You might think this is a topic already solved when you speak with your PLM-vendor. However, all existing solutions at the participants' implementations rely on customizations and vary per company. The target is to come up with common requirements that need to be addressed in a standard methodology. Initial alignment on terminology was already a required first step – before you standardize, you need a common dictionary. Moreover, this is a typical situation in EVERY PLM implementation.


An initial version was shared with the PLM editors for feedback, to arrive – after iterations and agreement – at a solution that can be implemented without customization. If you are interested in the details, you can read the current status here, with Appendix A and Appendix B.
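
For readers less familiar with the problem, a toy sketch of why the EBOM and MBOM are different views of the same product: the engineering structure is reorganized to reflect how and where the product is actually built (part numbers and the restructuring rules are invented):

```python
# EBOM: how engineering defines the product (functional structure)
ebom = {"PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-KIT-30"]}

# MBOM: how the plant builds it - make/buy detail and process materials
def derive_mbom(ebom_children: list[str]) -> dict:
    mbom = {"PUMP-100-ASSY": []}
    for part in ebom_children:
        if part == "SEAL-KIT-30":
            # this plant buys the kit pre-assembled from a supplier
            mbom["PUMP-100-ASSY"].append(("SEAL-KIT-30", "buy"))
        else:
            mbom["PUMP-100-ASSY"].append((part, "make"))
    # process consumables exist only in the manufacturing view
    mbom["PUMP-100-ASSY"].append(("GREASE-05", "consumable"))
    return mbom

print(derive_mbom(ebom["PUMP-100"]))
```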

Enabling the Circular Economy for Long Term Prosperity

Graham Aid gave a fascinating presentation on the potential and flaws of creating a circular economy. Graham is not a PLM-expert (or was not, until he left this conference); he is the Strategy and Innovation Coordinator for the Ragn-Sells Group, which performs environmental services and recycling across Sweden, Norway, Denmark, and Estonia. Have a look at their website here.


Graham shared with us that despite the logical arguments for a circular economy – it is more profitable in the end – our short-term thinking and biases block us from doing the right things for future generations.

Look at the missing link for a closed resource-lifecycle view below.

Graham shared weird examples where materials that will be scarce in the future are currently getting cheaper, and therefore there is no incentive to recycle them. A sound barrier built from rubble could contain more copper than the copper ore in a mine.

In the PLM-domain, there is also an opportunity to support and work on more sustainable products and services. It is a mindset and can be a profitable business model. At the PDT 2014 conference, there was a session on circular product development, with Xerox as the best example. Both circular product development and Product as a Service are activities that can contribute to a more sustainable world. Graham's presentation was inspiring for our PLM community and hopefully planted a few seeds for the future. As it is all about thinking long-term.


With the PLM Green Alliance, I hope we will be able to create a larger audience and participation for a sustainable future. More about the PLM Green Alliance next week.


The Fundamental Role of PLM in Data-driven Product Portfolio Management


Hannu Hannila (Polar) presented his study related to data-driven product portfolio management and why it should be connected to PLM. For many companies, it is a challenge to understand which products are performing well and where to invest. These choices are often supported by "Data Damagement," as Hannu called it.

An example below:

The result of this fragmented approach is that organizations make their decisions on subjective data and emotions. Where the assumption is that 20 % of the products a company sells relate to 80 % of the revenue, Hannu found in his research companies where only 10 % of the products contributed to the revenue. PPM (Product Portfolio Management) is often based on big emotions – a who-shouts-the-loudest mentality, influenced by the company's pet products and by the HiPPO (Highest Paid Person in the Office). So how to get a better rationale?


Hannu explained a data-driven framework that would provide the right analytics at the management level, depending on overall data governance from all disciplines and systems. See below:

I liked Hannu's conclusions, as they align with my findings:

  • To be data-driven, you need Master Data Management and Data Governance
  • Product Portfolio Management is the driving discipline for PLM, and in a modern digital enterprise, it should be connected.

Sponsor sessions

Sponsors are always needed to keep a conference affordable for the attendees. The sponsor sessions on day 1 were of good quality. Here is a quick overview and a link if you want to explore further:

  • Configit – explaining the value of a configurator that connects marketing, technical, and sales, introducing CLM (Configuration Lifecycle Management) – a new TLA
  • Aras – explaining their view on what we consider the digital thread
  • Variantum – explaining their CPQ solution as part of a larger suite of cloud offerings
  • Quick Release – bringing common sense to PLM implementations, similar to what I am doing as a PLM coach – focusing on the flow of information
  • SAP – explaining the change in focus when a company moves toward a product-as-a-service model
  • SharePLM – a unique company addressing the importance of PLM training, delivered through eLearning

Conclusion

The first day was an easy-to-digest conference with good-quality presentations. I only shared 50 % of the sessions, as we have already reached 1000+ words. In the evening, I enjoyed the joint dinner, being able to network and discuss in depth with participants, and finished with a social networking event organized by SharePLM. Next week, part 2.

In recent years, more and more PLM customers have approached me with questions related to the usage of product information for downstream publishing. To be fair, this is not my area of expertise at the moment. However, with the mindset of a connected enterprise, this topic will come up.

For that reason, I have a strategic partnership with Squadra, a Dutch-based company, providing the same coaching model as TacIT; however, they have their roots in PIM and MDM.

Together, we believe we can deliver a meaningful answer to the question: What are the complementary roles of PLM and PIM? In this post, our first joint introduction.

Note: The topic is not new. Already in 2005, Jim Brown from Tech-Clarity published a white paper: The Complementary Roles of PIM and PLM. All this before digitization and connectivity became massive.

Let's start with the abbreviations, the TLAs (Three-Letter Acronyms), and their related domains.

PLM – level 1
(Product Lifecycle Management – push)

For PLM, I want to stay close to the current definitions. It is the strategic approach to provide a governance infrastructure to deliver a product to the market – starting from an early concept phase till manufacturing and, in its extended definition, also during its operational phase.
The focus of PLM is to reduce time to market by ensuring quality, cost, and delivery through an increasingly virtual product definition, making it possible to decide upfront on the best design choices and the manufacturing options with the lowest cost. In the retail world, own-brand products are creating a need for PLM.

The image above nicely summarizes the expected benefits of a traditional PLM implementation.


MDM (Master Data Management)

When product data is shared in an enterprise among multiple systems, there is a need for Master Data Management (MDM). Master Data Management focuses on a governance approach ensuring that information stored in various systems has the same meaning and shares values where relevant.

MDM guards and streamlines the way master data is entered, processed, guarded, and changed within the company, resulting in one single version of the truth and enabling different departments and systems to stay in sync regarding their crucial data.

Interestingly, in the not-so-digital world of PLM, you do not see PLM vendors working on an MDM-approach. They do not care about an end-to-end connected strategy yet. I wrote about this topic in 2017 here: Master Data Management and PLM.

PIM (Product Information Management)

The need for PIM starts to become evident when selling products through various business channels. If you are a specialized machine manufacturer, your product information for potential customers might be very basic and based on a few highlights.

However, due to digitization and global connectivity, it has become crucial that product information is available in real-time, wherever your customers are in the world.

In a competitive world, with an omnichannel strategy, you cannot survive without having your PIM streamlined and managed.


Product Innovation Platforms (PLM – Level 2 – Pull)

With the introduction of Product Innovation Platforms, as described by CIMdata and Gartner, the borders of PLM, PIM, and MDM might become vague, as they might all be part of the same platform, reducing the immediate need for a separate MDM-environment. For example, companies like Propel, Stibo, and Oracle are building a joint PLM-PIM portfolio.

Let's dive deeper into the two scenarios we meet most in business: PLM driving PIM (my comfort zone) and PIM driving the need for PLM (Squadra's area of expertise).

PLM driving PIM

Traditionally PLM (Product Lifecycle Management) has been focusing on several aspects of the product lifecycle. Here is an excellent definition for traditional PLM:

PLM is a collection of best practices, dependent per industry to increase product revenue, reduce product-related costs and maximize the value of the product portfolio  (source 2PLM)

This definition shows that PLM is a business strategy, not necessarily a system, but an infrastructure/approach to:

  • ensure shorter time to market with the right quality (increasing product revenue)
  • deliver efficiently (reduce product-related costs – resources and scrap)
  • deliver products that bring the best market revenue (maximize the value of the product portfolio)

The information handled by traditional PLM consists mostly of design data, i.e., specifications, manufacturing drawings, 3D Models, and Bills of Materials (physical part definitions), combined with version and revision management – in elaborate environments, combined with processes supporting configuration management.

PLM data is more focused on internal processes and quality than on targeting the company’s customers. Sometimes the 3D Design data is used as a base to create lightweight 3D graphics for quotations and catalogs, combining it with relevant sales data. Traditional marketing was representing the voice of the customer.

PLM implementations are more and more providing an enterprise backbone for product data. As a result of this expansion, there is a wish to support sales and catalogs more efficiently, sharing master data from creation till publishing and combining the product portfolio with sales and service information in a digital way.

In particular, due to globalization, there was a need to make information globally available in different languages without a significant overhead of resources to manage the data or manage the disconnect from the real product data.

Companies that have realized the need for connected data understand that Product Master Data Management is more than only the engineering/manufacturing view; it is also relevant to the sales and services view. Historically, companies handled this with customized extensions on their PLM-system; now they more and more interface with specialized PIM-systems. Proprietary PLM-PIM interfaces exist; hopefully, with digital transformation, a more standardized approach will appear.

 

PIM driving the need for PLM

Because of changes in the retail market, the need for information in the publishing processes is also changing. Retailers also need to comply with new rules and legislation. The source of the required product information often lies in the design process of the product.

In parallel, there is an ongoing market trend towards more and more private-label products in the (wholesale and retail) assortments. This means a growing number of retailers and wholesalers will become producers, with their own ideation and innovation process.

A good example is ingredient and recipe information in the food retail sector. This information now needs to be provided by suppliers or by the retailer’s own brand department that owns the design process of the product – similar to RoHS or REACH compliance in industry.

Retail and wholesale companies can tackle own brands reasonably well with their PIM systems (or Excel sheets), making use of workflows and product statuses. However, over the years, the information demands have increased, and a need for more sophisticated lifecycle management has emerged – and therefore the need for PLM (in this case, PLM also stands for Private Label Management).

The image below illustrates a PLM layer and a PIM layer, both leading towards rich product information for the end users (either B2B or B2C).

In the fast-moving consumer goods (FMCG) world, most innovative products come from manufacturers. They have pipelines with lots of ideas, resulting in a limited number of sellable products. In the wholesale and retail business, the private-label development process usually has a smaller funnel but high pressure on time to market, and therefore a higher need for efficiency in the product data chain.

Technological changes, like 3D printing, also change the information requirements in the retail and wholesale sectors. 3D printing can be used for creating spare parts on-demand, thereby changing the information flow in processes dramatically. Technical drawings and models that were created in the design process and used for mass production are now needed in the retail process, closer to the end customer.

These examples make it clear that more and more information is needed for publication in the sales process and therefore needs to be present in PIM systems. This information needs to be collected and made available during the PLM release process. A seamless connection between the product release and sales processes will support the changing requirements and will reduce errors and rework on data.

PLM and PIM are two practices that need to go hand in hand, like passing the baton in a relay race. Companies that use both tools must also organize themselves so that processes are integrated and data governance is in place to keep things running smoothly.

 

Conclusion

Market changes and digital transformation force us to work in value streams along the whole product lifecycle, ensuring quality and time to market. PLM and PIM will be connected domains in the future, enabling a smooth product go-to-market. The use of data standards is important (PLM and PIM should speak a common language), preferably based on industry standards, so that cross-company communication on product data is possible.

What do you think? Do you see PLM and PIM coming together in your business, too?

Please share in the comments.


Last week I read Verdi Ogewell’s article PTC puts the Needle to the Digital Thread on Engineering.com, where Verdi raised the question (and concluded) who is the most visionary PLM CEO – Bernard Charlès from Dassault Systèmes or Jim Heppelmann from PTC. Unfortunately, it was again an advertorial, creating more haziness around modern PLM than adding value.

People need education, and Engineering.com is/was a respected site for me. As they state in their Engineering.com/about statement:

Valuable Content for Busy Engineers. Engineering.com was founded on the simple mission to help engineers be better.

Unfortunately, this is no longer the case in the PLM domain. In June, we saw an article related to the failed PLM migration at Ericsson – see The PLM migration dilemma. Besides the fact that a big-bang migration had failed at Ericsson, the majority of the article was based on rumors and suggestions, putting the sponsor of this article in a better perspective.

Of course, Engineering.com needs sponsoring to host their content, and vendors are willing to spend marketing money on that. However, it would be fairer to mention in a footnote who sponsored the article – although you can guess per article. Some more sincere editors or bloggers mention the sponsoring that might have influenced their opinion.

Now, why did the article PTC puts the Needle to the Digital Thread make me react?

Does a visionary CEO pay off?

It can be great to have a visionary CEO; however, does it make the company and its products/services more successful? For every successful visionary CEO, there are perhaps ten failing visionary CEOs whose vision the stock market or their customers did not catch.

There is no lack of PLM vision, as Peter Bilello mapped in 2014 when imagining the gaps between vision, available technology, and implementations at companies (leaders and followers). See below:

The tremendous gap between vision and implementation is the topic that concerns me the most. Modern PLM is about making data available across the enterprise or even across the company’s ecosystem. It is about data democratization, allowing information to flow and be presented in context, without the need to recreate this information.

And here the marketing starts. Verdi writes:

PTC’s Internet of Things (IoT), Industrial Internet of Things (IIoT), digital twin and augmented reality (AR) investments, as well as the collaboration with Rockwell Automation in the factory automation arena, have definitely placed the company in a leading position in digital product realization, distribution and aftermarket services

With this marketing sentence, we are eager to learn why:

“With AR, for example, we can improve the quality control of the engines,” added Volvo Group’s Bertrand Felix, during an on-stage interview by Jim Heppelmann. Heppelmann then went down to a Volvo truck with the engine lifted out of its compartment. Using a tablet, he was able to show how the software identified the individual engine, the parts that were included, and he could also pick up the 3D models of each component and at the same time check that everything was included and in the right place.

Impressive – is it real?

The point is that this is the whole chain for digital product realization–development and manufacturing–that the Volvo Group has chosen to focus on. Sub-components have been set up that will build the chain, much is still in the pilot stage, and a lot remains to be done. But there is a plan, and the steps forward are imminent.

OK, so it is a pilot, and a lot remains to be done – but there is a plan. I am curious about the details of that plan, as a little later, we learn from the CAD story:

The Pro/ENGINEER “inheritor” Creo (engine, chassis) is mainly used for CAD and creation of digital twins, but as previously noted, Dassault Systémes’ CATIA is also still used. Just as in many other large industrial organizations, Autodesk’s AutoCAD is also represented for simpler design solutions.

There goes the efficient digital dream. Design data coming from CATIA needs to be recreated in Creo for digital twin support. Data conversion or recreation is an expensive exercise and needs to be reliable and affordable, as the value of the digital twin is gone once the data is incorrect.

In a digital enterprise, you do not want silos working with their own formats; you want a digital thread based on (neutral) models that share metadata/parameters from design to service.
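
As an illustration of what I mean – a thought experiment, not an existing standard or vendor format – a digital thread could carry a neutral record per part, so downstream disciplines read shared parameters instead of recreating native CAD data:

```python
# Illustrative sketch of a neutral metadata record travelling along the digital
# thread; the field names are assumptions, not an existing standard's schema.
from dataclasses import dataclass, field

@dataclass
class NeutralPartRecord:
    part_number: str
    revision: str
    authoring_tool: str    # e.g. "CATIA" or "Creo" - the source stays known
    neutral_geometry: str  # path to a neutral format, e.g. a STEP AP242 file
    parameters: dict = field(default_factory=dict)  # shared design/service data

engine_mount = NeutralPartRecord(
    part_number="VT-55812",
    revision="C",
    authoring_tool="CATIA",
    neutral_geometry="exchange/vt-55812-c.stp",
    parameters={"material": "EN-AW 6082", "mass_kg": 2.35, "torque_Nm": 48},
)

# A service application reads the parameters without touching the native CAD file
print(engine_mount.parameters["torque_Nm"])
```

The design choice is that the native tool remains the authoring environment, while everyone downstream consumes the neutral record – no expensive recreation in a second CAD system.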

So I dropped the article and noticed Oleg had already commented faster than me in his post: Does PLM industry need a visionary pageant? Oleg also refers to CIMdata, as they confirmed in 2018 that the concept of a product innovation platform (PIP), or “beyond PLM”, is far from reality in companies. Most of the time, a PLM implementation is mainly a beyond-PDM environment, not really delivering product data downstream.

I am wholly aligned with Oleg’s technical conclusion:

What is my(Oleg’s) conclusion? PLM industry doesn’t need another round of visionary pageants. I’d call democratization, downstream usage and openness as biggest challenges and opportunities in PLM applications. Recent decades of platform development demonstrated the important role network platforms played in the development of global systems and services. PLM paradigm change from isolated vertical platforms to open network services required to bring PLM to the next level. Just my thoughts..

My comments to Oleg’s post:

(Jos) I fully agree we do not need more visionary PLM pageants. It is not about technology and therefore I have to disagree with your point about Aras. You call it democratization and openness of data a crucial point – and here I agree – be it that we probably disagree about how to reach this – through standards or through more technology. My main point to be made (this post ) is that we need visionary companies that implement and rethink their processes and are willing to invest resources in that effort. Most digital transformation projects related to PLM fail because the existing status quo/ middle management has no incentive to change. More thoughts to come

And this is the central part of my argumentation – it is not about technology (only).

Organizational structures are blocking digital transformation

Since 2014, I have been following several larger manufacturing companies on their path from pushing products to the market in a linear mode towards becoming a customer-driven, more agile, fast-responding enterprise. As this is done by taking advantage of digital technologies, we call this process digital transformation.

(image depicting GE’s digital thread)

What I have learned from these larger enterprises, with Volvo Trucks and GE as examples, is that there is a vision for an end result. For GE, it is the virtual twin of their engines, monitored and improved by their Predix platform. For Volvo Trucks, we saw the vision in the quote from Verdi’s article above.

However, these companies are failing to create a horizontal mindset inside their organizations. Data can only be used efficiently downstream if there is a willingness to collect the relevant data upstream and deliver this information in an accessible, preferably data-driven, format.

The Middle Management Dilemma

And this leads to my reference to middle management. Middle managers learn about the C-level vision and are pushed to make this vision happen. However, they are measured and driven to solve these demands mainly within their own division or discipline. Yes, they might create goodwill for others, but when it comes to money spent or changing people’s responsibilities, the status quo will remain.

I wrote about this challenge in The Middle Management dilemma. Digital transformation is, of course, enabled by digital technologies, but that does not mean the technology creates the transformation. The crucial part lies in making companies more flexible in their operations while establishing better and new contacts with customers.

It is interesting to see that the future of business lies in agile, multidisciplinary teams that can deliver incremental innovations to the company’s portfolio – somehow going back to the startup culture inside a larger enterprise. Having worked with several startups, you see the outcome focus in the beginning – everyone contributes. Then, when the size of the company grows, middle management is introduced, and most likely silos are created, as the middle managers get their own profit & loss targets.

Digital Transformation myths debunked

This week, Helmut Romer (thanks, Helmut) pointed me to the following HBR article: Digital does not need to be disruptive, in which the following myths are debunked:

  1. Myth: Digital requires radical disruption of the value proposition.
    Reality: It usually means using digital tools to better serve the known customer need.
  2. Myth: Digital will replace physical.
    Reality: It is a “both/and.”
  3. Myth: Digital involves buying start-ups.
    Reality: It involves protecting start-ups.
  4. Myth: Digital is about technology.
    Reality: It’s about the customer.
  5. Myth: Digital requires overhauling legacy systems.
    Reality: It’s more often about incremental bridging.

If you want to understand these five debunked myths, take your time to read the full article; it is very much aligned with my argumentation, albeit that my focus is more on the PLM domain.

Conclusions

Vendor sponsoring at Engineering.com has not improved the quality of their PLM articles and creates misleading messages, especially as the sponsor is not mentioned and the sponsor is selling technology – the gap between vision and reality is too big to compete on vision alone.

Transforming companies to take advantage of new technologies requires an end-to-end vision and a mindset based on achievable, incremental learning steps. The way your middle management is managed and measured needs to be reworked, as the focus should be on horizontal flow and understanding of customer/market-oriented processes.

 

Three weeks ago, I closed my PLM-twisted mind for a short holiday. Meanwhile, some interesting posts appeared about the PLM journey.

  • Is it a journey?
  • Should the journey be measurable?
  • And what kind of journey could you imagine?

Together, these posts formed the basis for a decent discussion amongst the readers. I like these discussions. For me, the purpose of blogging is not the same as tweeting. It is not about just making noise so others will chime in or react (tweeting); it is about sharing an opinion, and if more people are interested, the discussion can start. And a discussion is not about right or wrong, as many conversations happen to be nowadays; it is about learning.

Let’s start with the relevant posts.

How to measure PLM?

The initial discussion started with Oleg Shilovitsky’s post about the need to measure the value of PLM. As Oleg mentions in his comments:

“During the last decades, I learned that every company that measured what they do was winning the business and succeeded (let’s count Google, Amazon, etc ..)”

This is an interesting statement: just measure! It is the motto people use for digital businesses, in particular the fast-moving software business. Sounds great, so let’s measure PLM. What can we measure with PLM? Oleg suggests as an example:

“Let’s say before PLM implemented a specific process, sales needed 2 days to get a quote. After PLM process implementation, it is 15 min.”
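
As a quick sanity check on that claim, here is the back-of-the-envelope calculation, assuming eight-hour working days (my assumption, not Oleg’s):

```python
# Back-of-the-envelope check of the quoted speedup (assuming 8-hour working days)
before_minutes = 2 * 8 * 60   # 2 working days = 960 minutes
after_minutes = 15
print(before_minutes / after_minutes)  # -> 64.0, i.e., 64x faster quoting
```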

So what does this result tell us? Your sales team can produce 64 times more quotes. Do we need fewer salespeople now? From this KPI, we do not know the real value for the company, because there are so many other dependencies related to this process – and that makes PLM different from, for example, ERP. We are not talking about optimizing a single process, as Oleg might suggest below:

“Some of my PLM friends like to say – PLM is a journey and not some kind of software. Well, I’m not sure to agree about “journey,” but I can take PLM as a process. A process, which includes all stages of product development, manufacturing, support, and maintenance.”

Note: I do not want to be picky on Oleg, as he provokes us all many times with just his thoughts. Moreover, several of them are good points for discussion. So please dive into his LinkedIn posts and follow the conversation.

In his follow-up post on measuring the value, Oleg continued with Can we measure the PLM journey, which summarizes the comments from the previous post with a kind of awkward conclusion:

What is my conclusion? It is a time for PLM get out of old fashion guessing and strategizing and move into digital form of thinking – calculating everything. Modern digital businesses are strongly focused on the calculation and measurement of everything. Performance of websites, metrics of application usage, user experience, efficiency, AB testing of everything. Measurement of PLM related activity sounds like no brainier decision to me. Just my thoughts…

I think all of us agree that there needs to be some kind of indicative measurement in place to justify investments. There must be expected benefits that solve current business problems or bottlenecks.

The points I want to share with you are:

  • It is hard to measure non-comparable ways of working – how do you measure collaboration?
  • Do you know what to measure? – engineering/innovation is not an ERP process
  • People and culture have so much impact on the results – how do you measure your company’s capability to adapt to new ways of working?

Meanwhile, we continue our journey…

Is PLM a never-ending journey?

Prompted, I assume, by the discussion related to the PLM journey, Chad Jackson from Lifecycle Insights added his three minutes of thoughts. You can watch the video here:

Vlogging seems to be becoming more prevalent in the US. The issue for me is that vlogs only touch the surface and are hard to scan for interesting, reusable content – something you miss when you are an experienced speed-reader. I like written content, as it is easier to pick and share relevant pieces, like what I am doing now in this post.

Chad states that as long as PLM delivers quantified value, PLM could keep expanding. This sounds like a journey, and I can align with that. The only thought I would add is that it is not necessarily about expanding all the time; it is also about continuous change in the world and therefore in your organization. So instead of expanding, there might be a need to do things differently: Have you noticed PLM is changing?

Next, Chad mentions organizational fatigue. I understand the point – our society and businesses are currently changing extremely fast, which causes people to long for the past. A typical behavior I observe everywhere: in the past, everything was better. However, if companies went back and operated like in the past, they would be out of business. We moved from the paper drawing board to 3D CAD, managing it through PDM and PLM, to remain significant. So there is always a journey.

Fatigue comes from choosing the wrong directions and having a reactive culture – instead of people being inspired and motivated to reach the next stage, the current stage already causes so much stress. Due to the reactive culture, people cannot imagine a better future – they are too busy. I believe it is culture and inspiration that make companies successful – not just measuring. On the danger of avoiding change, think about the boiling frog metaphor, and you will see what I mean.

 

Upgrading to PLM when PDM falls short

At the same time, Jim Brown from Tech-Clarity published a PTC-sponsored eBook, Upgrading to PLM when PDM falls short, in which, as he states:

This eBook explains how to recognize that you’ve outgrown PDM and offers several options to find the data and process management capabilities your company needs, whether it’s time to find a more capable PDM or upgrade to PLM. It also provides practical advice on what to look for in a PLM solution, to ensure a successful implementation, and in a software partner.

Jim mentions various business drivers that can motivate this upgrade path. I challenge all the believers in measurable digital results to imagine which KPIs they would use and how these can be related to pure PLM.

Here, the upgrade process aims at replacing PDM with PLM – something PLM vendors like: immediately, a significant number of licenses for the same basic PDM functionality. For your company, this is hard to justify, as there is no additional value.

In many situations, I have seen this type of PDM upgrade project become an advanced PDM project – not PLM. The new PLM system was introduced in the engineering department and became an even bigger silo than before, as other disciplines/departments were not willing to work with this new “monster” and preferred their own systems. These companies believe that PLM is a system to be purchased and implemented, which is deadly for a real PLM strategy.

Therefore, I liked Oleg Shilovitsky’s post: 3 Reasons for Not Growing Existing PDM Into the Full PLM System. While Oleg’s points were probably more technology-driven, the value of this post was extended in the discussion. It became a discussion with various people and different opinions – one I would have liked to have in real time. The way LinkedIn filters/prioritizes comments makes it hard to get a chronological view of the discussion.

Still, if you are interested and have time for a puzzle, follow this discussion and add your thoughts.

Conclusion

During my holidays, there was a vivid discussion related to the PLM value and journey. Looking back, it is clear we are all part of a PLM journey. Some do not take part in the journey and keep hanging on to the past; those who understand the journey all see different points of interest – the characteristic of a journey.

In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. In the other approach, the PLM capabilities are developed on top of a customizable infrastructure, providing more flexibility. I believe this debate has been going on for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available, or way too expensive, for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly aerospace/defense) or home-grown BOM systems (automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated in engineering and is still considered by many people an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, the first OOTB PLM offerings appeared. There was Agile (later acquired by Oracle), focusing on the high-tech and medical industries. Instead of document management, they focused on bringing the BOM from engineering to manufacturing, based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers, whose major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your own unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills sit with the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers the other implementers as competitors, and none of them is willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same is currently happening in the Aras channel – it might be called open source; however, it is probably only the high-level infrastructure.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement, if you wanted.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say no to a customer if you can say yes?

OOTB and modules

Initially, for the leading PLM vendors, these templates were not just aimed at the mid-market. All companies wanted a standardized PLM system with as few customizations as possible. For the PLM vendors, this meant they had to package their functionality into modules: sometimes addressing industry-specific capabilities, sometimes areas of interfacing (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior; otherwise, the value of the module would not be relevant. And this causes a challenge: the more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management – people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), nor by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.
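
To make the data-model-plus-behavior point tangible, here is a deliberately simplified sketch (mine, not any vendor’s code) of a change management module: it ships both the state model and the rule on top of it, and it is exactly that embedded rule which asks people to work differently.

```python
# Sketch of why a module = data model + business logic (illustrative only).
ECO_STATES = ["Draft", "In Review", "Approved", "Released"]  # data model part

def promote(eco: dict) -> dict:
    """Business logic part: the module enforces a generic review practice."""
    current = ECO_STATES.index(eco["state"])
    if eco["state"] == "In Review" and not eco.get("approvers"):
        raise ValueError("Generic practice: no promotion without approvers")
    eco["state"] = ECO_STATES[min(current + 1, len(ECO_STATES) - 1)]
    return eco

eco = {"id": "ECO-0042", "state": "Draft", "approvers": []}
promote(eco)  # Draft -> In Review works
# promote(eco) would now fail until approvers are assigned - the embedded
# behavior is what makes people change their way of working.
```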

OOTB modularity advantages

The first advantage of modularity in PLM software is that you only buy the software pieces you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified upfront becomes a cost issue – mainly because the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, in practice, they can be separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extensibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed for the future, for example, digital twins, AR/VR, and model-based ways of working. Skeptics might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages are advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Modules are often developed based on some leading customer experiences (the big ones), while the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB approach. SMB implementations are often not visible at the PLM vendor’s R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey. You start with a clear business need – for example, in modern PLM, creating a digital thread – and once this is achieved, you dive deeper into the areas of the lifecycle that require improvement. Increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you run the risk that the low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and the users will have a better relationship while delivering capabilities and, therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford supposedly said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on the characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not immediately provide an innovative way of working. The toolkit may have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and the methodology for this innovative way of working need to come from either your company’s resources or your implementer’s skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices if you want to stay more competitive. This approach can be valid if you truly have a competing plan here. Otherwise, the risk is that you are creating a legacy that will slow your company down later.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. In the past, I spent a lot of time analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.
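
To show in a generic way why such choices matter (nothing SmarTeam-specific, purely an illustration of mine): the same query behaves very differently when the data model maintains an index compared to scanning a generic object store.

```python
# Generic illustration: the same query with and without an index (not tied to
# any particular PLM tool). With n parts, the scan is O(n), the index O(1).
parts = [{"id": f"PN-{i:06d}", "material": "steel" if i % 2 else "alu"}
         for i in range(100_000)]

# Choice 1: linear scan over a generic object store - simple but O(n) per query
steel_scan = [p for p in parts if p["material"] == "steel"]

# Choice 2: maintain an index at write time - O(1) lookup per query
index: dict[str, list[dict]] = {}
for p in parts:
    index.setdefault(p["material"], []).append(p)
steel_indexed = index["steel"]

assert len(steel_scan) == len(steel_indexed)
```

At a few hundred parts, nobody notices the difference; at enterprise scale, the wrong choice brings the system to its knees.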

Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as the initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience, or an experienced partner/coach, can help you in these situations. In general, it is rare for a company to have experienced PLM implementers in-house, as implementing PLM is not their core business. Experienced PLM implementers vary in size and skills – make the right choice.

 

Conclusion

After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The points that remain undecided are the people involved and your business model. It needs to be an end-to-end coherent approach, no matter which option you choose.
