
We are now in the middle of the year, usually a quiet time for me and a good moment to reflect on what has happened so far and to look forward.

Three themes dominated my writing this half-year:

  • The changing roles of (PLM) consultancy
  • The disruptive effect of digital transformation on legacy PLM
  • The Model-driven approaches

A short summary per theme follows, with links to the original posts for those who haven't followed the sequence.

The changing roles of (PLM) consultancy

Triggered by Oleg Shilovitsky's post Why traditional PLM ranking is dead. PLM ranking 2.0, a discussion started about how companies choose PLM and the changing role of the consultant. Oleg and I agreed that using the word dead in a post title is a way to catch extra attention. And as many people do not read beyond the introduction, this is a way to frame ideas (not invented by us; look at your newspaper and social media posts). Please take your time and read this post till the end.

Oleg and I concluded that the traditional PLM status reports provided by consultancy firms are no longer relevant. They focus on the big vendors, maintaining a status quo, and most of them are 80% the same in their core PLM capabilities. The real challenge is how to select a PLM approach for your company.

Here Oleg and I differ in opinion. I look at PLM more from a business transformation point of view: how to improve your business with new ways of working. The role of a consultant is crucial here, as the consultant can help formalize the company's vision and the areas to focus on for PLM. The value of the PLM consultant is to bring experience from other companies instead of inventing new strategies per company. And yes, a consultant should get paid for this added value.

Oleg believes more in the bottom-up approach where new technology will enable users to work differently and empower themselves to improve their business (without calling it PLM). More or less concluding there is no need for a PLM consultant as the users will decide themselves about the value of the selected technology. In the context of Oleg’s position as CEO/Co-founder of OpenBOM, it is a logical statement, fighting for the same budget.

The discussion ended during the PLMx conference in Hamburg, where Oleg and I debated in front of an audience, recorded by MarketKey. You can find the recording Panel Discussion: Digital Transformation and the Future of PLM Consulting here.
Unfortunately, like many discussions, it ended without a conclusion. My conclusion remains the same: companies need PLM coaching!

The related posts on this topic are:

 

The disruptive effect of digital transformation on legacy PLM

A topic that I have discussed over the past two years is that current PLM is not compatible with modern, data-driven PLM. Note: data-driven PLM is still under development. Where in most companies the definition of products is stored in documents / files, I believe that in order to manage the complexity of future products, hardware and software, there is a need to organize data around models, not files. See also: From Item-centric to model-centric?

For a company it is extremely difficult to run two approaches in parallel, as the first reaction is: "let's convert the old data to the new environment".

This has proven impossible in most of the engagements I am involved in, and here I introduced the bimodal approach as a way to keep the legacy going (mode 1) while scaling up the new environment (mode 2).

A bimodal approach is sometimes acceptable when the PLM software comes from two different vendors. Sometimes this is also called the overlay approach: the old system remains in place, and a new overlay is created to connect the legacy PLM system and potentially other systems, like ALM or MBSE environments. For example, some of the success stories of Aras complementing Siemens PLM.

Like the bimodal approach, the overlay approach creates the illusion that in the near future the old legacy PLM will disappear. I partly share that illusion, if you consider the near future a period of 5-10+ years, depending on the company's active products. Faster is not realistic.

And related to bimodal, I now prefer the terminology used by McKinsey: our insights/toward an integrated technology operating model, in the context of PLM.

The challenge is that PLM vendors are reluctant to support a bimodal approach for their own legacy PLM, as then suddenly the vendor becomes responsible for all connectivity between mode 1 and mode 2 data; every vendor wants to sell only the latest.

I will elaborate on this topic during the PDT Europe conference in Stuttgart on Oct 25th. No posts on this topic this year (yet), as I am still discussing, learning and collecting examples from the field. What kept me relatively busy was the next topic:

The Model-driven approaches

Most of my blogging time I spent explaining the meaning behind a modern model-driven approach and its three main aspects: Model-Based Systems Engineering, Model-Based Definition and Digital Twins. As some of these aspects are still in the hype phase, it was interesting to see two different opinions popping up. On one side, people claiming the world is still flat (2D), considering model-based approaches just another hype caused by the vendors. There is apparently no need for digital continuity. If you look into the reactions from certain people, you might conclude it is impossible to have a dialogue; throwing opinions around is not a discussion.

One reason might be that the people reacting strongly have never experienced model-based efforts themselves and just chime in, or they might have a business reason not to agree with model-based approaches, as these do not align with their business. It is like the climate-change debate: will people who benefit from the status quo vote against it when the facts are known? Just my thoughts.

There is also another group, to which I am connected, that is quite active in learning and formalizing model-based approaches, in order to move toward a digital enterprise where information is connected and flowing through various models (behavior models, simulation models, software models, 3D models, operational models, etc.). This group of people is discussing standards and how to use and enhance them. They discuss and analyze with arguments and share lessons learned. One of the best upcoming events in that context is the joint CIMdata PLM Road Map EMEA and PDT Europe 2018; look at the agenda via the image link and get involved too, if you really care.

 

And if you are looking for a wider, less geeky type of conference, consider the PI PLMx CHICAGO 2018 conference on Nov 5 and 6. The agenda provides a wider range of sessions; however, I am sure you can find people interested in discussing model-based learnings there too, in particular in Stream 2: Supporting the Digital Value Chain.

My related posts to model-based this year were:

Conclusion

I spent a lot of time demystifying some PLM-related themes. The challenge remains, as in the non-PLM world, that it is hard to get educated by blog posts, as you might get over-informed by (vendor-related) posts all surfing somewhere on the hype curve. Do not stop at the catchy title; investigate and take time to understand HOW things will work for you or your company. There are enough people explaining WHAT they do, but HOW it fits in a current organization needs to be solved first. Hence the above three themes.


This is my concluding post on the various aspects of the model-driven enterprise. We went through Model-Based Systems Engineering (MBSE), where the focus was on using models (functional / logical / physical / simulations) to define complex products (systems). Next we discussed Model-Based Definition / Model-Based Enterprise (MBD/MBE), where the focus was on data continuity between engineering and manufacturing by using the 3D Model as a master for design, manufacturing and eventually service information.

And last time we looked at the Digital Twin from its operational side, where the Digital Twin is applied for collecting data from and tuning physical assets in operation, which in my opinion is not a typical PLM domain.

Now we will focus on two areas where the Digital Twin touches aspects of PLM: the most challenging and, I believe, the most over-hyped areas. These two areas are:

  • The Digital Twin used to virtually define and optimize a new product/system or even a system of systems. For example, defining a new production line.
  • The Digital Twin used as the virtual replica of an asset in operation. For example, a turbine or engine.

Digital Twin to define a new Product/System

There might be some conceptual overlap if you compare the MBSE approach and the Digital Twin concept for defining a new product or system to deliver. For me, the differentiation is that MBSE is used to master and define a complex system from the R&D point of view: unknown solution concepts (hardware or software?) and unknown constraints, to be refined and optimized in an iterative manner.

In the Digital Twin concept, it is more about defining a system that should work in the field: how to combine various systems into a working solution, where each of the systems already has a pre-defined set of behavioral / operational parameters, which could be 3D-related but also performance-related.

You would define and analyze the new solution virtually to discover the ideal solution for performance, costs, feasibility and maintenance. Working in the context of a virtual model might take more time than traditional ways of working; however, once the models are in place, analyzing and optimizing the solution takes hours instead of weeks, assuming the virtual model is based on a digital thread, not a sequential process of creating and passing documents/files. Virtual solutions allow a company to optimize the solution upfront instead of fixing it expensively during delivery, commissioning and maintenance.

Why aren't we doing this already? It takes more skilled engineers instead of cheaper fixers downstream. The fact that we are used to fixing things later is also an inhibitor for change. Management needs to trust and understand the economic value instead of trying to reduce the number of engineers because they are expensive and hard to plan.

In the construction industry, companies are discovering the power of BIM (Building Information Model), introduced to enhance the efficiency and productivity of all stakeholders involved. Massive benefits can be achieved if the construction of a building and its future behavior and maintenance are optimized virtually, compared to fixing issues expensively in reality when they pop up.

The same concept applies to process plants or manufacturing plants, where you could virtually run the (manufacturing) process. If the design is done with all the behavior defined (hardware-in-the-loop and software-in-the-loop simulation), the solution has been virtually tested and can be delivered rapidly, with no late discoveries and costly fixes.

Of course it requires new ways of working. Working with digitally connected models is not what engineers learn during their education; we have just started this journey. Therefore organizations should explore on a smaller scale how to create a full Digital Twin based on connected data: this is the ultimate base for the next purpose.

Digital Twin to match a product/system in the field

When you follow the topic of the Digital Twin through the materials provided by the various software vendors, you see all kinds of previews of what is possible: Augmented Reality, Virtual Reality and more. All these presentations show that when clicking somewhere in a 3D model space, relevant information pops up. Where does this relevant information come from?

Most of the time the information is re-entered in a new environment, sometimes derived from CAD, but all the metadata comes from people collecting and validating data. Not the type of work we promote for a modern digital enterprise. These inefficiencies are fine for learning and demos, but in the final stage a company cannot afford silos where data is collected and entered again, disconnected from the source.

The main problem: legacy PLM information is stored in documents (drawings / Excel files) and is not intended to be shared downstream with full quality.
Read also: Why PLM is the forgotten domain in digital transformation.

If a company has already implemented an end-to-end Digital Twin to deliver the solution, as described in the previous section, we can assume the data has been entered somewhere during the design and delivery process and, thanks to digital continuity, it is there.

How many companies have done this already? For sure not the companies that have been in business a long time, as their current silos and legacy processes do not cater for digital continuity. By appointing a Chief Digital Officer the journey might start, the biggest risk being that the Chief Digital Officer ends up running yet another silo in the organization.

So where does PLM support the concept of the Digital Twin operating in the field?

For me, the IoT part of the Digital Twin is not the core of PLM. Defining the right sensors, controls and software is the first area where IoT is used to define the measurable/controllable behavior of a Digital Twin. This topic was discussed in the previous section.

The second part where PLM gets involved is twofold:

  • Processing data from an individual twin
  • Processing data from a collection of similar twins

Processing data from an individual twin

Data collected from an individual twin or a collection of twins can be analyzed to discover potential failures. An R&D organization is interested in learning what is happening in the field with their products. These analyses lead to better and more competitive solutions.

Predictive maintenance is not necessarily a part of that. When you know that certain parts will fail between 10,000 and 20,000 operating hours, you want to optimize the moment of service to reduce downtime of the process, and you do not want to replace parts way too early.
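To make this trade-off concrete, here is a minimal sketch. The failure window, the cost figures and the uniform failure model are all invented for illustration: replacing too early wastes useful part life, replacing too late risks an expensive unplanned stop, and observed failure hours let you pick the service moment in between.

```python
import random

# Illustrative assumptions: failures observed between 10,000 and 20,000
# operating hours; a planned replacement is 5x cheaper than an unplanned stop.
COST_PLANNED = 1.0
COST_UNPLANNED = 5.0

def cost_per_hour(replace_at, failure_hours):
    """Average cost per operating hour when always replacing at `replace_at`."""
    total_cost = total_hours = 0.0
    for fail_at in failure_hours:
        if fail_at < replace_at:          # part failed before the planned service
            total_cost += COST_UNPLANNED
            total_hours += fail_at
        else:                             # planned replacement, full use of life
            total_cost += COST_PLANNED
            total_hours += replace_at
    return total_cost / total_hours

random.seed(1)
observed = [random.uniform(10_000, 20_000) for _ in range(5_000)]

# Scan candidate service moments: too early wastes life, too late risks failure.
candidates = range(5_000, 20_001, 500)
best = min(candidates, key=lambda t: cost_per_hour(t, observed))
print(best)
```

With these made-up numbers the scan settles on servicing right at the start of the observed failure window, which is exactly the kind of service-moment optimization described above.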


The R&D part related to predictive maintenance could be that R&D develops sensors inside this serviceable part that signal the need for maintenance in a much smaller time frame: maintenance needed within 100 hours instead of a bandwidth of 10,000 hours. Or R&D could develop new parts that need less service and guarantee longer up-time.

For an R&D department, the information from an individual Digital Twin might only be relevant if the Physical Twin is complex to repair and the downtime per individual asset is too high. Imagine a jet engine, a turbine in a power plant or similar. Here a Digital Twin allows service and R&D to prepare maintenance and to simulate and optimize the actions before executing them in the physical world.

The five potential platforms of a digital enterprise

The second part R&D will be interested in is the behavior of similar products/systems in the field, combined with their environmental conditions. In this way, R&D can discover improvement points for the whole range and deliver incremental innovation. The challenge for the R&D organization is to find a logical placeholder in their PLM environment to collect commonalities related to the individual modules or components. This is not an ERP or MES domain.

Concepts of a logical product structure are already known in the oil & gas, process and nuclear industries. In 2017 I wrote about PLM for Owners/Operators, mentioning that Bjorn Fidjeland has always been active in this domain; you can find his concepts at plmPartner here or as an eLearning course at SharePLM.

To conclude:

  • This post is way too long (sorry)
  • PLM is not dead – it evolves into one of the crucial platforms for the future – The Product Innovation Platform
  • The current BOM-centric approach within PLM is blocking progress toward a full digital thread

More to come after the holidays (a European habit), with additional topics related to the digital enterprise.

 

A month ago I announced a series of posts related to the various facets of Model-Based. As I do not want to write a book for a limited audience, I still believe blog posts are an excellent way to share knowledge and experience with a wider audience. Remember: PLM is about sharing!

There are three downsides to this approach:

  • You have to chunk the information into pieces; my aim is not to exceed 1000 words per post
  • Isolated posts can be taken out of context (in a positive or negative way)
  • You do not become rich and famous by selling your book

Model-Based ways of working are a hot topic and crucial for a modern digital enterprise. The modern digital enterprise does not exist yet to my knowledge, but the vision is there. Strategic consultancy firms are all actively exploring and explaining the potential benefits; I have mentioned McKinsey / Accenture / Capgemini before.

In the domain of PLM there is a bigger challenge, as here we suffer from the fact that the word "Model" immediately gets associated with a 3D Model. In addition to the 3D CAD Model, there is still a lot of useful legacy data that does not match the concepts of a digital enterprise. I wrote and spoke about this topic a year ago, among others at PI 2017 Berlin; you can check this presentation on SlideShare: How digital transformation affects PLM.

Back to the various aspects of Model-Based

My first post, Model-Based – an introduction, described what I intended to explain. I got some interesting feedback and insights from my readers. Some of the people who responded understood that the crucial characteristic of the model-based enterprise is to use models to master a complex environment. Business models, mathematical models and system models are all part of a model-based enterprise, and none of them has a necessary relation to the 3D CAD model.

Why Model-Based?

Because it is an approach to master complex environments! If you study the concepts for a digital enterprise model, it is complex. Artificial intelligence and predictive actions all need a model to deliver. The interaction and responses related to my first blog post did not show any problems, only a positive mindset to explore further. For example, if you read this blog post from Contact, you will see the message came across very well: Model-Based in Model-Based Systems Engineering – what's up?

Where the confusion started

My second post, Why Model-Based? The 3D CAD Model, focused on the various aspects related to the 3D CAD model, without going into all the details. In particular, in the PLM world there is a lot of discussion around Model-Based Design and Model-Based Definition, where new concepts are discussed to connect engineering and manufacturing in an efficient and modern data-driven way. Lifecycle Insights, Action Engineering, Engineering.com, PTC, Tech-Clarity and many more companies are publishing information related to the model-based engineering phase.

Here I was surprised by Oleg's post Model-Based Confusion in 3D CAD and PLM.

If you read his post, you get the impression that the model-based approach is just a marketing issue instead of a significant change towards a digital enterprise. I quote:

Here is the thing… I don’t see much difference between saying PLM-CAD integration sharing data and information for downstream processes and “model-driven” data sharing. It might be a terminology thing, but data is managed by CAD-PLM tools today and accessed by people and other services. This is how things are working today. If model-driven is an approach to replace 2D drawings, I can see it. However, 2D replacement is something that I’ve heard 20 years ago. However, 2D drawings are still massively used by manufacturing companies despite some promises made by CAD vendors long time ago.

I was surprised by the simplicity of this quote, as if CAD vendors were responsible for new ways of working. In particular, automotive and aerospace companies are pushing for a model-based connection between engineering and manufacturing to increase quality, improve time to market and reduce handling costs. Model-based definition is not just a marketing issue, as you can read from the benefits reported by Jennifer Herron (Re-use your CAD – the model-based CAD handbook, describing practices and benefits already in 2013) or Tech-Clarity (The How-To Guide for adopting model-based definition, describing practices and benefits, sponsored by SolidWorks).

Oleg's post unleashed several reactions from people who shared his opinion (read the comments here). They are all confused: it is all about marketing / let's not change / too complex. Responses you usually hear from a generation that does not feel and understand the new approaches of a digital enterprise. If you are in the field, working with multiple customers trying to understand the benefits of model-based definition, you would not worry about terminology; you would try to understand it and make it work.

Model-Based – just marketing?

In his post, Oleg refers to CIMdata's explanation of the various aspects of model-based in the context of PLM. Instead of just referring to the meaning of the various acronyms, Peter Bilello (CIMdata) presented an excellent story on the various model-based aspects at the latest PDT conference (Oct 2017, Gothenburg). In fact, the whole conference was dedicated to the various aspects of a Model-Based Enterprise, illustrating that it is not a vendor marketing issue. You can read my comments from the vendor-neutral conference here: The weekend after PDT Europe 2017 Part 1 and Part 2.

There were some dialogues on LinkedIn this weekend, and I promised to publish this post first before continuing on the other aspects of a model-based enterprise. Just today Oleg published a follow-up post on this topic, Model-Based marketing in CAD and PLM, where again the tone and blame are directed at the PLM/CAD vendors, as you can see from his conclusion:

I can see “mode-based” as a new and very interesting wave of marketing in 3D CAD and PLM.  However, it is not pure marketing and it has some rational. The rational part of model-based approach is to have information model combined from 3D design and all connected data element. Such model can be used as a foundation for design, engineering, manufacturing, support, maintenance. Pretty much everything we do. It is hard to create such model and it is hard to combine a functional solution from existing packages and products. You should think how to combine multiple CAD systems, PLM platforms and many other things together. It requires standards. It requires from people to change. And it requires changing of status quo. New approaches in data management can change siloed world of 3D CAD and PLM. It is hard, but nothing to do with slides that will bring shiny words “model-base”. Without changing of technology and people, it will remain as a history of marketing

Again it shows a narrow mindset on the future of a model-based enterprise. When it comes to standards, I recommend you register for and watch CIMdata's educational webinar Model-Based Enterprise and Standards. John MacKrell, CIMdata's chairman, gives an excellent overview and status of the model-based enterprise initiative. After having studied and digested all the links in this post, I challenge you to make up your own mind. The picture below comes from John's presentation, illustrating where we currently are with model-based definition.

 

Conclusion

The challenge of modern business is that too often we conclude too fast on complex issues, or we frame new developments because they do not fit our purpose. You know it from politics. Be aware it is also valid in the world of PLM. Innovation and a path to a modern digital enterprise do not come easy; you need to invest in and learn all the aspects. To be continued (and I do not have all the answers either).

In recent years I have mentioned the term model-based several times in the context of a modern, digital enterprise. Posts like Digital PLM requires a model-based enterprise (Sept 2016) or Item-Centric or Model-Centric (Sept 2017) describe some aspects of a model-based approach. And if you follow the PLM vendors in their marketing messages, everyone seems to be looking for a model-based environment.

This is, however, in big contrast with reality in the field. In February this year I moderated a focus group related to PLM and the model-based approach, and the main conclusion from the audience was that everyone was looking at it and only a few had started practicing. Therefore, I promised to provide some step-by-step education related to model-based, as, like PLM, we need to get a grip on what it means and how it impacts your company. As I am not an academic person, it will be a little like model-based for dummies; however, as model-based in all its aspects is not yet a widespread common practice, we are all learning.

What is a Model?

The word Model has various meanings, and this is often the first confusion when people speak about Model-Based. The two main interpretations in the context of PLM are:

  • A Model represents a 3D CAD Model – a virtual definition of a physical product
  • A Model represents a scientific / mathematical model

And although these are the two main interpretations, there are more aspects of model-based to look at in the context of a digital enterprise. Let's explore the 3D CAD Model first.

The role of the 3D CAD Model in a digital enterprise

Just designing a product in 3D and then generating 2D drawings for manufacturing is not really game-changing and does not bring big benefits. 3D models provide a better understanding of the product; mechanical simulations allow the engineer to discover clashes and/or conflicts, and this approach contributes to a better understanding of the form & fit of a product. Older generations of designers know how to read a 2D drawing and reconstruct the 3D model in their mind.

Modern generations of designers are no longer trained to start from 2D, so their way of thinking is related to 3D modeling. Unfortunately, businesses, in particular when acting in ecosystems with suppliers, still rely on the 2D definition as the legal document. The 3D model has brought some quality improvements, and these benefits already justify why most companies design in 3D; still, it is not the revolution a model-based enterprise can bring.

A model-based enterprise has to rely on data, so the 3D model should be built on parameters that other applications can read. These parameters can contribute to simulation analysis and product optimization, or they can contribute to manufacturing. In both cases the parameters provide data continuity between the various disciplines, eliminating the need to create new representations in different formats. I will come back in a future post to the requirements for the 3D CAD model in the context of the model-based enterprise, where I will zoom in on Model-Based Definition and the concepts of Industry 4.0.
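As a toy illustration of parameters other applications can read (the feature, names and numbers are all invented), consider a CAD feature whose defining parameters are plain data: a manufacturing-planning step then consumes the designer's values directly instead of someone re-entering them from a drawing.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HolePattern:
    """Defining parameters of a hypothetical 3D CAD feature, exposed as data."""
    diameter_mm: float
    depth_mm: float
    count: int
    tolerance_mm: float   # doubles as the manufacturing acceptance criterion

def drilling_time_min(feature: HolePattern, feed_mm_per_min: float) -> float:
    """Manufacturing planning reads the same parameters the designer defined."""
    return feature.count * feature.depth_mm / feed_mm_per_min

# The designer's values flow unchanged into the downstream calculation.
pattern = HolePattern(diameter_mm=6.0, depth_mm=12.0, count=8, tolerance_mm=0.05)
print(drilling_time_min(pattern, feed_mm_per_min=24.0))  # 8 * 12 / 24 = 4.0
```

The point is not the arithmetic but the single source: simulation, optimization and manufacturing all read the same `HolePattern` values rather than copies in different formats.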

The role of mathematical models in a digital enterprise

The mathematical model of a product allows companies to analyze and optimize its behavior. When companies design a product, they often start from a conceptual model and, by running simulations, optimize the product and define low-level requirements within a range that optimizes product performance. The relation between design and simulation in a virtual model is crucial to be as efficient as possible. In current ways of working, design and simulation are often not integrated, and therefore the number of simulations is relatively low, as time-to-market is the key driver to introduce a new product.

In a digital enterprise, design and simulations are linked through parameters, allowing companies to iterate and quickly select the optimal solution for the market. This part is closely related to model-based systems engineering (MBSE), where the focus is on defining complex systems. In the context of MBSE, I will also zoom in on the relation between hardware and software, which in the end deliver the desired functionality for the customer. Again, in this part we will zoom in on the importance of having a parameter model to ensure digital continuity.
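A minimal sketch of that design/simulation link (the "simulation" here is a made-up stand-in formula, not a real solver): because the design parameter is data the simulation can read, sweeping candidate values and selecting the best one is automatic, with no file hand-over between design and analysis.

```python
def simulated_efficiency(blade_angle_deg: float) -> float:
    """Stand-in for a real simulation: an invented model peaking at 23 degrees."""
    return 0.90 - 0.001 * (blade_angle_deg - 23.0) ** 2

# The design parameter feeds the simulation directly; each iteration is
# just another function call instead of a manual exchange of documents.
candidates = [15.0 + 0.5 * i for i in range(33)]   # 15.0 .. 31.0 degrees
best_angle = max(candidates, key=simulated_efficiency)
print(best_angle)  # 23.0
```

With an integrated loop like this, running hundreds of iterations costs seconds, which is the speed-up the parameter link is meant to deliver.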

Digital Twin

There is still a debate whether the Digital Twin is part of PLM or should be connected to PLM. A digital twin can be based on a set of parameters that represent the product's performance in the field. There is no need for a 3D representation, despite the fact that many marketing videos show a virtual image to visualize the twin.

Depending on the business desire, there can be various digital twins for the same product in the field, all depending on the parameters you want to monitor. Again it is about passing parameters, in this case from the field back to R&D, and these parameters should be passed in a digital manner. In a future post I will zoom in on the targets and benefits of the digital twin.
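A digital twin reduced to its essence, a monitored parameter set, could be sketched like this (all parameter names, ranges and readings are invented): the twin compares field readings against the design-intent ranges defined by R&D, and no 3D representation is involved at all.

```python
# Design-intent ranges defined by R&D (hypothetical values).
DESIGN_RANGES = {
    "bearing_temp_c": (20.0, 85.0),
    "vibration_mm_s": (0.0, 4.5),
    "output_kw": (450.0, 520.0),
}

def deviations(field_reading: dict) -> list:
    """Return the parameters whose field value falls outside design intent."""
    out = []
    for name, (low, high) in DESIGN_RANGES.items():
        value = field_reading.get(name)
        if value is None or not (low <= value <= high):
            out.append(name)
    return out

# One reading passed digitally from the field back to R&D.
reading = {"bearing_temp_c": 91.2, "vibration_mm_s": 3.1, "output_kw": 480.0}
print(deviations(reading))  # ['bearing_temp_c']
```

Different twins for the same product would simply monitor different parameter sets against different ranges, which matches the "various digital twins per product" idea above.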

Conclusion

There are various aspects to consider related to "model-based". The common thread between the aspects is PARAMETERS. The more you can work with parameters to connect the various usages of a product/system, the closer you are to a digital enterprise. The real advantages of a digital enterprise are speed (information available in real-time) and end-to-end visibility (as data is not locked in files / closed systems).

PARAMETERS: the objects that create digital continuity.

 

 

 

 

When PLM – Product Lifecycle Management – was introduced, one of the main drivers was to provide an infrastructure for collaboration and for sharing product information across the whole lifecycle. The top picture shows my impression of what PLM could mean for an organization at that time. The PLM circle showed a sequential process from concept, through planning, development and manufacturing, toward after-sales and/or services where relevant. PLM would provide centralization and continuity of data. Through this continuity we could break down the information silos in a company.

Why do we want to break down the silos?

You might ask yourself what is wrong with silos if they perform in a consistent manner. Oleg Shilovitsky recently wrote about it: How PLM can separate data and organization silos. Read the post for the full details; I will stay with Oleg's conclusion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century. How to help designers, manufacturing planners and support engineers to stay on the same BOM? By resolving this problem, organization will preserve current functional structure, but will make their decisions extremely data drive and efficient. The new role of PLM is to keep organizational and process silos, but connect data silos. This is a place where new cloud based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

When I read this post I had so much to comment that it led to this post. Let me share my thoughts related to his conclusion, and hopefully it helps in future discussions. Feel free to join the discussion:

Keep process and organizational silos, but break data silos. This is should be a new mantra by new PLM organization in 21st century

For me, "Keep process and organizational silos ….. " is exactly the current state of classical PLM, where PLM concepts are implemented to provide data continuity within a siloed organization. When you can stay close to the existing processes, the implementation becomes easier: less business change needed and mainly a focus on efficiency gains by creating access to information.

Most companies do not want to build their data continuity themselves and therefore select and implement a PLM system that provides the data continuity, currently mainly around the various BOM views. By selecting a PLM system, you get a lot of data integration done for you by the vendor. Perhaps not as user-friendly as every user would expect; however, no company has been able to build a 100% user-friendly PLM system yet, which is the big challenge for all enterprise systems. PLM vendors thus provide a lot of data continuity without your company having to take responsibility for it.

And if you know SAP, they go even further. Their mantra is that when using SAP PLM, you do not even need to integrate with ERP. You can still have long discussions with companies when it comes to PLM and ERP integrations. The main complexity is not the technical interface but agreeing on who is responsible for which data sets during the product lifecycle. This should be clarified even before you start talking about a technical implementation. SAP claims that this effort is not needed in their environment; however, they just shift the problem more towards the CAD side. Engineers do not feel comfortable with SAP PLM when engineering is driving the success of the company. It is like the Swiss Army knife: every tool is there, but do you want to use it for your daily work?

In theory, a company does not need to buy a PLM system. You could build your own PLM system based on existing infrastructure capabilities. CAD integrations might be trickier; however, you could solve this by connecting to the CAD vendors' native environments. For example, Microsoft presented an end-to-end PLM story based on Microsoft technology at several PDT conferences. Microsoft "talks PLM" during these conferences but does not deliver a PLM system; they deliver the technologies.

The real 21st-century paradigm

What is really needed for the 21st century is to break down the organizational silos, as current ways of working are becoming less and less applicable to a modern enterprise. The usage of software has a major impact on how we can work in the future. Software does not follow the linear product process; software comes with incremental deliveries all the time, and yes, the software still requires hardware to perform. Modern enterprises try to become agile, being able to react quickly to trends and innovation options to bring higher and different value to their customers. Related to product innovation, this means that the linear, sequential go-to-market process is too slow and requires too much data manipulation through non-value-added activities.

All leading companies in the industry are learning to work in a more agile mode, with multidisciplinary teams that work like startups: find an incremental benefit, rapidly develop and test it, interact with the market, and deliver. These teams require real-time data coming from all stakeholders, hence the need for data continuity. But there is also a need for data quality, as there is no time to validate data all the time; that is too expensive and too slow.

Probably these teams will not collaborate along the various BOM views, but rather along digital models describing both product specifications and system behavior. The BOM is not the best interface to share system information. The model-based enterprise, with its various representations, is more likely to be the backbone for the 21st century. I wrote about this several times, e.g. item-centric or model-centric.

And new cloud-based multi-tenant technologies …

As Oleg writes in his conclusion:

This is a place where new cloud-based multi-tenant technologies will play key role in the future organization transformation from the vision of no silo extended enterprise to organized functional silos connected by common understanding of data.

From an academic point of view, I see the beauty of new cloud-based multi-tenant technologies: quickly build an environment that provides information for specific roles within the organization. However, will this view be complete enough? What about data dictionaries, or is every integration a customization?

When talking with companies in the real world, they are not driven by technology; they are driven by processes. They do not like to break down the silos, as it creates discomfort and the need for business transformation. And there is no clear answer at this moment. What is clear is that leading companies invest in business change first before looking into the technology.

Conclusion

Sometimes too much academic and wishful thinking from technology providers creates excitement. Technology is not the biggest game changer for the 21st century. It will be the new ways of working and the business models related to a digital enterprise that require breaking organizational silos. And these new processes will create the demand for new technologies, not the other way around.

Break down the walls!

At the moment this post is published, I have had time to digest the latest PLMx conference organized by MarketKey. See the agenda here. For me, it was a conference with mixed feelings this time; I will share more details a little further on.

Networking during the conference was excellent, with good-quality conversations; however, the number of people attending was smaller than at previous conferences, perhaps due to too much diversification in the PI conferences?

There were several inspiring sessions, and as I participated in three sessions myself, I missed a lot of potentially exciting sessions that ran in parallel. I believe four parallel tracks is too much, and downloading the presentations later does not give you the real story. Here are some of the notable sessions I attended:

Building a Better Urban Mobility Future

The first keynote session was meant to inspire us and make us think about solving issues differently. Lewis Horne from a Swedish automotive startup explained their different approach to designing an electric vehicle, not based on classical paradigms: you do not need a steering wheel, you can navigate differently, and switching the indicator on when going left or right is now a swipe. Of course, these were not the only differences.

Unity will not be certified for the highest safety classes like other vehicles, as car safety rules are largely based on mechanical/human handling and responses. A fully computerized vehicle full of sensors has completely different dynamics, and a light city car does not drive on the highway. Based on the first prototype there are already more than 1000 pre-orders, but Unity does not have a manufacturing facility; this will be franchised. Unity used the Apple model: focus on an unmatched user experience instead of manufacturability. Let's see what happens when the first Unitys start driving; the current target price is 20,000 Euro. Will it be the new hype for modern citizens?

Focus on quality – not on happy engineers

Not only the title of this paragraph but also the statements below were made by Hilmer Brunn, head of global PLM at Mettler-Toledo, related to their PLM implementation strategy. As Hilmer stated:

We should not focus to give engineers more time to design only. The job of engineering is more comprehensive than just creating designs. Engineers also need to solve issues that are related to their design – not leave it to the others.

Another interesting statement:

As long as you do not connect simulation to your design in 3D, you are actually working with 3D as if you do it with 2D. The value of 3D is more than just representation of geometry.

And the last quote I want to share from Hilmer was again related to engineering.

Engineering should consider themselves as a service provider of information to the rest of the company, providing the full information associated with a design, instead of behaving like extreme, intelligent people who need more resources to translate and complete their work.

Grand statements, although during Q&A it became clear that Mettler-Toledo also did not have a magic bullet to get an organization to work in an integrated way.

Working towards a Model-Based Enterprise with PLM

I consider model-based practices one of the essential needs for future PLM, as this approach reduces the amount of derived information related to a product/system, and it provides digital continuity. At the last PDT conference in Gothenburg this topic was covered quite extensively. Have a read to refresh your memory here: The weekend after PDT Europe – part 1 and part 2

The focus group I moderated had approximately 20 attendees, and the majority were looking to get a better understanding of what model-based would mean for their organization. Therefore, the discussion ended up revolving around areas where a few persons had experience while others were still trying to grasp the concepts. For me, this is a point to take action on related to education; in future posts I will go deeper into the basics.

PLMPulse Survey results and panel discussion

Nick Leeder presented the context of the PLMPulse survey and the results in a precise manner, where perhaps the results were not that surprising to the audience, as many of us are involved in PLM. Two recurring points: PLM is still considered an engineering tool, and the value related to PLM is most of the time not clear. You can register and download the full report from here.

Next, Nick led a panel discussion in which people from the audience could participate. And here we got into a negative spiral, an inward-looking discussion about why PLM has never been able to show its value and get out of the engineering domain. As someone said, it was like an anonymous PLM meeting where members stood up and confessed they were also part of the group that could not change this behavior.

Was it the time of the day? Was it the mood of the audience? Too many old experiences? I believe it has to do with the fact that in PLM projects and conferences we focus too much on what we do and how we do things, not connecting it to tangible benefits that are recognized at the board level. We will see an example later.

Solar Stratos

The food and drinks at the end of day 1 probably washed away the PLMPulse feedback session, and Raphael Domjan inspired us with his SolarStratos project: a mission to develop a plane that can fly on solar energy at the heights of the stratosphere. Raphael is now working hard with a team to get there.

Designing an airplane, more a glider, that can take off and reach the stratosphere on solar energy requires solving a combination of so many different challenges. The first test flight reached an altitude of 500 m, but you can imagine the challenges with the stratosphere: the lack of oxygen and air pressure needs to be solved. Raphael is looking for funding, and you can find more details here. Back to the relatively easy PLM challenges.

The future of PLM Consultancy

Together with Oleg Shilovitsky, I had a discussion about the ways PLM could be realized differently thanks to changing technology. The dialogue started through our blogs – read it here. In this session there was a good dialogue with the audience, and MarketKey promised to share the video recording of this session soon. Stay tuned to Oleg's blog or my blog and you can watch it.

PLM in the context of digitization

This was my main personal contribution to the conference: sharing insights into why we have to approach PLM in a different manner, not with the classical linear engineering approach, but as a mix of a system of record and a system of engagement. You can see the full presentation on SlideShare here.

My main conclusion is that PLM consultants/experts focus too much on what and how they do PLM, where the connection to WHY is missing. (See also my post PLM WHY?)

In addition, I defended the statement that old and new PLM are incompatible, and therefore you need to accept that both will exist in your organization, for a while or for a long time, depending on your product lifecycle. In order to reduce the gap between old and new PLM, there is a need for data governance and model-based ways of working, which allow the company to connect the old record data and the new data at certain stages. And don't run pilots anymore that experiment with new ways of working and then stop because the next step seems overwhelming. Start your projects in small, multidisciplinary teams and make them real. That is the only way to be faster in the future.

PLM in Manufacturing as Backbone of the Smart Factory

Susanne Lauda, Director, Global Advanced Manufacturing Technology, AGCO Corporation, provided an overview of AGCO's new PLM journey and how they were benefiting from a digital thread towards manufacturing. It felt like a smooth vendor demo, as everything looked nice and reasonable; it was all about the WHAT. However, two points brought something extra:

When moving to the new system, they tried to bring the data from an existing product into the new system. According to Susanne, this was a waste of time, as the data required so much rework that there was no real added value. This confirms again my statement that old and new PLM are incompatible, and one should not try to unify everything in one system again.

Second, I got excited at the end when we discussed the WHY for PLM and the business value of PLM. Here Susanne mentioned that PLM started as a "must-do strategic" project. PLM led to a reduction of time to market by almost 50%. Susanne did not give exact numbers, but you can imagine; I have heard these numbers from other companies too. Why aren't we able to connect these benefits to PLM in the mindset of the management? Perhaps we are still too engineering-focused.

Next, Susanne explained that they investigated the cost of quality for their manufacturing plants: what if something was produced wrong, the wrong parts were ordered, the delays to fix it, the changes that needed to be made on the shop floor? These numbers were so high that people were even afraid to report them. This is the case at many companies I worked with; even their PLM consultants do not receive these numbers. You just have to imagine they are big.

At AGCO they were able to reduce the cost of quality in a significant manner, and Susanne explained that PLM was a main contributor to that success. However, success always has many fathers, so if your PLM team does not claim it loudly (and we are modest people, not used to talking finance), the success will not be recognized.

PLM’s Place Within an Enterprise Application Architecture

Peter Bilello from CIMdata, in the closing keynote speech, gave an excellent summary and overview of where and which capabilities fit in an enterprise architecture and the positioning of a product innovation platform: a blueprint that companies can use to grasp the holistic view before jumping into the details of the tools.

Conclusion

PLMx Hamburg 2018 was an event with valuable highlights for me, and I potentially missed several more due to the parallel streams. I hope to catch up with these sessions in the upcoming month and share interesting thoughts that I discover with you. What remains crucial, I believe, for all vendor-neutral events is to find new blood: new companies, new experiences that are focused on the future of PLM and connect to the WHY or the WHAT-WE-LEARNED values.

Perhaps an ambiguous title this time, as it can be interpreted in various ways. I think all these interpretations point to one of the most significant problems with PLM: ambiguity everywhere. Its definition, its value, and, as you might have noticed from the past two blog posts, the required skill set for PLM consultants.

As I am fine-tuning my presentation for the upcoming PLMx 2018 Event in Hamburg, some things become clearer for me. This is one of the advantages of blogging, speaking at PLM conferences, and discussing PLM with companies that are eager to choose the right track for PLM. You are forced to look in more depth to be consistent and need to have arguments to support your opinion about what is happening in the scope of PLM. And from these learnings I often realize that the WHY of PLM remains a big challenge, for various reasons.

Current PLM

In the past twenty years, companies have implemented PLM systems where the primary focus was only on the P (Product) from Product Lifecycle Management. PLM systems have been implemented as engineering tools, as an evolution of PDM (Product Data Management).

PLM systems have never been designed from the start as enterprise systems. Their core capabilities are related to engineering processes, and for that reason most implementations start with engineering. Later, more data-driven PLM systems like Aras and Autodesk began from another angle, with data connectivity between different disciplines as a foundation, avoiding the difficulty of engineering first.

This week I saw the publication of the PLMPulse survey results by i42R / MarketKey where they claim:

The results from first industry-led survey on our status of Product Lifecycle Management and future priorities

The PLMPulse report is based on five different surveys, as shown in the image above, covering the various aspects of PLM: usage, business value, organizational constraints, information value, and future potential. More than 350 people from all around the world answered the various questions related to these surveys. Unfortunately, inputs from some Asian companies are missing. We are all curious about what happens in China, as companies there do not struggle with the same PLM legacy as in other countries. Are they embracing PLM in a different way?

The results, as the editors also confirm, are not shocking and confirm that PLM still has the challenge of getting out of the engineering domain. Still, I recommend downloading the survey, as it has interesting details. After registration you can download the report from here.

What’s next

During the upcoming PLMx 2018 Hamburg conference, there will be a panel discussion where the survey results will be discussed. I am afraid this debate will again result in a discussion where we talk about the beauty and necessity of PLM and wonder why PLM is not considered crucial for the enterprise.

There are a few challenges I see for PLM, and hopefully they will be addressed. Most discussions are about WHAT PLM should/could do and not WHY. If you want to get to the WHY of PLM, you need to be able to connect the value of PLM to business outcomes that resonate at C-level. Often PLM implementations are considered costly, and their ROI and business value remain vague.

As the PLMPulse report also states, the ROI for PLM is most of the time based on efficiency and cost benefits related to the current way of working. These benefits usually do not offer significant ROI numbers. The major benefits come from working in a different way and from working closer to your customer. That business value is hard to measure.

How do you measure the value of multidisciplinary collaboration or of being more customer-centric? What is the value of being better connected to your customer and being able to react faster? These benefits are hard to prove at the board level, as people there like to see numbers, not business transformations.

Focus on the WHY and HOW

A lot of the PLM messages that you can read through various marketing or social channels are related to futuristic concepts and high-level dreams that will come true in the next 10-20 years. Most companies, however, have a planning horizon of 2, at most 5, years. Peter Bilello from CIMdata presented one of their survey results at the PDT conference in 2014, shown below:

Technology and vision are way ahead of reality. Even in the areas where the leaders are focusing, the distance between technology and vision gets bigger. The PLM focus is more down-to-earth: it should not be on what we are able to do, but on what would be the next logical step for our company to progress towards the future.

System of Record and System of Engagement

At the PLMx conference I will share my experiences related to PLM transformations with the audience. One and a half years ago we started talking about the bi-modal approach, and now I see more and more companies adopting bi-modal concepts related to PLM. Still, most organizations struggle with the assumption that their PLM should be tied to one PLM system or one PLM vendor, where I believe we should conclude that there are two PLM modes at this moment. And this does not imply there need to be only one or two systems; it will become a federated infrastructure.

The current mode could be an existing PLM backbone focused on capturing engineering data: the classical PLM system serving as a system of record. The second, newly growing PLM-related infrastructure will be a digital, most likely federated, platform where modern customer-centric PLM processes will run. As the digital platform will provide real-time interaction, it might be considered a system of engagement, complementary to the system of record.

It will be the system of engagement that should excite the board members as here new ways of working can be introduced and mastered. As there are no precise blueprints for this approach, this is the domain where innovative thinking needs to take place.

That's why I hope that neutral PLM conferences will focus less on WHAT can be done. Discussions about MBSE, the Digital Thread, the Digital Twin, and Virtual/Augmented Reality are all beautiful to watch. However, let's focus first on WHY and HOW. For me, besides the PLMx Hamburg conference, other upcoming events like PDT 2018 (this time in the US and Europe) are interesting; currently the PDT call for papers is open, and hopefully we will find speakers that can teach and inspire.

CIMdata together with Eurostep are organizing these events in May (US) and October (Europe). The theme for the CIMdata roadmap conference will be "Charting the Course to PLM Value together – Expanding the value footprint of PLM and Tackling PLM's Persistent Pain Points", where PDT will focus on "Collaboration in the Engineering Supply Chain – the extended digital thread". These themes need to be addressed first before jumping into the future. Looking forward to meeting you there.

Conclusions

In the world of PLM, we are most of the time busy explaining WHAT we (can/will) do. Like a cult group, we sometimes do not understand why others do not see the value or beauty of our PLM concepts. PLM dialogues and conferences should therefore focus more on WHY and HOW. Don't worry, the PLM vendors/implementers will always help you with WHAT they can do and WHY it is different.

If you have followed my blog over the past 10 years, I hope you realize that I am always trying to bring sense to the nonsense while still looking into a future where new opportunities are imagined. Perhaps this is due to my Dutch background (our motto: try to be normal, do not stand out) and the influence of working with Israelis (a country where almost everyone is a startup).

Given this background, I enjoy the current discussion with Oleg Shilovitsky related to potential PLM disruptions. We worked together for many years at SmarTeam, a PDM/PLM disruptor at that time, in the previous century. Oleg has continued his passion for introducing potentially disruptive solutions (Inforbix / OpenBOM), where I got more and more intrigued by human behavior related to PLM. For that reason, I have the human brain in my logo.

Recently we started our “The death of ….” Dialogue, with the following episodes:

Jan 14th: How to democratize PLM knowledge and disrupt traditional consulting experience

Jan 21st: The death of PLM Consultancy

Jan 22nd: Why PLM consultants are questioning new tools and asking about cloud exit strategy?

Here is episode 4: PLM Consultants are still alive and have an exit strategy

Where we agree

We agreed that traditional consultancy practices related to PLM ranking and selection processes are outdated. The Forrester Wave publication was the trigger of our discussion, for two reasons:

  1. All major PLM systems cover 80 percent of the same functionality. Therefore, there is no need to build, send, and evaluate lengthy requirements lists for all potential candidates and then recommend a preferred vendor. This is a waste of time, as besides the requirements there is much more to evaluate than just the tool selection.
  2. Many major consultancy firms have PLM practices, most of the time related to the major PLM providers. Selecting one of the major vendors is usually not a risk for your reputation, hence the importance of these rankings. Consultancy firms will almost never recommend disruptive toolsets.

PLM businesses transformation

At this point, we are communicating on different wavelengths. Oleg talks about PLM business transformation as follows:

Cloud is transforming PLM business. Large on-premise PLM projects require large capital budget. It is a very good foundation for existing PLM consulting business. SaaS subscription is a new business model and it can be disruptive for lucrative consulting deals. Usually, you can see a lot of resistance when somebody is disrupting your business models. We’ve seen it in many places and industries. It happened with advertising, telecom and transportation. The time is coming to change PLM, engineering and manufacturing software and business.

I consider new business models less relevant than the need for a PLM practice transformation. Tools like Dropbox, perhaps disruptive for PDM systems, implement previous-century methodology (document-driven / file-based models). We are moving from item-centric towards a model-driven future.

The current level of PLM practices is related to an item-centric approach, the domain where OpenBOM is also bringing disruption.
The future, however, is about managing complex products, where products are actually systems: a combination of hardware and software. Hardware and software have completely different lifecycles, and all major PLM vendors are still discovering an overall solution concept to incorporate both. If you cannot manage software in the context of hardware in the future, you are at risk. Each PLM vendor has a different focus area due to their technology history. I will address this topic during the upcoming PLMx conference in Hamburg. For a model-driven enterprise, I do not yet see a working combination of disruptors.

Cloud security and Cloud exit strategy

Oleg does not really see the impact of the cloud as related to the potential death of PLM consulting, as you can read here:

I agree, cloud might be still not for everyone. But the adoption of cloud is growing and it is becoming a viable business model and technology for many companies. I wonder how “cloud” problem is related to the discussion about the death of PLM consulting. And…  here is my take on this. It is all about business model transformation.

I am not convinced that in PLM the cloud is the only viable business model. Compare it with a rigid on-premise PLM system: part of the cloud-based implementation benefits come from low upfront costs and scalable IT. However, the cloud also pushes companies towards a no-customization strategy: configuration of the user interface only. This is a "secret" benefit for cloud PLM vendors, as they can say "NO" to the end users, of course within given usability constraints. Saying "NO" to the customer is lesson one for every current PLM implementation, as everyone knows the problem of costly upgrades later.

Also, make a 5-10 year cost evaluation of your solution and take the risk of rising subscription fees into account. No vendor will drop the price unless forced by the outside world. The initial benefits will be paid back later because of the different business model.

Cloud exit strategy and standards

When you make a PLM assessment, as experienced PLM consultants usually do, there is a need to consider an exit strategy. What happens if your current PLM cloud vendor(s) cease to exist or migrate to a new generation of technology and data modeling? Every time new technology was introduced, we thought it was going to be THE future. The future is unpredictable. However, I can predict that 10 years from now we will live with different PLM concepts.

There will be changes and migrations, and cloud PLM vendors will never promote standardized export methods (unless forced) to liberate the data in their systems. Export tools could be a niche market for PLM partners who understand data standards. Håkan Kårdén, no finder's fee required; Eurostep has the experience in-house.

Free downloads – low barriers to start

A significant difference in opinion between Oleg and me is Oleg's belief in bottom-up, DIY PLM as part of PLM democratization versus my belief in top-down business transformation supported by PLM. Talking about Aras, Autodesk, and OpenBOM, Oleg states:

All these tools have one thing in common. You can get the tool or cloud services for free and try it by yourself before buying. You can do it with Aras Innovator, which can be downloaded for free using enterprise open source. You can subscribe for Autodesk Fusion Lifecycle and OpenBOM for trial and free subscriptions. It is different from traditional on-premise PLM tools provided by big PLM players. These tools require months and sometimes even years of planning and implementation including business consulting and services.

My experience with SmarTeam might influence this discussion. SmarTeam was also a disruptive PDM solution, thanks to its easy data modeling and Microsoft-based customization capabilities, like Aras. Customers and implementers could build whatever they wanted; you only needed to know Visual Basic. As I supported the field in mitigating installed SmarTeam implementations, the problem was often that SmarTeam had been implemented as a system replicating/automating current practices.

Here Henry Ford’s statement as shown below applies:

Implementations became troublesome when SmarTeam provided new, similar business logic. Customers needed to decide whether to use OOTB features and de-customize, or to forgo the benefits of new standard capabilities. SmarTeam had an excellent business model for service providers and for IT hobbyists/professionals in companies. Upgradeable SmarTeam implementations were those that remained close to the core, but meanwhile we were 5-8 years further down the line.

I believe we still need consultants to help and coach companies towards new ways of working related to the current digitization. Twenty-year-old concepts won't work anymore. Consultants need a digital mindset and must think holistically. The fitting technology and tools will be there in the future.

Conclusion

The discussion is not over, and as I have already reached more than 1000 words, I will stop: too many words for a modern pitch, not enough for a balanced debate. Oleg and I will continue in Hamburg, and we both hope others will chime in, providing balanced insights into this discussion.

To be continued …..?

Dear readers, it is time for me to relax and focus on Christmas and the upcoming New Year. I realize that not everyone who reads my posts will be in the same mood. You might have had your New Year three months ago, or have New Year coming up in a few months. This is the beauty and challenge of a global, multicultural, diverse society. Imagine we all did the same; would you prefer such a world? Perhaps it would give peace of mind (no surprises, everything predictable), but for human survival we need innovation and new ways of life.

This mindset is also applicable to manufacturing companies. Where in the past companies tried to optimize and standardize their processes, driven by efficiency and predictability, now, due to the dynamics of a globally connected world, businesses need to become extremely flexible while remaining reliable and profitable.

How will they make the change?

Digital transformation is one of the buzzwords pointing to this transition process. Companies need to go through a change to become flexible for the future and deliver products or solutions for the individual customer. Currently, companies invest in digital transformation mostly in areas that bring direct visibility to the outside world or to their own management, not necessarily delivering profitable results, as a recent article from McKinsey illustrated: The case for digital reinvention.

And for PLM?

I have investigated digital transformation in relation to PLM with particular interest this year, as I worked with several companies that preached to the outside world that they were changing or were going to change. However, what is happening at the PLM level? Most of the time, nothing. Some new tools appear, and perhaps new disciplines like software engineering become more critical. However, the organization and its people do not change their ways of working, as the ongoing business and its related legacy in particular block the change.

Change to what?

This is another difficult question to answer. There is no clearly defined path to share. Yes, modern PLM will be digital PLM; it will be about data-driven, connected information. A final blueprint for digital PLM does not exist yet. We are all learning and guessing. You can read my thoughts here:

Software vendors in various domains are all contributing to support a modern, digital product innovation management future. But where to start? Is it the product innovation platform? Is it about federated solutions? Model-Based? Graph databases? There are even people who want to define the future of PLM. We can keep throwing pieces of the puzzle on the table, but all these pieces will not lead to a single solved puzzle. There will be different approaches based on your industry and your customers. Therefore, continuous learning and investing time to understand the digital future are crucial. This year's PDT Europe conference was an excellent event to learn and discuss the themes around a model-based lifecycle enterprise. You can read my reviews here: The weekend after PDT Europe 2017 part 1 and part 2.

The next major event where I plan to discuss and learn about modern PLM topics is the upcoming PI PLMx event in Hamburg on February 19-20, organized by MarketKey. There I will discuss the Model-Based Enterprise and lecture on the relation between PLM and digital transformation. I hope to see some of you there for exciting discussions and actions.

Conclusion

Merry Christmas to those who are celebrating, and a happy, healthy and prosperous 2018 to all of you. Thanks for your feedback. Keep asking questions or proposing other thoughts, as we are all learning. The world keeps on turning; for me, however, the next two weeks will be the time to relax.

Talk to you in 2018!

 

When I started working with SmarTeam Corp. in 1999, the company had several product managers, each responsible for the whole lifecycle of a component or technology. The Product Manager was the person who defined the features of a new release and provided the justification for these features internally within R&D. In addition, the Product Manager had the external role of visiting customers to understand their needs for future releases, and of building and explaining a coherent vision to both the outside and the internal world. The product manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D-language about the implementation of features, talk with marketing and documentation teams to explain the value and expected behavior, and talk with customers to describe the vision, meanwhile verifying the product's vision and roadmap based on their inputs.

All these expected skills make the role of a product manager challenging. If the person is too "techy", he/she will enjoy working with R&D but have a hard time understanding customer demands. On the other hand, if the Product Manager is excellent at picking up customer and market feedback, he/she might not be heard by R&D and not get the expected priorities. For me, it has always been clear that in the software world a "bi-directional" Product Manager is crucial to success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and thereby provide end-to-end visibility and traceability? When speaking of end-to-end visibility, companies mostly talked about the way they designed and delivered products; visibility of what is happening usually stopped after manufacturing. The diagram to the left, showing a typical Build-To-Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most engineers work in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the order is delivered, none of the engineers will be further involved, and it is up to the Service Department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, most of the time configurable to address various customer or pricing segments. This process is mostly linear and is described either in one stream or in two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operational department starts in parallel, initially involved in strategic sourcing, and later scaling up manufacturing disconnected from R&D.

I described these two processes because they both illustrate how disconnected the source (R&D/Sales) is from the final result in the field, which in both cases is managed by the service department. A typical story I learned from many manufacturing companies is that, in the end, it is hard to get a full picture of what is happening across the whole lifecycle. How external feedback (market & customers) can influence any stage is undefined. I used the diagram below before companies were even talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products become more and more a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. If this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and the systems needed to author them. The ultimate dream is a digital enterprise where data "flows", advocated already by some manufacturing companies for several years.

In the previous paragraph, I talked about the need for an infrastructure that lets people in an organization follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person responsible for combining insights from the market and for leading the various disciplines in the organization: R&D, Sales and Services. And this is precisely the role of a Product Manager.

This role is very common in the world of software development, but not yet recognized in manufacturing companies. If a product manager role already exists in your organization, he/she can tell you how complicated it currently is to get an overall view of the product, and what benefits a digital infrastructure would bring to the job. Once the product manager is well supported and recognized in the organization, the right skill set to prioritize and discover actions/features will make the products more attractive to customers. Here the company will benefit.

Conclusion

If your company does not have the role of a product manager in place, your business is probably not yet well enough engaged in the customer journey. There will be broken links and costly processes before you get a fast response to the market. Consider introducing the role of a Product Manager, as it has emerged in the software business.

NOTE 1: Just before publishing this post, I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital maturity indicator for a company, similar to how, in classical PLM, the handling of the MBOM (PDM/PLM/ERP) gives insight into a company's PLM maturity.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM