
Some of you following my blog this year might not feel so connected with the content, as I have written many posts related to digitization and the future needs for model-driven approaches, and not so much about the topics that might keep you awake at this moment.

When I look at my blog statistics, the most popular post is ECR/ECO for Dummies, leading with more than 30,000 views since I wrote it in 2010. You can read the original post here: ECR/ECO for Dummies (2010)

Meanwhile, in most companies, the scope of PLM has broadened, and instead of a change process within the engineering department, it is now part of enterprise change management, connecting all options for change. Therefore, in this post, I will explain the basics of a modern enterprise change process.

It can start with an Issue

Already ten years ago, I was promoting the Issue object in a PLM data model, as it can be the starting point for many activities in the enterprise: product-related, technology-related, customer-related and more.

My definition of an Issue is that it is something happening that was not expected and requires follow-up. In our day-to-day life, we solve many issues by sending an e-mail or picking up the phone, and someone down the chain will resolve the issue (or make it disappear).

The disadvantage of this approach is that there is no collective learning for the organization. Imagine that you could see in your PLM system how many issues there were in a project; you could learn from that and improve for the future. Or imagine you have had several costly issues during manufacturing but were never aware of them, because they happened in another country and were solved there.

By creating issues in the PLM system related to the object(s) they concern (a product, a part, a customer, a manufacturing process, an installation, …) you create traceability and visibility based on global facts. By classifying the issues, you can run real-time reports on what is happening and what has happened unforeseen in your enterprise.
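To make this more tangible, here is a minimal sketch in Python of what such an Issue object and a classification report could look like. It is an illustrative thought experiment, not any PLM vendor's actual data model; all names and values are invented.

```python
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Issue:
    """An illustrative Issue object, related to the objects it concerns."""
    issue_id: str
    title: str
    classification: str                                   # e.g. "manufacturing", "customer"
    related_objects: list = field(default_factory=list)   # parts, products, plants, ...
    status: str = "Open"


def classification_report(issues):
    """Count issues per classification - the kind of real-time report mentioned above."""
    return Counter(issue.classification for issue in issues)


issues = [
    Issue("ISS-001", "Paint defect on housing", "manufacturing", ["PART-4711", "PLANT-BR"]),
    Issue("ISS-002", "Customer reports overheating", "customer", ["PROD-X200"]),
    Issue("ISS-003", "Costly rework at remote plant", "manufacturing", ["PART-4711"]),
]

print(classification_report(issues))   # Counter({'manufacturing': 2, 'customer': 1})
```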

The challenge is to find a user interface that can compete with e-mail as an entry point. So far, PLM system providers haven’t invested in highly user-friendly issue management, leaving e-mail as the easier path. PLM vendors – there is work to do!

Next, depending on the issue, various follow-up processes can start, and some of them will be connected. See the diagram below, and forgive my limited graphical talent.

In this post we will focus only on the ECR and ECO path, leaving the other processes above open for next time.

The Engineering Change Request-process

The term ECR, meaning Engineering Change Request, might not be correct anymore for requested changes in an enterprise. Therefore, sometimes, you might also see the term CR only, without the reference to Engineering. For example, in the software world, you will not follow the same process as used for the hardware world, due to the different lifecycle, speed, and cost involved with software changes.  I will focus only on the ECR here.

As the picture above shows, there are two entry points for an engineering change request. Either someone in the enterprise has an issue that leads to an ECR, or someone in the enterprise has an idea to improve the products and sends it in as a request.

The next steps are quite standard for a typical ECR-process:

Analysis

In the Analysis step, assigned individuals evaluate the request and check whether it is well understood. Potential solution paths are evaluated and rated. In case it is a change to a running product, what is the impact of performing this change on current products, on current and future manufacturing, on finance, etc.? In the analysis phase there is no detailed design; it is more a feasibility study. In companies that already have a well-structured PLM and ERP infrastructure, much of the impact analysis can be done rather fast, as, for example, the “Where Used” capability is standard in every PLM system.
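To illustrate why this impact analysis can be fast once the product structure is managed as data, here is a minimal sketch of a multi-level “Where Used” lookup over parent-child BOM links. The in-memory index and names are illustrative assumptions, not any PLM system’s API.

```python
# Illustrative index: child part -> parent assemblies that use it directly.
where_used_index = {
    "BOLT-M8": ["BRACKET-A", "FRAME-B"],
    "BRACKET-A": ["PUMP-100"],
    "FRAME-B": ["PUMP-100", "PUMP-200"],
}


def where_used(part, index, found=None):
    """Collect all assemblies that directly or indirectly use the given part."""
    if found is None:
        found = set()
    for parent in index.get(part, []):
        if parent not in found:
            found.add(parent)
            where_used(parent, index, found)   # walk up to the next level
    return found


print(sorted(where_used("BOLT-M8", where_used_index)))
# ['BRACKET-A', 'FRAME-B', 'PUMP-100', 'PUMP-200']
```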

CCB

The abbreviation stands for Change Control Board, a term also used in the software industry. In the case of hardware products, the CCB usually consists of engineering, manufacturing, purchasing, finance and potentially sales, based on the context of the ECR. This group of people decides what the next step of the ECR will be. They have four options (see also the sketch after this list):

  1. Ask for further analysis – a decision is not possible.
  2. Mandate the proposed change to be planned immediately by promoting it to an Engineering Change Order, which means the change is going to be executed as needed (immediately, for example in case of a product stop/customer issue, or longer term when old stock needs to be consumed first).
  3. The proposed change can become a Candidate for the next product release/upgrade and be put on hold to be implemented together with other candidates for that release.
  4. The ECR can also be Cancelled, meaning the proposed change will potentially not create business benefits for the company, or implementing the change might create more complexity than desired.
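To make the four outcomes concrete, here is a minimal sketch of the CCB decision as a simple state transition. The state names and the mapping are illustrative assumptions, not a formal standard.

```python
from enum import Enum


class EcrState(Enum):
    IN_ANALYSIS = "In Analysis"            # option 1: back for further analysis
    PROMOTED_TO_ECO = "Promoted to ECO"    # option 2: mandated, executed as needed
    CANDIDATE = "Candidate"                # option 3: on hold for a next release
    CANCELLED = "Cancelled"                # option 4: no business benefit expected


def ccb_decision(option: int) -> EcrState:
    """Map the CCB's choice (1-4, as listed above) to the next ECR state."""
    outcomes = {
        1: EcrState.IN_ANALYSIS,
        2: EcrState.PROMOTED_TO_ECO,
        3: EcrState.CANDIDATE,
        4: EcrState.CANCELLED,
    }
    return outcomes[option]


print(ccb_decision(2))   # EcrState.PROMOTED_TO_ECO
```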

Engineering Change Order

The image above is an illustration of a possible flow for an ECO. When an ECO is launched, a first analysis and planning step is required. The ECO can be based on multiple ECRs, or the ECO can depend on other ECOs that need to be coordinated.

The ECO process is quite similar to a release process. There will be multidisciplinary collaboration (mechanical/electrical/…) leading to a complete engineering definition (based on the EBOM). Next, Manufacturing Preparation and Planning can be done, where the implementation at the manufacturing plant(s) will depend on the ECO context.

Note: When only a change in manufacturing is implemented, for example when certain parts/materials are not available or affordable, we do not call it an ECO but an MCO instead. MCO stands for Manufacturing Change Order and assumes the engineering specification remains the same.

Conclusion

The ECR/ECO process is slowly changing due to digitization and a broader implementation scope for PLM – it is no longer only a mechanical engineering change process. The availability of digitally connected information will offer a base for algorithms in the future, speeding up the process and reducing the effort for a CCB during the ECR process.

Will these processes still be there in 2025?

 

 

 


According to LinkedIn, there are over 7,500 PLM consultants in my network. It is quite an elite group of people, as I have over 100,000 CEOs in my network according to LinkedIn. Being a CEO is a commodity.

PLM consultants share a common definition, the words Product Lifecycle Management. However, what we all mean by PLM is one of the topics that has evolved significantly over the past 19 years.

PLM or cPDM (collaborative PDM)?

In the early days, PLM was considered an engineering tool for collaboration, either between global subsidiaries or with suppliers. The main focus of PLM was to bring engineering information to manufacturing in a controlled way. PLM and cPDM were often seen as solving the same business needs, as the implementation of a PLM system most of the time got stuck at the cPDM level.

Main players at that time were Dassault Systemes, UGS (later Siemens PLM) and PTC – their solutions were MCAD-driven with limited scope – bringing engineering information towards manufacturing in a coordinated way.

PLM was not really an approach that created visibility at the management level of a company. How do you value and measure collaboration? Because connectivity was expensive in the early days of PLM, combined with the idea that PLM systems needed to be customized, PLM was framed as costly and its value as hard to deliver.

Systems Engineering and New Product Introduction

Then, from 2005 onward, thanks to better connectivity and newcomers in the PLM market, the PLM solution landscape became broader. CAD integrations were not a necessary part of the PLM scope according to these newcomers, as they focused on governance (New Product Introduction), the Bill of Materials, or the front end of the product design cycle, connecting systems engineering by adding requirements management to their PLM suite.

New players in this domain were SAP and Aras, followed by Autodesk – their focus was more metadata-driven, connecting and creating an end-to-end data flow for the product. Autodesk started the PLM and cloud path.

These new capabilities brought a broader scope for PLM indeed. However, they also strengthened the idea that PLM is there for engineers and too complicated for management, unless management understood the value of coordinated collaboration. Large enterprises saw the benefits of having common processes for PLM as an essential reason to invest in PLM. The graph below shows the potential of PLM, where the shaded area indicates the potential revenue benefits.

Still, this graph does not create “hard numbers,” and it requires visionaries to get a PLM implementation explained and justified across the board. PLM is framed as expensive, even though the budgets spent on PLM are twenty percent or less of those spent on ERP implementations. As PLM is not about transactional data, the effects of PLM are hard to benchmark. Success has many fathers, and in case of difficulties, the newcomer is to blame.

PLM = IoT?

With the future possibility of connectivity down to the machine level (IoT or IIoT), a new paradigm related to PLM was created by PTC: PLM equals IoT – read more here.

Through IoT, it became possible to connect to products/assets in the field, and the simplified message from PTC was that, thanks to IoT (read: ThingWorx), PLM was now really possible, releasing traditional PLM from its engineering boundaries. The connected sensors created the possibility to build and implement more advanced and flexible manufacturing processes, often called Smart Manufacturing or Industrie 4.0.

None of the traditional PLM vendors talks solely about PLM anymore. Digital transformation is a topic discussed at the board level, where GE played a visionary role with their strong message for change, driven by their CEO at that time, Jeff Immelt – have a look at one of his energizing talks here.

However, is PLM part of this discussion?

Digital Transformation opened a new world for everyone. Existing product lifecycle concepts could be changed; products are becoming systems, interacting with their environment through software features. Systems can be updated/upgraded relatively fast, in particular when you are able to watch and analyze the performance of your assets in almost real-time.

All consultants (me included) like to talk about digital transformation as it creates a positive mood towards the future, imagining everything that is possible. And with the elite of PLM consultants we are discovering the new roles of PLM – see picture below:

Is PLM equal to IoT or Digital Transformation?

I firmly believe the whole Digital Transformation and IoT hype is unfortunately obfuscating the real needs of a digital enterprise. The IoT focus only exposes the last part of the lifecycle, disconnected from the concept and engineering cycles – yes, on PowerPoint slides there might be a link. Re-framing PLM as Digital Transformation makes it even vaguer, as we discussed during the CIMdata / PDT Europe conference last October. My main argument: companies fail to link their digital operations and dreams because current engineering processes and data, hardware (mechanical and electronics) combined with software, are still operating in an analog, document-driven mode.

PLM = MBSE?

However, what we also discussed during this conference was that there is actually a need for an end-to-end model-based systems engineering infrastructure to support the full product lifecycle. Don Farr’s (Boeing) new way to depict the classical systems engineering “V” also hinted in that direction. See the image below – a connected environment between the virtual, modeled world and the physical world at any time of the product lifecycle.

So could MBSE be the new name for PLM?

The problem, as Peter Bilello also mentioned during the CIMdata/PDT conference, is that the word “ENGINEERING” is in Model-Based Systems Engineering, therefore keeping the work the PLM “elite” is doing again in the engineering box.

So perhaps Model-Based Enterprise as the new name?

Unfortunately, MBE already has two current definitions – look here and here. There is already too much confusion, and there are a lot of people who like confusion. See Model-Based – The confusion. So any abbreviation with Model-Based terminology in it will not get attention at the board level. Even though it is crucial, the words Model-Based create less excitement compared to Digital Twin, although the Digital Twin depends on a model-based approach.

Conclusion

Creating and maintaining unique products and experiences for their customers is the primary target of almost every company. However, there is no easy acronym that frames these aspects as value at the board level. Perhaps PID – the Product Innovation Diamond approach – will be noticed? Your say ….

 

This is my concluding post related to the various aspects of the model-driven enterprise. We went through Model-Based Systems Engineering (MBSE), where the focus was on using models (functional / logical / physical / simulations) to define complex products (systems). Next, we discussed Model-Based Definition / Model-Based Enterprise (MBD/MBE), where the focus was on data continuity between engineering and manufacturing by using the 3D Model as a master for design, manufacturing and eventually service information.

And last time we looked at the Digital Twin from its operational side, where the Digital Twin was applied for collecting information from and tuning physical assets in operation, which is not a typical PLM domain in my opinion.

Now we will focus on two areas where the Digital Twin touches aspects of PLM – the most challenging and, I believe, the most over-hyped areas. These two areas are:

  • The Digital Twin used to virtually define and optimize a new product/system or even a system of systems. For example, defining a new production line.
  • The Digital Twin used to be the virtual replica of an asset in operation. For example, a turbine or engine.

Digital Twin to define a new Product/System

There might be some conceptual overlap if you compare the MBSE approach and the Digital Twin concept used to define a new product or system to deliver. For me, the differentiation would be that MBSE is used to master and define a complex system from the R&D point of view – unknown solution concepts (use hardware or software?), unknown constraints to be refined and optimized in an iterative manner.

In the Digital Twin concept, it is more about defining a system that should work in the field. How do you combine various systems into a working solution, where each of the systems already has a pre-defined set of behavioral/operational parameters, which could be 3D-related but also performance-related?

You would define and analyze the new solution virtually to discover the ideal solution for performance, costs, feasibility and maintenance. Working in the context of a virtual model might take more time than traditional ways of working; however, once the models are in place, analyzing the solution and optimizing it takes hours instead of weeks, assuming the virtual model is based on a digital thread, not a sequential process of creating and passing documents/files. Virtual solutions allow a company to optimize the solution upfront instead of costly fixing during delivery, commissioning and maintenance.
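As a deliberately simplified sketch of what “analyzing and optimizing in hours” could mean: once the behavioral/operational parameters of the candidate solutions are available as data, scoring them becomes a routine computation. All numbers, names and weights below are invented for illustration.

```python
# Illustrative candidate configurations for a new production line,
# each with pre-defined behavioral/operational parameters.
candidates = [
    {"name": "Layout A", "throughput_per_h": 120, "cost": 2.0e6, "maintenance_h_per_year": 400},
    {"name": "Layout B", "throughput_per_h": 140, "cost": 2.6e6, "maintenance_h_per_year": 550},
    {"name": "Layout C", "throughput_per_h": 110, "cost": 1.7e6, "maintenance_h_per_year": 350},
]


def score(config, w_throughput=1.0, w_cost=0.5, w_maintenance=0.3):
    """Higher is better: reward throughput, penalize cost and maintenance effort."""
    return (w_throughput * config["throughput_per_h"]
            - w_cost * config["cost"] / 1e4
            - w_maintenance * config["maintenance_h_per_year"] / 10)


best = max(candidates, key=score)
print(best["name"], round(score(best), 1))   # Layout C 14.5
```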

Why aren’t we doing this already? It takes more skilled engineers instead of cheaper fixers downstream. The fact that we are used to fixing it later is also an inhibitor for change. Management needs to trust and understand the economic value instead of trying to reduce the number of engineers as they are expensive and hard to plan.

In the construction industry, companies are discovering the power of BIM (Building Information Model), introduced to enhance the efficiency and productivity of all stakeholders involved. Massive benefits can be achieved if the construction of the building and its future behavior and maintenance can be optimized virtually, compared to fixing issues in an expensive way in reality when they pop up.

The same concept applies to process plants or manufacturing plants where you could virtually run the (manufacturing) process. If the design is done with all the behavior defined (hardware-in-the-loop and software-in-the-loop simulation), a solution can be virtually tested and rapidly delivered, with no late discoveries and costly fixes.

Of course, it requires new ways of working. Working with digitally connected models is not what engineers learn during their education – we have just started this journey. Therefore, organizations should explore on a smaller scale how to create a full Digital Twin based on connected data – this is the ultimate base for the next purpose.

Digital Twin to match a product/system in the field

When you follow the topic of the Digital Twin through the materials provided by the various software vendors, you see all kinds of previews of what is possible: Augmented Reality, Virtual Reality and more. All these presentations show that when clicking somewhere in a 3D model space, relevant information pops up. Where does this relevant information come from?

Most of the time, information is re-entered in a new environment, sometimes derived from CAD, but all the metadata comes from people collecting and validating data. This is not the type of work we promote for a modern digital enterprise. These inefficiencies are good for learning and demos, but in a final stage a company cannot afford silos where data is collected and entered again, disconnected from the source.

The main problem: Legacy PLM information is stored in documents (drawings / excels) and not intended to be shared downstream with full quality.
Read also: Why PLM is the forgotten domain in digital transformation.

If a company has already implemented an end-to-end Digital Twin to deliver the solution as described in the previous section, we can understand the data has been entered somewhere during the design and delivery process and, thanks to digital continuity, it is there.

How many companies have done this already? For sure not the companies that have been in business for a long time, as their current silos and legacy processes do not cater for digital continuity. By appointing a Chief Digital Officer, the journey might start; the biggest risk is that the Chief Digital Officer will be running yet another silo in the organization.

So where does PLM support the concept of the Digital Twin operating in the field?

For me, the IoT part of the Digital Twin is not the core of PLM. Defining the right sensors, controls and software is the first area where IoT is used to define the measurable/controllable behavior of a Digital Twin. This topic has been discussed in the previous section.

The second part where PLM gets involved is twofold:

  • Processing data from an individual twin
  • Processing data from a collection of similar twins

Processing data from an individual twin

Data collected from an individual twin or a collection of twins can be analyzed to extract or discover failure patterns. An R&D organization is interested in learning what is happening in the field with their products. These analyses lead to better and more competitive solutions.

Predictive maintenance is not necessarily a part of that. When you know that certain parts will fail between 10,000 and 20,000 operating hours, you want to optimize the moment of providing service to reduce downtime of the process, and you do not want to replace parts way too early.


The R&D part related to predictive maintenance could be that R&D develops sensors inside this serviceable part that signal the need for maintenance in a much smaller time frame – maintenance needed within 100 hours instead of a bandwidth of 10,000 hours. Or R&D could develop new parts that need less service and guarantee a longer up-time.
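A minimal sketch of this narrowing of the service window: instead of planning against a statistical 10,000-20,000 hour band, a hypothetical wear sensor inside the serviceable part estimates the remaining operating hours per individual asset. All thresholds and values are invented for illustration.

```python
def maintenance_advice(operating_hours, wear_level, wear_limit=1.0):
    """
    Estimate remaining operating hours from a hypothetical wear sensor.
    wear_level runs from 0.0 (new part) to wear_limit (service needed).
    """
    wear_rate_per_hour = wear_level / max(operating_hours, 1)   # crude average wear rate
    remaining = (wear_limit - wear_level) / wear_rate_per_hour
    if remaining <= 100:
        return f"Plan service now: ~{remaining:.0f} operating hours left"
    return f"No action needed: ~{remaining:.0f} operating hours left"


# The statistical band alone only says: failure somewhere between 10,000 and 20,000 hours.
# A sensor-based estimate narrows this down for each individual asset:
print(maintenance_advice(operating_hours=12_000, wear_level=0.995))  # service within ~60 hours
print(maintenance_advice(operating_hours=12_000, wear_level=0.40))   # thousands of hours left
```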

For an R&D department, the information from an individual Digital Twin might only be relevant if the Physical Twin is complex to repair and the downtime for each individual asset is too high. Imagine a jet engine, a turbine in a power plant or similar. Here a Digital Twin will allow service and R&D to prepare maintenance and to simulate and optimize the actions for the physical world beforehand.

The five potential platforms of a digital enterprise

The second part R&D will be interested in is the behavior of similar products/systems in the field, combined with their environmental conditions. In this way, R&D can discover improvement points for the whole range and deliver incremental innovation. The challenge for the R&D organization is to find a logical placeholder in their PLM environment to collect commonalities related to the individual modules or components. This is not an ERP or MES domain.
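A minimal sketch of collecting such commonalities across a fleet of similar twins: field events are grouped per module, so R&D can see where incremental innovation pays off. The data, module names and grouping logic are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative field events reported by individual twins of the same product family.
field_events = [
    {"twin": "TURBINE-001", "module": "BEARING",    "ambient_temp": 38, "event": "vibration high"},
    {"twin": "TURBINE-007", "module": "BEARING",    "ambient_temp": 41, "event": "vibration high"},
    {"twin": "TURBINE-012", "module": "CONTROLLER", "ambient_temp": 22, "event": "firmware reset"},
]


def commonalities_per_module(events):
    """Group field events by module - one possible 'logical placeholder' view for R&D."""
    grouped = defaultdict(list)
    for event in events:
        grouped[event["module"]].append((event["twin"], event["event"], event["ambient_temp"]))
    return dict(grouped)


for module, occurrences in commonalities_per_module(field_events).items():
    print(module, len(occurrences), occurrences)
```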

Concepts of a logical product structure are already known in the oil & gas, process and nuclear industries, and in 2017 I wrote about PLM for Owners/Operators, mentioning that Bjorn Fidjeland has always been active in this domain. You can find his concepts at plmPartner here or as an eLearning course at SharePLM.

To conclude:

  • This post is way too long (sorry)
  • PLM is not dead – it evolves into one of the crucial platforms for the future – The Product Innovation Platform
  • The current BOM-centric approach within PLM is blocking progress towards a full digital thread

More to come after the holidays (a European habit) with additional topics related to the digital enterprise

 

I was planning to complete the model-based series with a post related to the digital twin. However, I did not find the time to organize my thoughts into a structured story. Therefore, this time I share some topics I am working on.

Executive days at CADCAM Group

Last week I supported the executive days organized by the CADCAM Group in Ljubljana and Zagreb. The CADCAM Group is a large PLM solution and services provider (60+ employees) in the region of South-East Europe, with offices in Croatia, Slovenia, Serbia and Bosnia and Herzegovina. They are operating in a challenging region: four relatively young countries with historically more of an inside focus than a global focus. Many of CADCAM Group's customers are in the automotive supply chain, and to stay significant for the future they need to understand and develop a strategy that will help them move forward.

My presentation was related to the learning path each company has to go through to understand the power of digital, combined with the observation that current and future ways of working are not compatible, therefore requiring a scaled and bimodal approach (see also PDT Europe further down this post).

This presentation matched nicely with Oscar Torres’s presentation related to strategy. You need to decide on the new things you are going to do, what to keep and what to stop. It sounds easy, and of course the challenge is to define what to start, stop and keep. For that, you need good insights into your current and future business.

Pierre Aumont completed the inspiring session by explaining how the automotive industry is being disrupted, and it is not only Tesla. Many other companies are challenging the current status quo for the big automotive OEMs. Croatia has its own innovator for electric vehicles too, i.e. Rimac. Have a look here.

The presentations were followed by a (long) panel discussion. The common theme in both discussions was that companies need to educate and organize themselves to be ready for the future. New technologies and new ways of working need time and resources, which small and medium enterprises often do not have. Therefore, universities, governments and interest groups are crucial.

A real challenge for countries that do not have an industrial innovation culture (yet).

CADCAM Group, as a catalyst for these countries, understands this need by organizing these executive days. Now the challenge, after these inspiring days, is to find the people and energy to follow up.

Note: CADCAM Group graciously covered my expenses associated with my participation in these events but did not in any way influence the content of this paragraph.

 

The MBD/MBE discussion

In my earlier post, Model-Based: Connecting Engineering and Manufacturing, I went deeper into the MBD/MBE topic and its potential benefits, closing with the request to readers to add their experiences and/or comments on MBD/MBE. Luckily there was one comment from Paul van der Ree, who had challenging experiences with MBD in the Netherlands. Together with Paul and an MBD advocate (to be named) I will try to have a discussion analyzing the pros and cons from all viewpoints and hopefully come to a common conclusion.

This is to avoid proponents and opponents of MBD just repeating their viewpoints without trying to converge. Joe Brouwer is famous for his opposition to MBD. Whether he is right or wrong I cannot say, as there has never been a discussion. Click on the above image to see Joe’s latest post yourself. I plan to come back with a blog post related to the pros and cons.

 

The Death of PLM Consultancy

Early this year, Oleg Shilovitsky and I had a blog debate related to the “Death of PLM Consultancy”. The discussion started here: The Death of PLM Consultancy? and a follow-up post was PLM Consultants are still alive and have an exit strategy. It could have been an ongoing blog discussion for months, where the value would be to get responses from the readers of our blogs.

Therefore, I was very happy that MarketKey, the organizers behind the PLMx conferences in Europe and the US, agreed on a recorded discussion session during PLMx 2018 in Hamburg. Paul Empringham was the moderator of this discussion, with approximately 10–12 participants in the room joining the debate. You can view the discussion here through this link: PLMx Hamburg debate

I want to thank MarketKey for their support and look forward to participating in their upcoming PLMx European event and if you cannot wait till next year, there is the upcoming PLMx conference in North America on November 5th and 6th – click on the image on the left to see the details.

 

 

PDT Europe call for papers

As you might have noticed I am a big supporter of the joint CIMdata/PDT Europe conference. This year the conference will be in Stuttgart on October 24th (PLM Roadmap) and October 25th (PDT).

I believe that this conference has a more “geeky” audience and goes into topics of PLM that require a good base understanding of what’s happening in the field. It is not a conference for a newcomer to the world of PLM, but more a conference for an experienced PLM person (inside a company or from the outside) who has experience with challenging topics, like changing business processes, deciding on new standards, or how to move to a modern digital business platform.

It was at these events that concepts such as Model-Based were discussed in depth, as well as the need for Master Data Management, industry standards for data exchange and, two years ago, the bimodal approach, also valid for PLM.

I hope to elaborate on experiences related to this bimodal or phased approach during the conference. If you or your company wants to contribute to this conference, please let the program committee know. There is already a good set of content planned. However, one or two inspiring presentations from the field are always welcome.
Click on this link to apply for your contribution

Conclusion

There is a lot ongoing related to PLM, as you can see. As I mentioned in the first topic, it is about education and engagement. Be engaged; I am looking forward to your response and contribution to one or more of the topics discussed.

In my earlier post, PLM 2018 my focus, your input, I invited you to send PLM-related questions that would spark a dialogue. By coincidence, Oleg Shilovitsky wrote a post with the catchy title: Why traditional PLM ranking is dead. PLM ranking 2.0. Read this post and the comments if you want to follow this dialogue.

Oleg reacts in this post to the discussion that had started around the Forrester Wave ranking of PLM vendors, which on its own is a challenging topic. I know from my experience that these rankings depend very much on a mix of functions and features, but they are also profoundly influenced by the slideware and marketing power of these PLM vendors. Oleg also quotes Joe Barkai’s post, ranking PLM Vendors, to illustrate that this kind of ranking does not bring a lot of value, as there is so much commonality between these systems.

I agree with Oleg and Joe. PLM ranking does not make sense for companies selecting a PLM solution. Rankings are more an internal PLM show, useful for the organizing consultancy companies to conduct, but in the end it is a discussion about who has the biggest and most effective button. Companies need to sell themselves and differentiate.

Do we need consultancy?

We started a dialogue in the comments of Oleg’s blog post, where I mentioned that PLM is not about selecting a solution from a vendor; there are many other facets related to a PLM implementation. First of all, the industry your company is active in. No solution fits all industries.

But before selecting a solution, you first need to understand what does a company want to achieve in the future. What is the business strategy and how can PLM support this business strategy?

In most cases, a strategy is future-oriented and not about consolidating the current status quo. Therefore I believe a PLM implementation is always done in the context of a business transformation, which is most of the time not only related to PLM – it is about People, Processes and then the tools.

Oleg suggests that this complexity is created by the consulting business, as he writes:

Complex business and product strategies are good for consulting business you do. High level of complexity with high risk of failure for expensive PLM projects is a perfect business environment to sell consulting. First create complexity and then hire consulting people to explain how to organize processes and build business and product strategy. Win-win

Enterprise and engineering IT are hiring consulting to cover their decision process. That was a great point made by Joe Barkai- companies are buying roadmaps and long-term commitments, but rarely technologies. Technologies can be developed, and if even something is missed, you can always acquire independent vendors or technology later – it was done many times by many large ISVs in the past.

Here I agree with a part of the comments. If you hire consultancy firms just for the decision process, it does not make sense. The decision process needs to be owned by the company. Do not let a consultancy company prescribe your (PLM) strategy, as there might be mixed interests. However, when it comes to technologies, they are derived from the people and process needs.

So when I write in the comment:

We will not change the current status quo and ranking processes very soon. Technology is an enabler, but you need a top-down push to work different (at least for those organizations that read vendor rankings).

Oleg states:

However, the favorite part of your comments is this – “We will not change the current status quo and ranking processes very soon.” Who are “we”???? Management consulting people?

With “we” I do not mean the consulting people. In general, the management of companies is more conservative than consultants are. It is our human brain that is change-averse and pushes people to stay in a kind of mainstream mode. In that context, the McKinsey article How biases, politics, and egos derail business decisions is a fascinating read about company dynamics. Also, CIMdata published in the past a slide illustrating the gap between the vision, the real capabilities and what companies are really aiming at.

There is such a big gap between where companies are and what is possible. Software vendors describe the ideal world but do not have a migration path. One of the uncomfortable discussions when considering a cloud solution is not necessarily security (topic #1) but: what is your exit strategy? Have you ever thought about what happens to your data in a cloud solution when the vendor raises prices or no longer has a viable business model? These are discussions that need to take place too.

Oleg also quotes CIMdata’s cloud PLM research on how companies are looking for solutions, as they are “empowered” by the digital world. Oleg states:

In a digital world, companies are checking websites, technologies, watching YouTube and tried products available online. Recent cloud PLM research published by CIMdata tells that when companies are thinking about cloud PLM, the first check they do is independent software providers recommendations and websites (not business process consultants).

I am wondering about the value of this graph. The first choice is independent software recommendations/websites. Have you ever seen independent software recommendations?

Yes, when it comes to consumer tools. “I like software A because it gives me the freedom what to do” or “Software B has so many features for such a low price – great price/value ratio.”

These are the kind of reviews you find on the internet for consumers. Don’t try to find answers on a vendor website as there you will get no details, only the marketing messages.

I understand that software vendors, including Oleg’s company OpenBOM, need to differentiate by explaining that the others are too complex. It is the same message you hear from all the relative PLM newcomers: Aras, Autodesk, …….

All these newcomers provide marketing stories and claim successes because of their tools, where the reality is that the tool is secondary to the success. First, the company needs a vision and a culture that matches the tool. Look at an old Gartner picture (the hockey-stick projection) of what happens when all is aligned: the impact of the tool is minimal.

Conclusion

Despite the democratization of information, PLM transformations will still need consultants or a well-educated workforce inside your company. Consultants have the advantage of collected experience, which is often not the case when you work inside a single company. We should all agree that in the end it is about the business first (human beings are complex) and then the tools (here you can shop on the internet for what matches the vision).

Although this post seems like a ping-pong match of arguments, I challenge you to take part in this discussion. Tell us where you agree or disagree, combined with your argumentation, as we should realize the argumentation is the most valuable part.
Your thoughts?

Happy New Year to all of you. A new year traditionally comes with good intentions for the upcoming year. I would like to share my PLM intentions for this year with you and look forward to your opinion. I shared some of my 2017 thoughts in my earlier post: Time for a Break. This year I will focus on the future of PLM in a digital enterprise, current PLM practices and how to be ready for the future.

Related to these activities I will zoom in on people-related topics, like organizational change, business impact and PLM justification in an enterprise. When it happens during the year, or based on your demands, I will zoom in on architectural stuff and best practices.

The future of PLM

Accenture – Digital PLM

At this moment, digital transformation is at the top of the hype curve, and the impact of course varies per industry. For sure, at C-level, managers will be convinced they have the right vision and that the company is on the path to success.

Statements like: “We will be the first digital industrial enterprise” or “We are now a software company” impress the outside world and often investors in the beginning.

 

Combined with investments in customer-related software platforms, a new digital world facing the outside world is created relatively fast. And small pilots are celebrated as significant successes.

What we do not see is that, to show and reap the benefits of digital transformation, companies need to do more than create a modern, outside-facing infrastructure. We need to be able to connect and improve the internal data flow in an efficient way to stay competitive. Buzzwords like digital thread and digital twin are relevant here.

To my understanding, we are still in the early phases of discovering the ideal architecture and practices for a digital enterprise. PLM vendors and technology companies show us the impressive potential as if the future already exists now. Have a reality check from Marc Halpern (Gartner) in this article on engineering.com – Digital Twins: Beware of Naive Faith in Simplicity.

I will focus this year on future PLM combined with reality, hopefully with your support for real cases.

Current PLM practices

Although my curiosity is focused on future PLM, there is still a journey to go for companies that have just started with PLM.  Before even thinking of a digital enterprise, there is first a need to understand and implement PLM as an infrastructure outside the engineering department.

Many existing PLM implementations are actually more (complex) document management systems supporting engineering data, instead of using all the available capabilities of a modern PLM system. Topics like Systems Engineering, multidisciplinary collaboration, Model-Based Enterprise, EBOM-MBOM handling and non-intelligent numbering are all relevant for current and future PLM.

Not exploring and understanding them in your current business will make the gap towards the future even bigger. Therefore, keep on sending your questions and when time allows I will elaborate. For example, see last year’s PLM dialogue – you find these posts here: PLM dialogue and PLM dialogue (continued). Of course I will share my observations in this domain too when I bump into them.

 

To be ready for the future

The most prominent challenge for most companies, however, is how to transform their existing business towards a modern digital business, where new processes and business opportunities need to be implemented inside an existing enterprise. These new processes and business opportunities are not just simple extensions of the current activities; they need new ways of working, like delivering incremental results through agile and multidisciplinary teams. And these ways of working come combined with never-before-seen interactivity with the market and the customer.

How do you convince management that these changes are needed and do not happen without their firm support? It is easier to do nothing and push for small incremental changes. But will this be fast enough? Probably not, as you can read from research done by strategic consultancy firms. There is a lot of valuable information available if you invest time in research. But spending time is a challenge for management.

I hope to focus on these challenges too, as all my clients are facing them. Will I be able to help them? I will share successes and pitfalls with you, combined with supporting information that might be relevant for others.

Your input?

A blog is a modern way of communicating with anyone connected in the world. What I would like to achieve this year is to be more interactive. Share your questions – there are no stupid questions as we are all learning. By sharing and learning we should be able to make achievable steps and become PLM winners.

Best wishes to us all and be a winner not a tweeter …..

 

 

When I started working with SmarTeam Corp. in 1999, the company had several product managers, who were responsible for the whole lifecycle of a component or technology. The Product Manager was the person to define the features for a new release and to provide the justification for these new features internally inside R&D. In addition, the Product Manager had the external role of visiting customers to understand their needs for future releases and of building and explaining a coherent vision to the outside and internal world. The Product Manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D language about the implementation of features, could talk with marketing and documentation teams to explain the value and expected behavior, and could talk with the customer describing the vision, meanwhile verifying the product’s vision and roadmap based on their inputs.

All these expected skills make the role of a product manager challenging. If the person is too “techy”, he/she will enjoy working with R&D but have a hard time understanding customer demands. On the other side, if the Product Manager is excellent in picking up customer and market feedback, he/she might not be heard and not get the expected priorities from R&D. For me, it has always been clear that in the software world a “bi-directional” Product Manager is crucial to success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and therefore provide end-to-end visibility and traceability? When speaking of end-to-end visibility, most of the time companies talked about the way they designed and delivered products; visibility of what is happening stopped most of the time after manufacturing. The diagram to the left, showing a typical Build To Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most of the engineers are working in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the order is delivered, none of the engineers will be further involved, and it is up to the Service Department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, most of the time configurable to be able to answer to various customer or pricing segments. This process is most of the time linear and is described either in one stream or in two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operational department starts in parallel, initially involved in strategic sourcing, and later scaling up manufacturing disconnected from R&D.

I described these two processes because they both illustrate how disconnected the sources (R&D / Sales) are from the final result in the field, in both cases managed by the service department. A typical story that I learned from many manufacturing companies is that, in the end, it is hard to get a full picture of what is happening across the whole lifecycle. How external feedback (market & customers) can influence the product at any stage is undefined. I used the diagram below even before companies were talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products become more and more a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. In case this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and systems to author them. The ultimate dream is a digital enterprise where data “flows”, advocated already by some manufacturing companies for several years.

In the previous paragraph I talked about the need to have an infrastructure in place for people in an organization to follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person to be responsible for combining insights from the market and to lead various disciplines in the organization, R&D, Sales, Services. And this is precisely the role of a Product Manager.

This role is very common in the world of software development, but not yet recognized in manufacturing companies. In case a product manager role already exists in your organization, he/she can tell you how complicated it currently is to get an overall view of the product and what benefits a digital infrastructure would bring to the job. Once the product manager is well supported and recognized in the organization, the right skill set to prioritize or discover actions/features will make the products more attractive for consumers. Here the company will benefit.

Conclusion

If your company does not have the role of a product manager in place, your business is probably not yet well enough engaged in the customer journey. There will be broken links and costly processes to get a fast response to the market. Consider the role of a Product Manager, as it has emerged in the software business.

NOTE 1: Just before publishing this post, I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital maturity indicator for a company, just as, for classical PLM maturity, the handling of the MBOM (PDM/PLM/ERP) gives insight into a company's PLM maturity.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM

 

 

 

 

 

As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I was reading up on experiences relevant to a data-driven approach. During the PDT Europe conference we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits is that a model-based enterprise allows information to be shared without the need to convert documents to a particular format, therefore saving resources and bringing unprecedented speed of information availability, like what we are used to in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs / Microservices or interfaces in an almost real-time manner. See my post Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on the data.

Classical ECO/ECR processes can become highly automated when the data is reliable and the company’s strategy is captured in rules. In a data-driven environment, there will be much more granular data that requires some kind of approval status. We cannot do this manually anymore, as it would kill the company – too expensive and too slow. Therefore the need for algorithms.
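As a minimal sketch of what “strategy captured in rules” could look like for such granular change data: low-impact changes get an algorithmic approval, everything else still goes to humans. The attribute names and thresholds are invented assumptions, not a recommendation.

```python
def auto_review_change(change):
    """Apply simple, company-defined rules to a granular change record."""
    if change["safety_critical"]:
        return "escalate to CCB"       # rule: never auto-approve safety-critical changes
    if change["cost_impact"] < 1_000 and change["affected_products"] <= 2:
        return "auto-approve"          # rule: low-impact changes are approved algorithmically
    return "escalate to CCB"           # everything else still needs the human board


change = {"safety_critical": False, "cost_impact": 400, "affected_products": 1}
print(auto_review_change(change))      # auto-approve
```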

What is understandable data?

I have tried to avoid academic language as long as possible, but now we have to be more precise as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through inflated expectations and disillusionment.

This was interesting, as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop, coming from various companies, agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM was more or less hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why MDM was not a real topic in the PLM space at that time. We were, and still are, focusing too much on information stored in files and documents. The only areas touched by MDM were the BOM and Part definitions, as these objects also touch the ERP and After Sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below, as it comes close to my experience in the process industry.

Here we see two MDM architectures, the one on the left driven from ERP. The one on the right could be based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets for many years across the lifecycle and the relatively stable environment. Other sectors are less standardized or depend so much on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company today and look for providers of MDM solutions, you would discover that their value is based on technology capabilities: bringing data together from different enterprise systems in the way the customer thinks it should be organized. More a toolkit approach than an industry approach. And in the cases where there is an industry approach, it is rare that this approach is related to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated, too diverse, too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when you combine the previous observations with a recent post on Engineering.com from Tom Gill: PLM Initiatives Take On Master Data Transformation, I came to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the industry-specific data definition for their industry.

Tom Gill explains in his post the business benefits and value of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is not only based on the BOM. PLM now encompasses the full lifecycle of a product instead of, initially, more an engineering view. Modern PLM systems, or as CIMdata calls them Product Innovation Platforms, manage a complex data model based on a model-driven approach. These entities are used across the whole lifecycle and therefore could be the best start for an industry-specific MDM approach. Now only the industries have to follow….
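To make the hypothesis a little more tangible, here is a minimal sketch of validating records coming from other systems (ERP, MES, …) against a master definition held in the PLM data model. The attribute set is an invented, industry-neutral example, not a real industry standard.

```python
# Hypothetical master definition of a 'Part' entity, governed in the PLM data model.
PART_MASTER_DEFINITION = {
    "part_number": str,
    "description": str,
    "lifecycle_state": str,   # e.g. "In Work", "Released", "Obsolete"
    "unit_of_measure": str,
}


def validate_against_master(record, definition=PART_MASTER_DEFINITION):
    """Check that a record from another system matches the master definition."""
    problems = []
    for attribute, expected_type in definition.items():
        if attribute not in record:
            problems.append(f"missing attribute: {attribute}")
        elif not isinstance(record[attribute], expected_type):
            problems.append(f"wrong type for attribute: {attribute}")
    return problems


erp_record = {"part_number": "4711", "description": "Bracket", "unit_of_measure": "pcs"}
print(validate_against_master(erp_record))   # ['missing attribute: lifecycle_state']
```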

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …? The content of his post is relevant; I would only change the title to “Who is responsible for what data when”, as I believe in a modern digital enterprise there is no ownership anymore – it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, now, in modern PLM implementations, the Master Data Model might be based on the industry-specific data elements, managed and controlled from the PLM data model.

 

Do you follow my thoughts / agree?

 

 

During my summer holidays, I read some fantastic books to relax the brain. Confessions from Jaume Cabré was an impressive novel, and I finished Yuval Noah Harari’s book Sapiens.

However, to get my PLM-twisted brain back on track, I also decided to read the book “The Death of Expertise” from Tom Nichols, with the thought-provoking subtitle: “The Campaign Against Established Knowledge and Why it Matters.”

I wanted to read it and understand if and how this would apply for PLM.

Tom Nichols is an American, so you understand he has many examples from his own experience to support his statement, like the anti-vaccination “experts”, the climate change “hoax” and an “expert” tweeting president in his country who knows everything. Besides these obvious examples, Tom explains in a structured way how, due to broader general education and the internet, the distance between an expert and an average person has disappeared, and facts and opinions seem to be interchangeable. I talked about this phenomenon during the Product Innovation conference in Munich 2016: The PLM identity crisis.

Further down the book, Tom becomes a little grumpy and starts to complain about the internet, Google and even Wikipedia. These information resources so often provide fake or skin-deep information, not scientifically proven by experts. It reminded me of a conference that I attended in the early nineties of the previous century. An engineering society had organized this conference to discuss the issue that finite element analysis was becoming more and more available to laymen. The affordable simulation software would be used by untrained engineers, and they would make the wrong decisions. Constructions would fall down, machines would fail. Looking back now, we can see that the liberation of finite element analysis has led to more usage of simulation technology, providing better products, and when really needed, experts are still involved.

I have the same opinion about the internet, Google, and Wikipedia. They rapidly provide information. Still, you need to do fact-checking and look at multiple sources, even if you have already found the answer you like. Usually, when I do my “research” using the internet, I try to find different sources with different opinions and, if possible, also from various countries. What you will discover is that, when using the internet, there is often detailed information, but not in the headlines of these pages. To get down to the details, we will need experts for certain cases, but we cannot turn the clock back to the previous century.

What about PLM Expertise?

In the case of PLM, it is hard to find real expertise. Although PLM is recognized as a business strategy / a domain / an infrastructure, PLM has so many faces depending on the industry and its application. It will be hard to find an expert who understands it all, and I assume headhunters can confirm this. A search for “PLM Consultant” on LinkedIn gives me almost 4000 hits, and when searching for “PLM Expert,” this number is reduced to less than 200. With only one source of information (LinkedIn), these figures do not really give an in-depth result (as expected!)

However, what is a PLM expert? Recently I wrote a post sharing the observation that a lot of PLM product- or IT-focused discussions miss the point of education (see PLM for Small and Medium Enterprises – It is not the software). In this post, I referred to an initiative from John Stark striving for the recognition of a PLM professional. You can read John’s follow-up on this activity here: How strong is the support for Professional PLM? Would a PLM Professional bring expertise?

I believe when a company understands the need for PLM, they have to build this knowledge internally. Building knowledge is a challenge for small and medium enterprises. It is a long-term investment contributing to the viability of the company. Support from a PLM professional can help. However, like the job of a teacher, it is about the skill-set (subjects, experience) and the motivational power of such a person. A certificate won’t help to select a qualified person.

Conclusion

We still need PLM expertise, and it takes time to build it. Expertise is something different from an (internet) opinion. When gaining PLM expertise, use the internet and other resources wisely. Do not go for the headlines of an internet page. Go deeper than the marketing pages of PLM-related companies (vendors/implementers). Take time and hire experts to help you, not to release you from your responsibility to collect the expertise.

 

Note: If you want to meet PLM experts and get a vendor-independent taste of PLM, join me at PDT Europe 2017 on 18-19 October in Gothenburg. The theme of the conference: Continuous transformation of PLM to support the Lifecycle Model-Based Enterprise. The conference is preceded on 17 October by CIMdata’s PLM Roadmap Europe 2017. Looking forward to meeting you there!

 

 

My last blog post was about the reasons why PLM is not simple. PLM supporting a well-planned business transformation requires business change and new ways of working. PLM itself is going through different stages: we are moving from drawing-centric (previous century), through BOM-centric (current), towards model-centric (current and future). You can read the post here: PLM is not simple!

I was happy to see my blog buddy Oleg Shilovitsky chime in on this theme with his post: Who needs Simple PLM? Oleg reviewed the stakeholders around a PLM implementation, an analytical approach that could be correct if predictable human beings were involved. Since human beings are not predictable, and my focus is on the combination of PLM and human beings, here are some follow-up comments on the points Oleg made:

 

Customers (Industrial companies)

Oleg wrote:

A typical PLM customer isn’t a single user. A typical PLM buyer is engineering IT organization purchasing software to solve business problem. His interest to solve business problem, but not really to make it simple. Complex software requires more people, an increased budget and can become an additional reason to highlight IT department skills and experience. End-users hate complex software these days, therefore, usability is desired, but not top priority for enterprise PLM.

My comments on this part: PLM is becoming more and more an infrastructure for product information along the whole lifecycle. PLM is no longer an engineering tool provided by IT.

There are now many other stakeholders that need product data, in particular as we move towards a digital enterprise. A model-based approach connects Manufacturing and Service/Operations through a digital thread. It is the business that demands PLM to manage this complexity; IT will benefit from a reduction in silo applications.
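To make the digital thread a little more tangible, below is a minimal sketch in Python, purely as an illustration. The names (EngineeringItem, ManufacturingStep, ServiceEvent, DigitalThread) are my own assumptions for this example, not an existing vendor data model. The point is that manufacturing and service records reference one shared item identity instead of keeping disconnected copies, so the impact of a change can be traced across domains.

```python
# A minimal, hypothetical sketch of a "digital thread": one engineering
# definition referenced by manufacturing and service records, instead of
# copies living in disconnected silos. All names are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EngineeringItem:
    item_id: str          # the single identity every downstream view points to
    revision: str
    description: str


@dataclass
class ManufacturingStep:
    item_id: str          # a reference to, not a copy of, the engineering item
    operation: str
    plant: str


@dataclass
class ServiceEvent:
    item_id: str          # the same reference closes the loop from the field
    finding: str


@dataclass
class DigitalThread:
    item: EngineeringItem
    manufacturing: List[ManufacturingStep] = field(default_factory=list)
    service: List[ServiceEvent] = field(default_factory=list)

    def impact_of_change(self) -> List[str]:
        """Who is affected if the engineering item changes? A 'where used' across domains."""
        return [f"Plant {m.plant}: {m.operation}" for m in self.manufacturing] + \
               [f"Field finding: {s.finding}" for s in self.service]


# Usage: one shared identity lets any stakeholder trace a change end-to-end.
thread = DigitalThread(item=EngineeringItem("PRT-001", "B", "Pump housing"))
thread.manufacturing.append(ManufacturingStep("PRT-001", "Casting", "Plant Gothenburg"))
thread.service.append(ServiceEvent("PRT-001", "Hairline crack after 2,000 hours"))
print(thread.impact_of_change())
```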

 

PLM Vendors

Oleg wrote:

…most PLM vendors are far away from a desired level of simplicity. Marketing will like “simple” messages, but if you know how to sell complex software, you won’t be much interested to see “simple package” everyone can sell. However, for the last decade, PLM vendors were criticized a lot for complexity of their solutions, so they are pretty much interested how to simplify things and present it as a competitive differentiation.

 

Here we are aligned. All PLM vendors are dreaming of simplifying their software. Imagine: if you had a simple product everyone could use, you would be the market leader and extremely profitable without a big effort, as the product is simple. Of course, this only works assuming the dream can be realized.

Some vendors believe that easy customization or configuration of the system means simplification. Others believe a simple user interface is the key differentiator. Compared to mass-consumer software products in the market, a PLM system is still a niche product, with a limited number of users working with the exact same version of the software. Combined with the particular needs (customizations) every company has (“we are different”), there will never be a simple PLM solution. Coming back to the business transformation theme, human beings are the weakest link.
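As an aside, the difference between configuration and customization can be sketched in a few lines of hypothetical code. Configuration is declarative data that the vendor’s standard software interprets; customization is company-specific code hooked into the system’s behavior. The names below (ITEM_TYPE_CONFIG, on_release) are invented for illustration and do not refer to any real PLM product’s API.

```python
# Hypothetical illustration of configuration vs. customization in a PLM context.
# Nothing here refers to a real PLM product; names are made up for the example.

# Configuration: declarative data interpreted by the vendor's standard code.
# It tends to survive upgrades because no behavior is overridden.
ITEM_TYPE_CONFIG = {
    "type": "MechanicalPart",
    "lifecycle": ["In Work", "Under Review", "Released", "Obsolete"],
    "attributes": {"material": "string", "mass_kg": "float"},
}

# Customization: company-specific code hooked into the system's behavior.
# Every such hook is something a future upgrade has to preserve or re-test,
# which is how each implementation becomes a unique deployment.
def on_release(part: dict) -> None:
    """Hypothetical release hook: enforce a company-specific rule."""
    if part.get("mass_kg", 0) > 50:
        raise ValueError("Parts above 50 kg require an additional sign-off")

# Usage: the configured lifecycle is generic; the customized rule is not.
part = {"type": "MechanicalPart", "mass_kg": 12.5}
on_release(part)  # passes silently for this part
```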

 

Implementation and Service Providers

Oleg wrote:

Complex software, customization, configuration, know-hows, best practices, installation… you name it. More of these things can only lead to more services which is core business of PLM service providers. PLM industry is very much competitive, but simplicity is not a desired characteristic for PLM when it comes to service business. Guess what… customer can figure it out how to make it and stop paying for services.

Here we are totally aligned. In the past, I have been involved in potential alliances where certain service providers evaluated SmarTeam as a potential tool for their business. In particular, the major PLM service providers did not see enough value in an easy-to-configure and relatively cheap product. Cheap means no budget for a large amount of services.

Still, the biggest problem SmarTeam had after ten years was the fact that every implementation became a unique deployment, hard to maintain and guarantee for the future, in particular when new functionality was introduced that potentially already existed as a customization. Implementation and service providers will never say NO to a customer when it comes to further customization of the system. Therefore, the customer should be in charge and own the implementation. For making strategic decisions, support can come from a PLM consultant or coach.

 

PLM Consultants

Here Oleg wrote:

Complex software can lead to good consulting revenues. It was true many years for enterprise software. Although, most of PLM consultants are trying to distant from PLM software and sell their experience “to implement the future”, simplicity is not a favorite word in consulting language. Customer will hire consulting people to figure out the future and how to transform business, but what if software is simple enough to make it happen without consultant? Good question to ask, but most of them will tell you it is not a realistic scenario. Which is most probably true today. But here is the hint – remember the time PC technicians knew how to configured jumpers on PC cards to make printer actually print something?

Here we are not aligned. Business transformations will never happen because of simple tools. People are measured and pushed to optimize their silos in the organization. A digital transformation, which creates a horizontal flow and transparency of information, will never happen through a tool. The organization needs to change, and this is always driven by a top-down strategy. PLM consultants are valuable to explain the potential future and to coach all levels of the organization. In theory, a PLM consultant’s job is tool-independent. However, being completely disconnected from the existing tools might allow for dreams that can never be realized. In reality, most PLM consultants are experienced in one or more specific tools they have been implementing. The customer should be aware of that and make sure they own the PLM roadmap.

My conclusion:

Don’t confuse PLM with a tool, simple or complex. All PLM tools have a common base, and depending on your industry and your company’s vision, there will be a shortlist. However, before you touch the tools, understand your business and the transformation path you want to take. And that is not simple!

 

Your opinion?

Oleg and I could continue this debate for a long time. We would be interested in learning your view on PLM and simplicity – please tune in through the comments section below.
