
A week ago, Martijn Dullaart and Martin Haket, both well versed in configuration management, published a post on MDUX.net and LinkedIn in which they pleaded for standardization across domains, starting with the processes connecting PLM and ERP – read the full article here: We need a cross-platform interface for impact analysis!

In the article, some standards with various scopes related to product information are mentioned (PDX, ISO 10303, ISO305:2017), and the article suggests that impact analysis should be done in an overarching domain, the CM domain, outside PLM. See the image on the left. I have several issues with this approach, which I will explain here.

PLM and/or CM are overarching domains

The diagram puts CAD, SW, PLM, and ERP as verticals, whereas I would state that PLM is responsible for the definition of the product, which means governing CAD and SW and publishing towards ERP for execution. You might debate whether CM is part of PLM or whether CM is a service on top of PLM.

We had this discussion before: PLM and Configuration Management – a happy marriage? One of my points there was that companies in aerospace, defense, and automotive actively invest in CM. If you go to other industries and sizes of business, CM becomes more an intention than a practice.

It is somehow the same challenge PLM faces when it comes to full lifecycle support.

Now let’s look at impact analysis

During my personal PLM lifecycle, I discussed with many companies how a PLM system could provide a basis for impact analysis. These scenarios were mainly based on a Where-Used analysis in the context of an engineering change, which a PLM system should be able to offer.

PLM would provide the information on which currently running or upcoming products are impacted by a potential change – this would provide the technical answer and the impact on production.

To support the financial case, a more advanced impact analysis is required. This is often a manual process. In more advanced cases, customization is used to provide real-time information about the current warehouse stock (scrap?) and parts/materials already on order.
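To make the Where-Used concept concrete, here is a minimal sketch of such an analysis in Python. It is an illustration only, not any vendor's actual API; the where-used index, item names, and the ERP-side stock data are all invented for the example:

```python
from collections import defaultdict

# Hypothetical where-used index: child item -> set of parent items.
# In a real PLM system this would be queried from the database.
where_used = defaultdict(set, {
    "bolt-M8": {"bracket-A", "bracket-B"},
    "bracket-A": {"engine-X"},
    "bracket-B": {"engine-X", "engine-Y"},
})

def impacted_items(changed_item: str) -> set:
    """Walk the where-used index upwards and return every item that
    directly or indirectly contains the changed item."""
    impacted, stack = set(), [changed_item]
    while stack:
        for parent in where_used[stack.pop()]:
            if parent not in impacted:
                impacted.add(parent)
                stack.append(parent)
    return impacted

# Hypothetical ERP-side data - the customization needed to extend the
# technical impact with a financial dimension (stock at risk).
stock = {"bolt-M8": 12000, "bracket-A": 300, "bracket-B": 150}

affected = impacted_items("bolt-M8")
print("Impacted items:", sorted(affected))
print("Stock at risk:", {i: stock.get(i, 0) for i in sorted(affected | {"bolt-M8"})})
```

The traversal gives the technical answer (which assemblies and products are affected); the stock lookup hints at the financial side, which in practice requires exactly the kind of ERP connection described above.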

I could imagine some other potential impacts to analyze, for example, the marketing/sales plans, but haven’t come across these situations in my projects.

To start from a well-thought-out approach, I expected more from the article. Martijn and Martin wrote:

A good candidate to be used as interface between the expert domains and CM domain is, in our opinion, the CM2 impact matrix. This captures the information on an aggregate level like a part, or document or dataset.  This aggregate level can be used by other expert domains to identify impact within their scope or by the CM domain to support cost estimation and implementation planning.

So I followed the link to discover and digest the CM2 impact matrix. However, the link leads to CM2 training, not to the directly useful information – the impact matrix itself. Should I get CM2-trained first to gain access to the information?

This is the same lock-in where a PLM Vendor will state:
Buy our system and use our impact analysis.

I believe every respectable PLM system has a basis for impact analysis and probably needs to be customized for outside data. Martijn and Martin agree on that, as they write:

This interface is currently not existent in the offerings of the various vendors. If an impact matrix is available, it is to support the impact analysis within the tool of a vendor not to support impact analysis within a business. That is why Martin and I challenge the vendors in the various expert domains to come with a standard to allow businesses to perform a high-quality cross-functional impact analysis that improves the quality of decision making.

Vendors coming with a standard?

In particular, the sentence challenging the vendors to come up with a standard is, in my opinion, a mission impossible.

A vendor will never come up with a standard unless THEY become the standard.

CATIA in aerospace/automotive, DXF in 2D mid-market CAD, IFC for the building market, and Excel for calculations are all examples where one vendor dominates the market. I do not believe that in any R&D department of a software company there are people spontaneously working on standards.

Companies have to develop and push for standards

Standards have always been developed because there was a need to exchange information, most of the time between companies – OEMs, suppliers, partners. In the case of impact analysis, the target is slightly different: impact analysis mainly focuses on internal systems within the organization that is planning a change.

And this makes the push for standardization even more complicated.

Let me explain why:

First, there is ERP – the image above shows Management of Change in the SAP help environment.

In most companies, the ERP system is the major IT system, and all efforts to automate processes were targeted to be solved within ERP.

Therefore, you will find basic impact analysis capabilities in ERP, mainly related to the execution side: actual stock, planned production orders, and logistics. The rest of the impact analysis is primarily a manual task.

Next, with the emergence of PLM systems, impact analysis shifted towards the planning side: Where-Used or Where-Related became capabilities related to engineering change requests. In my SmarTeam days, we developed templates for that:

The analysis was still a manual action, where the PLM system would provide Where-Used support, and potentially a custom ERP connection would give some additional information.

Nowadays, I would state that all PLM vendors have the technical capabilities to create an impact analysis dashboard: Aras through rapid customization, Dassault Systèmes by using Exalead, PTC by using Navigate, and Siemens by using Mendix. So the technology exists, but what about standards?

In the comments section of the LinkedIn post, Martijn mentions that the implemented change behavior in PLM is not exactly as he (or the CM2 methodology) would propose – the difficulty in the happy marriage between PLM and CM. See his comment here.

For me, these comments are change requests to the PLM vendors, and they will only be heard when there is a push from the outside world.

Therefore my (simplified) proposal:

  • Start an Impact Analysis community outside CM2, as there are many companies not following CM2 that still have their own particular ways of working. Perhaps this community already exists and lacks visibility – I am happy to learn.
  • Describe the potential processes and people involved and collect/combine the demands – think tool-independent, as tool selection is the last step.
  • Publish the methodology as an open standard and have it rated by the masses. The rating will influence the software vendors in the market – see the sketch below for an idea of what such a neutral record could look like.
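Purely as an illustration of what a tool-independent impact record might carry, here is a hypothetical sketch. The field names are my assumptions for the example, not taken from CM2 or from any published standard:

```python
from dataclasses import dataclass

# Hypothetical, tool-independent impact record on the aggregate level
# (part, document, or dataset) - field names are illustrative only.
@dataclass
class ImpactRecord:
    change_id: str               # identifier of the proposed change
    item_id: str                 # part / document / dataset
    domain: str                  # "PLM", "ERP", "CAD", "SW", ...
    impact_type: str             # e.g. "rework", "scrap", "obsolete"
    estimated_cost: float = 0.0  # rough cost contribution of this item
    notes: str = ""

# Each expert domain contributes its own records; the CM domain
# aggregates them for cost estimation and implementation planning.
records = [
    ImpactRecord("ECR-001", "bracket-A", "PLM", "rework", 1200.0),
    ImpactRecord("ECR-001", "bracket-A", "ERP", "scrap", 4500.0, "150 pcs in stock"),
]
print("Estimated impact of ECR-001:", sum(r.estimated_cost for r in records))
```

The point is not the exact fields but the agreement: if every domain delivers its impact in one neutral shape, a cross-functional analysis becomes a merge instead of a series of manual lookups.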

Conclusion

Asking vendors to come up with a cross-platform interface standard for impact analysis is a mission impossible. Standards appear when there is a business need, and that need has to come from the market. Impact analysis has an additional difficulty, as it is mostly a company-internal process.

 

The usage of standards has been a recurring topic over the past 10 months; it probably came back to the surface at PI PLMx Chicago during the PLM Leaders panel discussion. If you want to refresh the debate, Oleg Shilovitsky posted an overview: What vendors are thinking about PLM standards – Aras, Dassault Systemes, Onshape, Oracle PLM, Propel PLM, SAP, Siemens PLM.

It is clear to vendors that when they actively support standards, they reduce their competitive advantage; after all, you are opening your systems to connect to other vendors' solutions, reducing the chance to sell adjacent functionality. We call this vendor lock-in. If you think this only applies to PLM, I suggest you pick up your Apple iPhone and think about vendor lock-in for a moment.

Vendors will only adhere to standards when pushed by their customers, and that is why we have a wide variety of standards in the engineering domain.

Take the example of JT as a standard viewing format, heavily pushed by Siemens for the German automotive industry to be able to work downstream with CATIA and NX models. There was a JT version (v9.5) that reached ISO 14306 alignment, but after that, Siemens changed JT (v10) again to optimize their own exchange scenarios, and the standard was lost.

And as customers did not complain (too much), the divergence continued. So it is clear vendors will not maintain standards out of charity, as your business does not work for charity either (or does it?). So I do not blame them if there is no push from their customers to maintain standards.

What about standards?

The discussion related to standards flared up around the IpX ConX19 conference and a debate between Oleg & Hakan Kardan (EuroSTEP) where Hakan suggested that PLCS could be a standard data model for the digital thread – you can read Oleg’s view here: Do we need a standard like PLCS to build a digital thread.

Oleg's opening sentence made me immediately stop reading, as I am more and more tired of this type of framing when you want to have a serious discussion based on arguments. In politics, in particular, we see the bad examples of framing.

Standards are like toothbrushes, a good idea, but no one wants to use anyone else’s. The history of engineering and manufacturing software is full of stories about standards.

This opening sentence says it all about the mindset related to standards – it is a one-liner, not a fact. It could have been a tweet in this society of experts.

Still, later I read the blog post and learned Oleg has no arguments to depreciate PLCS; however, as he does not know the details, he will probably not use it. The main challenge of standards: you need to spend time to understand them, adhere to them, and agree on following them. Otherwise, you get the same divergence as with JT, or similar examples.

However, I might have been wrong in my conclusion, as Oleg did some thinking on a Sunday and came up with an excellent post: What would happen if PLM Vendors agree about data standards. Here Oleg makes the comparison with a standard in the digital world, established by Google, Microsoft, Yahoo, and Yandex: Schema.org: Evolution of Structured Data on the Web.
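To give an idea of how lightweight that agreement is, here is a minimal schema.org Product description expressed as a Python dict. The @context, @type, name, sku, and brand properties are genuine schema.org vocabulary; the values are invented for illustration:

```python
import json

# Minimal schema.org "Product" description (JSON-LD), built as a dict.
# The property names are real schema.org vocabulary; values are made up.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "M8 hex bolt, stainless",
    "sku": "bolt-M8",
    "brand": {"@type": "Brand", "name": "ExampleParts"},
}
print(json.dumps(product, indent=2))
```

Four competing search vendors only had to agree on shared names and meanings, not on each other's internal implementations – which is exactly the level of semantic agreement PLM data exchange is still missing.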

There is a need for semantic mapping and understanding in the day-to-day world, and this understanding makes you realize the same is needed for PLM. That was one of the reasons why I wrote, back in 2015, a series of posts related to the importance of a PLM data model:

All these posts were aimed at helping companies and implementers make the right choices for an item-centric PLM implementation. At that time – 2015 – item-centric was the current PLM best practice. I learned from my engagements over the past 15 years that, in particular when you have a flexible modeling tool like SmarTeam or nowadays Aras, making the right data model decisions is crucial for future growth.

Who needs standards?

First of all, as long as you stay in your controlled environment, you do not need standards. In particular, in the Aerospace and Automotive industry, the OEMs defined the software versions to be used, and the supply chain had to adhere to their chosen formats. Even this narrow definition was not complete enough as a 3D CAD model needed to be exported for simulation or manufacturing purposes. There was not a single vendor working on a single CAD model definition at that time. So the need for standards emerged as there was a need to exchange data.

Data exchange is the driving force behind standards.

In a second stage, neutral-format data storage also became an important point – how do you save an aircraft definition for 75 years?

Oil & Gas / Building – Construction

These two industries both had the need for standards. The Oil & Gas industry relies on EPC (Engineering / Procurement / Construction) companies that build plants or platforms. Then the owner/operator takes over the operation and needs a hand-over of all the relevant information. However, if this information were delivered in the application-specific formats the EPC companies have used, the owner/operator would need various software environments and skills just to have access to the data.

Therefore, if the data is delivered in a standard format (ISO 15926) and the exchange follows CFIHOS (Capital Facilities Information Hand Over Specification), the exchange between the EPC and owner/operator environments can be automated to a larger degree, leading to a lower overall cost of delivering and maintaining the information, combined with higher quality. For that reason, the Oil & Gas industry has been investing in standards for a long time, as their plants/platforms have a long lifecycle.
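Why does a shared specification make the hand-over automatable? Because both sides agree upfront on the classes and attributes that describe a tag, so the owner/operator can load the data without reverse-engineering each EPC's tool formats. A purely illustrative sketch – the field names below are invented for the example and are not the actual CFIHOS schema:

```python
# Purely illustrative hand-over record - NOT the actual CFIHOS schema.
# The idea: tag classes and attribute names come from a shared reference
# data library, so the receiving side can process records automatically.
handover_record = {
    "tag_number": "P-1001A",
    "equipment_class": "centrifugal pump",  # from the shared reference data
    "attributes": {
        "design pressure": {"value": 16.0, "unit": "bar"},
        "design temperature": {"value": 120.0, "unit": "degC"},
    },
    "documents": ["datasheet-P-1001A.pdf"],  # linked controlled documents
}

# The owner/operator side can validate and import without custom work.
assert handover_record["equipment_class"] in {"centrifugal pump", "vessel"}
print("Imported tag", handover_record["tag_number"])
```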

And the same is happening in the construction industry. Initially, Autodesk and Bentley were fighting to become the vendor standard; ultimately, the IFC standard took a lot from the Autodesk world but has become a neutral standard for all parties involved in a construction project to share and exchange data. For the construction industry in particular, the cloud has been an accelerator for collaboration.

So standards are needed where companies/people exchange information

For the same reason, in most global companies, English became the standard language. If you needed to learn all the languages spoken in a worldwide organization, you would not have time for business. Therefore, everyone making some effort to communicate in one standard language is the best way to operate.

And this is the same for a future data-driven environment – we cannot afford for every exchange to convert to the native format of the receiver or source. Common neutral (or winning) standards will ultimately also come up in the world of manufacturing data exchange and IoT.

Companies need to push

This is probably the blocking issue for standards. Developing and using standards requires an effort without immediate ROI. So why not use vendor formats/models and create custom point-to-point interfaces, as we only need one or two? Companies delivering products with a long lifecycle know that the current data formats are not guaranteed for the future, so they push for standards (aerospace/defense, oil & gas, construction, infrastructure).

(image: 3D PDF model)

Other companies are looking for short-term results, and standards slow them down. However, as soon as they need to exchange data with their ecosystem (suppliers/customers), an existing standard will make their business more scalable. The lack of standards is one of the inhibitors for Model-Based Definition or the Model-Based Enterprise – see also my post on this topic: Model-Based – Connecting Engineering and Manufacturing.

When we imagine the digital enterprise of the future, information will be connected through data streams and models. In a digital enterprise, file conversions and proprietary formats impede the flow of data and create non-value-added work. For example, if we look at current "digital twin" concepts, the 3D representation of the twin is recreated instead of maintaining neutral 3D-model continuity. This is because companies currently work in a coordinated manner. Perhaps 10 years from now, we will reach the maturity of a model-based enterprise, which can only exist based on standards. Whether the standards will be based on one dominating platform or on a merger of standards remains the question.

To discuss this question and how to bridge from the past to the future, I am looking forward to meeting you at the upcoming PLM Roadmap & PDT 2019 EMEA conference on 13-14 November in Paris, France. Download the program here: PLM for Professionals – Product Lifecycle Innovation.

Conclusion

I believe PLM standards will emerge when building and optimizing a digital enterprise. We need to keep pushing and actively working for meaningful standards, as they are crucial to avoid a lock-in of your data, which potentially creates dead ends and massive inefficiencies. The future is about connected ecosystems, and the leanest companies will survive. Standards do not need to be extraordinarily well defined and can start from a high-level alignment, as we saw with schema.org. Keep investing in and contributing to standards and the related discussion to create a shared learning path.

Thanks, Oleg Shilovitsky, for keeping the topic alive.

p.s. I did not have time yet to read and process your PLM Data Commoditization post.

 

Last week I read Verdi Ogewell's article PTC puts the Needle to the Digital Thread on Engineering.com, where Verdi raised the question (and concluded) of who is the most visionary PLM CEO – Bernard Charlès from Dassault Systèmes or Jim Heppelmann from PTC. Unfortunately, it is again an advertorial, creating more haziness around modern PLM than adding value.

People need education, and Engineering.com is/was a respected site for me, as they state in their Engineering.com/about statement:

Valuable Content for Busy Engineers. Engineering.com was founded on the simple mission to help engineers be better.

Unfortunately, this is not the case in the PLM domain anymore. In June, we saw an article related to the failing PLM migration at Ericsson – see The PLM migration dilemma. Besides the fact that a big-bang migration had failed at Ericsson, the majority of the article was based on rumors and suggestions, putting the sponsor of this article in a better perspective.

Of course, Engineering.com needs sponsoring to host their content, and vendors are willing to spend marketing money on that. However, it would be fairer to mention in a footnote who sponsored the article – although for each article you can guess. More sincere editors or bloggers mention the sponsoring that might have influenced their opinion.

Now, why did the article PTC puts the Needle to the Digital Thread make me react?

Does a visionary CEO pay off?

It can be great to have a visionary CEO; however, do they make the company and its products/services more successful? For every successful visionary CEO, there are perhaps ten failing ones whose vision the stock market or their customers did not catch.

There is no lack of PLM vision, as Peter Bilello mapped in 2014 when imagining the gaps between vision, available technology, and implementations at companies (leaders and followers). See below:

The tremendous gap between vision and implementations is the topic that concerns me the most. Modern PLM is about making data available across the enterprise or even across the company’s ecosystem. It is about data democratization that allows information to flow and to be presented in context, without the need to recreate this information again.

And here the marketing starts. Verdi writes:

PTC’s Internet of Things (IoT), Industrial Internet of Things (IIoT), digital twin and augmented reality (AR) investments, as well as the collaboration with Rockwell Automation in the factory automation arena, have definitely placed the company in a leading position in digital product realization, distribution and aftermarket services

With this marketing sentence, we are eager to learn why

“With AR, for example, we can improve the quality control of the engines,” added Volvo Group’s Bertrand Felix, during an on-stage interview by Jim Heppelmann. Heppelmann then went down to a Volvo truck with the engine lifted out of its compartment. Using a tablet, he was able to show how the software identified the individual engine, the parts that were included, and he could also pick up the 3D models of each component and at the same time check that everything was included and in the right place.

Impressive – is it real?

The point is that this is the whole chain for digital product realization–development and manufacturing–that the Volvo Group has chosen to focus on. Sub-components have been set up that will build the chain, much is still in the pilot stage, and a lot remains to be done. But there is a plan, and the steps forward are imminent.

OK, so it is a pilot, and a lot remains to be done – but there is a plan. I am curious about the details of that plan, as a little later, we learn from the CAD story:

The Pro/ENGINEER "inheritor" Creo (engine, chassis) is mainly used for CAD and creation of digital twins, but as previously noted, Dassault Systèmes' CATIA is also still used. Just as in many other large industrial organizations, Autodesk's AutoCAD is also represented for simpler design solutions.

There goes the efficient digital dream. Design data coming from CATIA needs to be recreated in Creo for digital twin support. Data conversion or recreation is an expensive exercise and needs to be reliable and affordable as the value of the digital twin is gone once the data is incorrect.

In a digital enterprise, you do not want silos working with their own formats; you want a digital thread based on (neutral) models that share metadata/parameters from design to service.

So I dropped the article and noticed Oleg had already commented faster than me in his post: Does PLM industry need a visionary pageant? Oleg also refers to CIMdata, who confirmed in 2018 that the concept of a product innovation platform (PIP), the "beyond PLM," is far from reality in companies. Most of the time, a PLM implementation is mainly a "beyond PDM" environment, not really delivering product data downstream.

I am wholly aligned with Oleg's technical conclusion:

What is my(Oleg’s) conclusion? PLM industry doesn’t need another round of visionary pageants. I’d call democratization, downstream usage and openness as biggest challenges and opportunities in PLM applications. Recent decades of platform development demonstrated the important role network platforms played in the development of global systems and services. PLM paradigm change from isolated vertical platforms to open network services required to bring PLM to the next level. Just my thoughts..

My comments to Oleg’s post:

(Jos) I fully agree we do not need more visionary PLM pageants. It is not about technology and therefore I have to disagree with your point about Aras. You call it democratization and openness of data a crucial point – and here I agree – be it that we probably disagree about how to reach this – through standards or through more technology. My main point to be made (this post ) is that we need visionary companies that implement and rethink their processes and are willing to invest resources in that effort. Most digital transformation projects related to PLM fail because the existing status quo/ middle management has no incentive to change. More thoughts to come

And this is the central part of my argumentation – it is not about technology (only).

Organizational structures are blocking digital transformation

Since 2014, I have been following several larger manufacturing companies on their path from pushing products to the market in a linear mode towards being a customer-driven, more agile, fast-responding enterprise. As this is done by taking advantage of digital technologies, we call this process digital transformation.

(image depicting GE’s digital thread)

What I have learned from these larger enterprises – take Volvo Trucks and GE as examples – is that there is a vision for an end result. For GE, it is the virtual twin of their engines, monitored and improved through their Predix platform. For Volvo Trucks, we saw the vision in the quote from Verdi's article above.

However, these companies are failing in creating a horizontal mindset inside their organizations. Data can only be used efficiently downstream if there is a willingness to collect the relevant data upstream and deliver this information in an accessible, preferably data-driven, format.

The Middle Management Dilemma

And this leads to my reference to middle management. Middle managers learn about the C-level vision and are pushed to make this vision happen. However, they are measured and driven to solve these demands mainly within their own division or discipline. Yes, they might create goodwill for others, but when it comes to money spent or changing people's responsibilities, the status quo will remain.

I wrote about this challenge in The Middle Management dilemma. Digital transformation, of course, is enabled by digital technologies, but that does not mean the technology creates the transformation. The crucial part lies in making companies more flexible in their operations while establishing better and new contacts with customers.

It is interesting to see that the future of businesses lies in agile, multidisciplinary teams that can deliver incremental innovations to the company's portfolio – somehow going back to the startup culture inside a larger enterprise. Having worked with several startups, you see the outcome focus of the whole in the beginning – everyone contributes. Then, when the size of the company grows, middle management is introduced, and most likely silos are created, as middle management gets its own profit & loss targets.

Digital Transformation myths debunked

This week Helmut Romer (thanks, Helmut) pointed me to the following HBR article: Digital does not need to be disruptive, where the following myths are debunked:

  1. Myth: Digital requires radical disruption of the value proposition.
    Reality: It usually means using digital tools to better serve the known customer need.
  2. Myth: Digital will replace physical.
    Reality: It is a "both/and."
  3. Myth: Digital involves buying start-ups.
    Reality: It involves protecting start-ups.
  4. Myth: Digital is about technology.
    Reality: It's about the customer.
  5. Myth: Digital requires overhauling legacy systems.
    Reality: It's more often about incremental bridging.

If you want to understand these five debunked myths, take your time to read the full article; it is very much aligned with my argumentation, albeit with my focus more on the PLM domain.

Conclusions

Vendor sponsoring at Engineering.com has not improved the quality of their PLM articles and creates misleading messages, especially as the sponsor is not mentioned and the sponsor is selling technology – the gap between vision and reality is too big to compete around a vision.

Transforming companies to take advantage of new technologies requires an end-to-end vision and mindset based on achievable, incremental learning steps. The way your middle management is managed and measured needs to be reworked, as the focus shifts to horizontal flow and the understanding of customer/market-oriented processes.

 

Three weeks ago, I closed my PLM-twisted mind for a short holiday. Meanwhile, some interesting posts appeared about the PLM journey.

  • Is it a journey?
  • Should the journey be measurable?
  • And what kind of journey could you imagine?

Together, these posts formed a basis for a decent discussion amongst the readers. I like these discussions. For me, the purpose of blogging is not the same as tweeting. It is not about just making noise so others will chime in or react (tweeting); it is about sharing an opinion, and if more people are interested, the discussion can start. And a discussion is not about right or wrong, as many conversations nowadays happen to be; it is about learning.

Let’s start with the relevant posts.

How to measure PLM?

The initial discussion started with Oleg Shilovitsky’s post about the need to measure the value of PLM. As Oleg mentions in his comments:

“During the last decades, I learned that every company that measured what they do was winning the business and succeeded (let’s count Google, Amazon, etc ..)”

This is an interesting statement: just measure! It is the motto people use for digital businesses, in particular the fast-moving software business. Sounds great, so let's measure PLM. What can we measure with PLM? Oleg suggests as an example:

“Let’s say before PLM implemented a specific process, sales needed 2 days to get a quote. After PLM process implementation, it is 15 min.”

So what does this result tell us? Your sales team can now produce 64 times more quotes (two working days ≈ 960 minutes; 960 / 15 = 64). Do we need fewer salespeople now? From this KPI, we do not know what the real value for the company is. This is because there are so many other dependencies related to this process, and that makes PLM different from, for example, ERP. It is not about optimizing a single process, as Oleg might suggest below:

“Some of my PLM friends like to say – PLM is a journey and not some kind of software. Well, I’m not sure to agree about “journey,” but I can take PLM as a process. A process, which includes all stages of product development, manufacturing, support, and maintenance.”

Note: I do not want to be picky on Oleg, as he provokes us all many times with just his thoughts. Moreover, several of them are good points for discussion. So please dive into his LinkedIn posts and follow the conversation.

In Oleg's follow-up post on measuring the value, he continued with Can we measure the PLM-journey, which summarizes the comments from the previous post with a kind of awkward conclusion:

What is my conclusion? It is a time for PLM get out of old fashion guessing and strategizing and move into digital form of thinking – calculating everything. Modern digital businesses are strongly focused on the calculation and measurement of everything. Performance of websites, metrics of application usage, user experience, efficiency, AB testing of everything. Measurement of PLM related activity sounds like no brainier decision to me. Just my thoughts…

I think all of us agree that there needs to be a kind of indicative measurement in place to justify investments. There must be expected benefits that solve current business problems or bottlenecks.

The points I want to share with you are:

  • It is hard to measure non-comparable ways of working – how do you measure collaboration?
  • Do you know what to measure? – engineering/innovation is not an ERP process
  • People and culture have so much impact on the results – how do you measure your company’s capability to adapt to new ways of working?

Meanwhile, we continue our journey…

Is PLM a never-ending journey?

I assume it was in the context of the discussion related to the PLM journey that Chad Jackson from Lifecycle Insights added his three minutes of thoughts. You can watch the video here:

Vlogging seems to be becoming more prevalent in the US. The issue for me is that vlogs only touch the surface and are hard to scan for interesting reusable content – something you miss when you are an experienced speed-reader. I like written content, as it is easier to pick and share relevant pieces, like what I am doing now in this post.

Chad states that as long as PLM delivers quantified value, PLM could keep expanding. This sounds like a journey, and I can align here. The only additional thought I would like to add is that it is not necessarily about expanding all the time; it is also about continuous change in the world and therefore in your organization. So instead of expanding, there might be a need to do things differently: Have you noticed PLM is changing.

Next, Chad mentions organizational fatigue. I understand the point – our society and business are currently changing extremely fast, which causes people to long for the past. A typical behavior I observe everywhere: in the past, everything was better. However, if companies went back and operated like in the past, they would be out of business. We moved from the paper drawing board to 3D CAD, managing it through PDM and PLM, to remain significant. So there is always a journey.

Fatigue comes from choosing the wrong directions and having a reactive culture – instead of being inspired and motivated to reach the next stage, the current stage already causes so much stress. Due to the reactive culture, people cannot imagine a better future – they are too busy. I believe it is culture and inspiration that make companies successful – not just measuring. For the consequences of avoiding change, think about the boiling-frog metaphor, and you will see what I mean.

 

Upgrading to PLM when PDM falls short

At the same time, Jim Brown from Tech-Clarity published a PTC-sponsored eBook, Upgrading to PLM when PDM falls short, in which, as he states:

This eBook explains how to recognize that you’ve outgrown PDM and offers several options to find the data and process management capabilities your company needs, whether it’s time to find a more capable PDM or upgrade to PLM. It also provides practical advice on what to look for in a PLM solution, to ensure a successful implementation, and in a software partner.

Jim mentions various business drivers that can drive this upgrade path. Enlarge the image on the left. I challenge all the believers in measurable digital results to imagine which KPIs they would use and how these can be related to pure PLM.

Here, the upgrade process aims at replacing PDM with PLM – something PLM vendors like: immediately a significant number of licenses for the same basic PDM functionality. For your company, this is hard to justify, as there is no additional value.

In many situations, I have seen this type of PDM upgrade project become an advanced PDM project – not PLM. The new PLM system was introduced in the engineering department and became an even bigger silo than before, as other disciplines/departments were not willing to work with this new "monster" and preferred their own systems. They believe that PLM is a system to be purchased and implemented, which kills a real PLM strategy.

Therefore, I liked Oleg Shilovitsky's post: 3 Reasons for Not Growing Existing PDM Into the Full PLM System. While Oleg's points were probably more technology-driven, the value of this post was extended in the discussion. It became a discussion with various people and different opinions, one I would have liked to have in real time. The way LinkedIn filters/prioritizes comments makes it hard to get a chronological view of the discussion.

Still, if you are interested and have time for a puzzle, follow this discussion and add your thoughts.

Conclusion

During my holidays, there was a vivid discussion related to the PLM value and journey. Looking back, it is clear we are part of a PLM journey. Some do not take part in the journey and keep hanging on to the past; those who understand the journey all see different points of interest – the characteristic of a journey.

In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting the use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities need to be developed on top of a customizable infrastructure, providing more flexibility. I believe this topic has been debated for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available, or it was way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly aerospace/defense) or home-grown BOM systems (automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, there was the first development of OOTB PLM. There was Agile (later acquired by Oracle), focusing on the high-tech and medical industries. Instead of document management, they focused on the scenario of bringing the BOM from engineering to manufacturing, based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers, whose major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling point towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers another implementer as a competitor and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same currently happens in the Aras channel – it might be called open source; however, probably only the high-level infrastructure is.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement, if you wanted it to be.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say no to a customer if you can say yes?

OOTB and modules

Initially, for the leading PLM vendors, their mid-market templates were not just aiming at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant the PLM vendors had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior; otherwise, the value of the module would not be relevant. And this causes a challenge: the more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management – people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), nor by the companies (they do not have the free resources for that). Precisely the principles behind the PLM blame game. A toy sketch of what such a module bundles follows below.
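To make "data model capabilities combined with business logic/behavior" concrete, here is a hypothetical sketch, not any vendor's actual module: a small data model (an ECR with a state) plus embedded behavior (the allowed state transitions users must follow):

```python
from enum import Enum

# Toy sketch: a "module" bundles a data model (the ECR class) with
# business logic (the allowed state transitions). Hypothetical only.
class ECRState(Enum):
    DRAFT = "draft"
    UNDER_REVIEW = "under review"
    APPROVED = "approved"
    REJECTED = "rejected"

ALLOWED = {
    ECRState.DRAFT: {ECRState.UNDER_REVIEW},
    ECRState.UNDER_REVIEW: {ECRState.APPROVED, ECRState.REJECTED},
}

class ECR:
    def __init__(self, title: str):
        self.title = title
        self.state = ECRState.DRAFT

    def move_to(self, new_state: ECRState):
        # The embedded business logic the company has to adapt to.
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"{self.state.value} -> {new_state.value} not allowed")
        self.state = new_state

ecr = ECR("Replace bolt supplier")
ecr.move_to(ECRState.UNDER_REVIEW)
ecr.move_to(ECRState.APPROVED)
print(ecr.title, "->", ecr.state.value)
```

The embedded transitions are exactly the "generic practice" the company has to adapt to – convenient if they match your process, painful business change management if they do not.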

OOTB modularity advantages

The first advantage of modularity in PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, in reality, they are sometimes separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extensibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed in the future, for example, digital twins, AR/VR, and model-based ways of working. Skeptics might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen through trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages are advantages in the toolkit approach and are therefore discussed in the next section. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often, modules are developed based on a few leading customers' experiences (the big ones), while the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB situation. SMB implementations are often not visible at the PLM vendor's R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey. You start with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, dive deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you have the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not immediately provide an innovative way of working. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations; the implementation and methodology for this innovative way of working need to come from either your company's resources or your implementer's skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don't share your IP and best practices if you want to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices is crucial.

Sometimes I came into situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.
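As a small, hypothetical illustration of why such algorithm and data-model choices matter, independent of any specific PLM tool: expanding a multi-level BOM with one database round trip per item behaves very differently from batching one query per level. In the sketch below, a plain dict stands in for the database, and no vendor API is assumed:

```python
# Hypothetical multi-level BOM: parent item -> list of child items.
# In a real PLM system each lookup would be a database query,
# so the number of round trips dominates the cost.
children = {
    "product": ["asm-1", "asm-2"],
    "asm-1": ["part-a", "part-b"],
    "asm-2": ["part-b", "part-c"],
}

def expand_naive(item):
    """One 'query' per item: round trips grow with the number of items."""
    result = []
    for child in children.get(item, []):
        result.append(child)
        result.extend(expand_naive(child))
    return result

def expand_batched(item):
    """One 'query' per BOM level: a handful of round trips, even for
    structures with thousands of items."""
    result, level = [], [item]
    while level:
        next_level = [c for parent in level for c in children.get(parent, [])]
        result.extend(next_level)
        level = next_level
    return result

print(expand_naive("product"))
print(expand_batched("product"))
```

Both return the same components; on a three-level toy BOM the difference is invisible, but on an enterprise-scale structure, the naive version is exactly the kind of design decision that gets the tool blamed later.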

Experience: when you plan to implement PLM "big" with a toolkit approach, experience becomes crucial, as the initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years if design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as implementing PLM is not their core business. Experienced PLM implementers vary in size and skills – make the right choice.

 

Conclusion

After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The deciding points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.

 

 

 

After my previous post about the PLM migration dilemma, I had several discussions with peers in the field about why this PLM bad news creates so much debate. For every PLM vendor, I could publish a failure story if I wanted. However, the reality is that the majority of PLM implementations do not fail.

Yes, they can cause discomfort or friction in an organization, as implementing the tools often forces people to work differently. And often, working differently is not anticipated by the (middle) management and therefore causes a mismatch in the people, process & tools paradigm.

So we love bad news in real life. We talk about terrorism while, meanwhile, large numbers of people are dying from guns, cars, and even the biggest killer: mosquitos. Fear stories sell better than success stories, and in the world of PLM vendors in particular, every failure of the competition is enlarged. However, there are more actors involved in a PLM implementation, and if PLM systems were that bad, they would not exist anymore and would have been replaced by ………?

Who to blame – the vendor?

Of course, it is easiest to blame the vendor, as their marketing promises to solve all problems. However, when you look from a distance at the traditional PLM vendor community, you see they are in a rat race to deliver the latest and greatest technology ahead of their competition, often driven by some significant customers.

Their customers buy the vision and expect it to be ready and industrialized, which is not the case – look at the digital twin hype or AI (Artificial Intelligence). Released PLM software does not have the same maturity as office applications. Office applications do not innovate so much, have thousands of users during a beta cycle, and have no dependency on processes.

Most PLM vendors are happy when a few customers jump on their latest release, combined with the fact that implementing the most recent version is not yet a push of a button. This might change in the long term if PLM vendors can deliver cloud-based solutions.

PLM implementations within the same industry might look the same but often vary a lot due to existing practices, which will not change due to the tool – so there is a need for customization or configuration.

PLM systems with strong business rules inside their core might develop more and more towards configuration, whereas PLM toolkit-like systems might focus on ease of customization. Both approaches have their pros and cons (in another blog post, perhaps).

Another reason to blame the vendor is a lack of openness. You hear it in many discussions: if vendor X were open, they would not lock the data – a typical marketing slogan. If PLM vendors were completely open, to which standards should they adhere? Every PLM vendor has its preferred collection of tools – if you stay within their portfolio, you have a minimum of compatibility or interface issues.

This logic already started with SAP in the previous century. For PLM vendors, there is no business model for openness. For example, the SmarTeam APIs for connecting and extracting data are available free of charge, leading to no revenue for the vendor and significant revenue for service providers. Without any license costs, they can build any type of interface/solution. In the end, when the PLM vendor has no sustainable revenue, the vendor will disappear, as we saw between 2000 and 2010, when several stand-alone PLM systems disappeared.

So yes, we can blame PLM vendors for setting impossible expectations – coming to realistic expectations related to capabilities and openness is probably the biggest challenge.

Who to blame – the implementer?

The second partner in a PLM implementation is the implementation partner, often a specialized company related to the PLM vendor. There are two types of implementation partners – the strategic partners and the system integrators.

Let’s see where we can blame them.

Strategic partners, the consultancy firms, often have a good relationship with the management; they help the company shape the future strategy, including PLM. You can blame this type of company for their lack of connection to the actual business: what is the impact on the organization of implementing a specific strategy, and what does this mean for current or future PLM?

Strategic partners should be the ones to support business change management, as they are likely to have experience with other companies. Unfortunately, these companies often do not have significant PLM skills, as the PLM domain is just a small subset of the whole potential business strategy.

You can blame them for being useful in building a vision/strategy but failing to create a consistent connection to the field.

Implementation partners, the system integrators, are most of the time specialized in one or two PLM vendors' software suites, although the smaller the implementation partner, the less broad their implementation skills. These implementation partners sometimes have built their own PLM best practices for a specific vendor and use this as a sales argument. Others just blindly follow what the vendor promotes or what the customer asks for.

They will do anything you request, as long as they get paid for it. The larger ones have loads of resources for offshore deliveries – the challenge you see here is that it might look cheap; however, it becomes expensive if there is no apparent convergence of the deliverables.

As I mentioned before, they will never say no to a customer and claim to fill all the "gaps" there are in the PLM environment.

You can blame implementation partners for focusing on making money from services. And they are right: to remain in business, a company needs to be profitable. It is like lawyers; they will invoice you based on their efforts. And the less you take on your own plate, the more they will do for you.

The challenge for both consultancy partners and system integrators is to find a balance between experienced people, who really make it happen, and educating juniors to become experts too. Often, the customer pays for the education of these juniors.

Who to blame – your company?

If your company is implementing PLM, then probably the perception is that you made all the effort to make it successful. You followed the advice of the strategic consultants, you selected the best PLM vendor and system integrator, you created a budget – so what could go wrong?

This all depends on your company’s ambition and scope for PLM.

Implementing the as-is processes

If your PLM implementation is just there to automate existing practices and store data in a central location, this might work out. And this is most of the time the case when PLM implementations are successful. You know what to expect, and your system integrator knows what to expect.

This type of project can run close to budget, and some system integrators might be tempted to offer a fixed price. I am not a fan of fixed-price projects, as you never know exactly what needs to be done. The system integrator might raise the target price by 20–40% to cover their risk, or you as a company might select the cheapest bid – another guarantee for failure. A PLM implementation is not a one-time project; it is an ongoing journey. Therefore, your choice needs to be sustainable.

My experience with this type of implementation is that it is easy to blame the companies here too. Often, the implementation becomes an IT project, as business people are too busy running their day-to-day jobs and therefore only incidentally support the PLM project. The result is that at a certain moment, the users confronted with the system do not feel connected to the new system – it was better in the past. In particular, configuration management and change processes can become watertight, leaving no freedom for the users. Then the blaming starts – first the software, then the implementer.

But what if you have an ambitious PLM project as part of a business transformation?

In that case, the PLM platform is just one of the elements to consider. It will be the enabler for new ways of working, enabling customer-centric processes, multi-discipline collaboration, and more – all related to a digital transformation of the enterprise. Therefore, I say PLM platform instead of PLM system. Future enterprises run on data through connected platforms. The better you can connect your disciplines, the more efficiently and faster your company will operate. This, as opposed to the coordinated approach, which I have addressed several times in the past.

A business transformation requires an end-to-end understanding of what to change – from the management vision down to the execution in the field. And as there is no out-of-the-box template for business transformation, it is crucial that a company experiments, evaluates, and, when successful, scales up new habits.

Therefore, it is hard to define upfront all the effort for the PLM platform and the implementation resources. What is sure is that your company is responsible for that, not an external party. So if it fails, your company is to blame.

Is everyone to blame?

You might have the feeling that everyone is to blame when a PLM implementation fails. I believe that is indeed the case. If you know in advance where all players have their strengths and weaknesses, a PLM implementation should not fail but be balanced with the right resources. Depending on the scope of your PLM implementation – is it a consolidation or a transformation? – you should make sure all stakeholders participate in the anti-blame game.

The anti-blame game is an exercise where you make sure that the other parties in the game cannot blame you.

  • If you are a vendor – do not overcommit.
  • If you are a consultant or system integrator – learn to say NO.
  • If you are the customer – make sure enough resources are assigned; you own the project. It is your project/transformation.

This has been my job several times in the past, when I was asked to mediate in a stalling PLM implementation. Most of the time, it had become a blame game, missing the target of finding a solution that makes sense. Here, coaching from experienced PLM consultants makes sense.

 

Conclusion

Most of the time, PLM implementations are successful if the scope is well understood and not transformative. You will not hear a lot about these projects in the news as we like bad news.

To avoid bad news, challenging PLM implementations should make sure all parties involved challenge the others to remain realistic and to invest enough. The role of an experienced external coach can help here.

 

 

After two reposts, I finally have the ability to write at full speed again, and my fingers were aching after reading some postings in the past four weeks. It started with Verdi Ogewell's article on Engineering.com, Telecom Giant Ericsson Halts Its PLM Project with Dassault's 3DEXPERIENCE, followed by an Aras blog post, Don't Be a Dinosaur, from Mark Reisig, and of course, I would say, Oleg Shilovitsky's post: What to learn from Ericsson PLM failure?

Setting the scene

Verdi's article is quite tendentious, based on outside observations and insinuations. I let you guess who sponsored this article. If I had to write an article about this situation, I would state:

Ericsson and Dassault failed to migrate the old legacy landscape into a new environment – an end-to-end migration appeared to be impossible.

The other topics mentioned are not relevant to the current situation.

Mark chimes in on Verdi's version of the truth and on points not relevant to data migration, suggesting PLM is chosen over dinner. Of course, decisions are not that simple. It is not clear from Mark's statement who the dinosaurs are:

Finally, don’t bet your future on a buzzword. Before making a huge PLM investment, take the time to make sure your PLM vendor has an actual platform. Have them show you their spider chart.  And here’s the hard reality: they won’t do it, because they can’t.

Don’t be a dinosaur—be prepared for the unexpected with a truly resilient digital platform.

I would state, “Don’t bet your future on a spider chart” if you do not know what the real problem is.

 

Finally, Oleg's post is more holistic, acknowledging that a full migration might not be the right target, and I like his conclusion:

Flexibility Vs. Out of the box products – which one do you prefer? Over-customize a new PLM to follow old processes? To use a new system as an opportunity to clean existing processes? To move 25,000 people from one database to another is not a simple job. It is time to think about no upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT together with the PLM vendor to run a migration path. The last one is a costly step. But… without this step, the current database oriented single-version of truth PLM paradigm is doomed.

The Migration Problem

I believe migration of data – and sometimes the impossibility of data migration – is the biggest elephant in the room when dealing with PLM projects. In 2015, during the PI PLM conference in Düsseldorf, I addressed this topic for the first time: The Challenge of PLM Upgrades.
You can find the presentation on SlideShare here.

I shared an example similar to the Ericsson case from almost 10 years ago. At that time, one of the companies I was working with wanted to replace their mainframe application, which was managing the configuration of certain airplanes. The application managed the aircraft configuration structures in tables, where needed pointing to specifications in a document repository. The two systems were not connected; integrity was guaranteed through manual verification procedures.

The application was considered the single version of the truth and had been treated like that for decades. The reason for migration was that all the knowledge of the application was disappearing: the tables were documented, but the logic was not. Besides this issue, the maintenance costs for the mainframe were high – vendor lock-in existed at that time too.

The idea was to implement SmarTeam – flexible data model, rapid deployment based on Windows technology – to kill two birds with one stone: the latest Microsoft technology and, at the same time, a direct link to the controlled documents. As they were using CATIA V5, the SmarTeam integration was a huge potential benefit. For the migration of data, the estimate was two months. What could go wrong?

Well, technically, almost nothing went wrong. The challenge was to map the relational tables to the objects in the SmarTeam data model. And as the relational tables contained a mix of document and item attributes, splitting these tables was not always easy. Sometimes the same properties appeared with different values in the original table – which one was the truth? The migration took almost two years, also due to the limited availability of the last knowledgeable resource who could explain the logic.
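To make this mapping challenge tangible, here is a minimal sketch in Python. All table names, attribute names, and the split rule are hypothetical – this is not the actual mainframe or SmarTeam schema – but it shows where the "which one was the truth?" question surfaces:

```python
from collections import defaultdict

# Hypothetical legacy rows that mix item and document attributes in one table.
legacy_rows = [
    {"part_no": "A-100", "description": "Bracket", "doc_id": "D-1", "doc_rev": "B"},
    {"part_no": "A-100", "description": "Bracket, left", "doc_id": "D-2", "doc_rev": "A"},
    {"part_no": "B-200", "description": "Hinge", "doc_id": "D-3", "doc_rev": "A"},
]

ITEM_ATTRS = {"part_no", "description"}  # attributes destined for the item object
DOC_ATTRS = {"doc_id", "doc_rev"}        # attributes destined for the document object

items, documents = {}, {}
conflicts = defaultdict(set)

for row in legacy_rows:
    item = {k: v for k, v in row.items() if k in ITEM_ATTRS}
    doc = {k: v for k, v in row.items() if k in DOC_ATTRS}

    key = item["part_no"]
    if key in items and items[key] != item:
        # The same part appears with different attribute values - which is the truth?
        for attr in ITEM_ATTRS:
            if items[key].get(attr) != item.get(attr):
                conflicts[key].add(attr)
    else:
        items[key] = item

    documents[doc["doc_id"]] = doc  # a real migration would also link doc to item

print(f"{len(items)} items, {len(documents)} documents")
print("conflicts needing a knowledgeable human:", dict(conflicts))
```

The conflict list is exactly the part that cannot be automated: someone who knows the original logic has to decide, and that resource was scarce.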

After the conversion, the question still remained: was the migrated data accurate? Perhaps 99%?
But what if the inaccurate part was critical? For this company, the data was significant, but not mission-critical like at Ericsson, where a lot of automation and rules are linked together across loads of systems.
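The accuracy question itself can at least be made measurable. A simple sketch, assuming – optimistically – that both the legacy system and the new system can export the same key/attribute pairs for comparison:

```python
def reconcile(source: dict, migrated: dict) -> float:
    """Compare attribute values per key and return the percentage that matches."""
    keys = set(source) | set(migrated)
    matches = sum(1 for k in keys if source.get(k) == migrated.get(k))
    return 100.0 * matches / len(keys) if keys else 100.0

source = {"A-100": "Bracket", "B-200": "Hinge", "C-300": "Pin"}
migrated = {"A-100": "Bracket", "B-200": "Hinge, steel"}  # one changed, one missing

print(f"{reconcile(source, migrated):.1f}% of the records match")  # 33.3% here
```

Even a reassuring percentage does not answer the real question: are the mismatches in the records that matter?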

So my point: Dassault failed at Ericsson, and so would Siemens, Aras, or any other PLM vendor, as the migration issue is not in the technology – we should stop thinking about this kind of migration.

Who are the dinosaurs?

Mark is, in a way, suggesting that when you use PLM software from the "old" PLM vendors, you are a dinosaur. Of course, this is a great marketing message, but the truth is that the PLM vendor is not to blame. Yes, some create more friction than others in certain situations, but in my opinion, there is no ultimate single PLM vendor.

Have a look at the well-known Daimler case from some years ago, which made the news because Daimler decided to replace CATIA with NX. Not because NX was superior – it was about maintaining the PLM backbone Smaragd, which would be hard to replace. Even in 2010, there was already the notion that an existing data management infrastructure is hard to replace. See a more neutral article about this topic from Monica Schnitger if you want: Update: Daimler chooses NX for Smaragd. Here too, in the end, it became a complete Siemens account for compatibility reasons.

When you look at the significant wins Aras mentions in their customer base – GM, Schaeffler, or Airbus – you will probably discover Aras is more the connection layer between legacy systems, old PLM, or PDM systems. They are not the new PLM replacing the old PLM. A connection layer creates a digital thread, connecting various data sources for traceability, but it does not provide digital continuity, as the data in the legacy systems remains untouched. Still, it is an intermediate step towards a hybrid environment.

For me, the real dinosaurs are those large enterprises that implemented their proprietary PLM environments in the previous century and built a fully automated infrastructure based on custom data models with a lot of proprietary rules. This was the case at Ericsson, but most traditional automotive and aerospace companies share this problem, as they were the early PLM adopters. And they are not the only ones. Many industrial manufacturing companies suffer from the past, in contrast to their Asian competitors, who can start with less legacy.

What’s next?

It would be great if the PLM community focused more on the current incompatibility of data between current/past concepts and future digital needs, and discussed solution paths (for sure, standards will pop up).

Incompatibility means: do not talk about migration, but rather focus on a hybrid landscape with legacy data, managed in a coordinated manner, and modern, growing digital PLM processes based on a connected approach.

This is the discussion I would like to see, instead of vendors claiming that their technology is the best. None of the vendors will talk about this topic – the old "rip-and-replace" approach is what brings the most software revenue, combined with the simplification that there is only OnePLM. It is interesting to see how many companies have some kind of OnePLM or OneXXX statement.

The challenge, of course, is to implement a hybrid approach. To have the two different PLM concepts work together, there is a need to create a reliable overlap. That reliable overlap can come from an enterprise data governance approach, if possible based on a normalized PLM data model. So far, all PLM vendors I know have proprietary data models; only ShareAspace from Eurostep is based on the PLCS standard, but their solutions are most of the time part of a larger PLM infrastructure (the future!).
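To illustrate what a minimal form of such a reliable overlap could look like: a neutral, governed identifier that both the legacy environment and the connected platform reference. The sketch below assumes a simple key-mapping registry – the real PLCS information model is far richer than this:

```python
from dataclasses import dataclass, field

@dataclass
class NeutralPart:
    """One governed, system-independent identity for a part."""
    neutral_id: str                                 # owned by enterprise data governance
    system_ids: dict = field(default_factory=dict)  # system name -> local id

registry: dict[str, NeutralPart] = {}

def register(neutral_id: str, system: str, local_id: str) -> None:
    part = registry.setdefault(neutral_id, NeutralPart(neutral_id))
    part.system_ids[system] = local_id

def resolve(neutral_id: str, system: str) -> str:
    """Follow the overlap from the neutral id to one system's local id."""
    return registry[neutral_id].system_ids[system]

# The same physical part, known under different ids in the old and new landscape.
register("NP-0001", "legacy_pdm", "ITEM-778899")
register("NP-0001", "digital_platform", "prt:4f2a")

print(resolve("NP-0001", "legacy_pdm"))        # ITEM-778899
print(resolve("NP-0001", "digital_platform"))  # prt:4f2a
```

The point of this design is that neither system owns the identity – the governance layer does – which is what makes a hybrid, coordinated-plus-connected landscape manageable.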

To conclude: I look forward to discussing this topic with other PLM peers who are really in the field, discovering and understanding the chasm between the past and the future. Contact me directly or join us at PLM Roadmap and PDT Europe, 13-14 November in Paris. Let's remain fact-based!
(As a matter of fact, you can still contribute – the call for papers is still open.)

 

 

 

Unfortunately, one more time an old post, with some new comments in green, as I am not yet able to type at regular speed. I promise this will be the last reprise, as I am sure that one week from now I will be double-handed again. The reason I chose this six-year-old post is that the topic is still current; however, at that time, digital transformation was not yet in fashion for PLM.

If you look at the comments to the article at that time (Feb 2013), you will see some well-known names and behaviors. What I can state for the moment: there are still people doubting there is a need for PLM, there are still people blaming technology for the lousy perception of PLM, and there is a large group of silent companies out there that have implemented the basics of PLM – perhaps not as advanced as vendors/consultants have suggested – and they are reaping the benefits.

The main question in upcoming blog posts: "Is this enough?" Happy rereading!

How come PLM is boring? – Feb 2013

PLM is a popular discussion topic in various blogs, LinkedIn discussion groups, PLM vendor websites, and at the upcoming Product Innovation Congress in Berlin. I look forward to the event, to meeting attendees and discussing their experiences and struggles to improve their businesses using PLM. (Meanwhile, PI PLMx London has passed – for a review, look here: The weekend after PI PLMx London 2019)

On the other hand, talking about pure PLM becomes boring. Sometimes it looks like PLM is a monotheistic topic:

  • “What is the right definition of PLM ?” (I will give you the right one)
  • “We are the leading PLM vendor” (and they all are)
  • A PLM system should be using technology XYZ (etc., etc.)
  • Digital Transformation and IoT have come into the picture now

Some meetings with customers in the past three weeks and two different blog posts I read recently made me aware of this ambiguity between boring and fun.

PLM dictating Business is boring

Oleg Shilovitsky's sequence of posts (and comments), starting with A single bill of materials in 6 steps, was an example of the boring part. (Sorry Oleg, as you publish so many posts, there are many that I like and some I can use as an example.) When reading the BOM-related posts, I noticed they are a typical example of an IT or academic view on PLM, in particular on the BOM topic.

Will these posts help you after reading them? Do they apply to your business? Or do you feel more confused, as a prolific PLM blogger makes you aware of all the different options and makes you think you should use a single bill of materials?

I learned from my customers, and from coaching and mediating hundreds of PLM implementations, that the single BOM discussion is one of the most confusing and complicated topics – for sure when you address it from the IT perspective.

The customer might say:
“Our BOM is already in ERP – so if it is a single BOM, you know where it is – goodbye !”.

A different approach is to start looking for the optimal process for this customer, addressing the bottlenecks and pains they currently face. It will be no surprise that PLM best practices and technology are often the building blocks for the considered solution. Whether it will be a single BOM or a collection of structures evolving through time depends on the situation, not on the ultimate PLM system.
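As an illustration of "a collection of structures evolving through time", here is a deliberately oversimplified sketch: one set of part masters, referenced by separate engineering and manufacturing structures. The names are made up, and no vendor's data model is implied:

```python
# One set of part masters, several structures referencing them - a collection
# of structures rather than one monolithic single BOM.
parts = {
    "P-10": "Pump assembly",
    "P-11": "Housing",
    "P-12": "Impeller",
    "P-90": "Packaging box",  # only relevant for manufacturing
}

ebom = {"P-10": ["P-11", "P-12"]}          # engineering view: how it is designed
mbom = {"P-10": ["P-11", "P-12", "P-90"]}  # manufacturing view: how it is built

def explode(bom: dict, part: str, level: int = 0) -> None:
    """Print an indented multi-level explosion of one structure."""
    print("  " * level + parts[part])
    for child in bom.get(part, []):
        explode(bom, child, level + 1)

explode(ebom, "P-10")  # the engineering structure
explode(mbom, "P-10")  # the manufacturing structure of the same part masters
```

The shared element here is the part master, not a single BOM: each discipline keeps its own structure while pointing to the same parts.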

Note: Meanwhile, Oleg has further materialized his thinking through OpenBOM, and he has not lost his speed of publishing.

Business dictating PLM is fun

Therefore, I was happy to read Stephen Porter's opinion and comments in The PLM state: Penny-wise Pound Foolish Pricing and PLM (unfortunately, this post has disappeared), where he passes a message similar to mine from a different starting point: the pricing models of PLM vendors. My favorite part is his conclusion:

A PLM decision is typically a long term choice so make sure the vendor and partners have the staying power to grow with your company. Also make sure you are identifying the value drivers that are necessary for your company’s success and do not allow yourself to be swayed by the trendy short term technology

Management in companies can become confused, starting to think they just need PLM because they hear from the analysts that it improves business. They first need to think about solving their business challenges and changing the way they currently work in order to improve – and only then look for the way to implement this change.

Note: Stephen wrote an interesting series of posts at that time and promised a revival. However, I haven't seen new posts. Did any of my readers see new material that I missed?

Changing the way to work is the problem, not PLM.

It is not the friendly user interface of PLM system XYZ or the advanced technical capabilities of PLM system ABC that will make a PLM implementation easier. Nothing is solved in the cloud or by using a mobile device. If nothing changes when implementing PLM, why implement and build a system that locks you in even more?

This is what Thomas Schmidt (VP Head of Operational Excellence and IS at ABB's Power Products Division) said last year at PLM Innovation 2012 in Munich. He was one of the keynote speakers and surprised the audience by stating he did not need PLM!

He explained this by describing the business challenges ABB has to solve: being a global company while acting around the world as a local company. He needed product simplification, part reduction across product lines around the world, compliance, and more.

Note: Thomas Schmidt has meanwhile moved forward in his career, identifying himself as an experienced "Change Leader" in digital transformation, and as a mentor and coach.

Another customer, in a completely different industry, mentioned they were looking to improve global instant collaboration, as the current information exchange is too slow and error-prone. Besides, they want to capitalize on the work done and make it accessible and reusable in the future, independent of the authoring tool. However, they do not call it PLM, as in their business nobody uses PLM!

Both cases should make a PLM reseller's mouth water (watertanden in Dutch), as these companies are looking for critical capabilities available in most PLM systems. However, none of these companies asked for a single BOM or a service-oriented architecture. They wanted to solve their business issues. For sure, it will lead to implementing PLM capabilities when business and IT people together define and decide on the right balance.

Unfortunately, here we still see a function-feature approach – if it is not there, we will build it.

Management take responsibility

Combining PLM and new business needs is the responsibility of management in these companies. It is crucial that a business issue (or a new strategy) is the driving force for a PLM implementation. PLM is not about automating what we have.

In too many situations, the management decides that a new strategy is required. One or more bright business leaders decide they need PLM (note: the strategy has now changed into buying and implementing a system). Together with IT, and after an extensive selection process, the selected PLM system (disconnected from the strategy) will be implemented.

I believe we read something about such a case recently.

Moreover, this is the place where all PLM discussions come together:

  • why PLM projects are difficult
  • why it is unclear what PLM does.

PLM vendors and implementers are no longer connected to the strategy or the business at this stage. They implement technology and do what the customer's project team tells them to do (or what they think is best for their business model).

Successful implementations are those where the business and management are actively involved during the whole process and the change. Involvement requires a significant contribution from their side, often delegated to business and change consultants.

PLM implementations usually reach a crisis at some moment in time when the business is not leading and the focus is on IT and user acceptance. In the optimal situation, business drives IT. However, in most cases, due to a lack of time and priorities on the business side, this activity is delegated to IT and the implementation team. So here, success becomes a matter of luck: how experienced is the team?

Will they implement a new business strategy or just automate and implement the way the customer worked before, but now in a digital manner? Do we blame the software when people do not change?

Some notes here: I believe the disconnect between management/PLM vendors on the one side and the people in the business on the other has meanwhile become more prominent, due to the digital transformation hype. The hype is moving faster than the organization. A second point: I will not talk about "people change" anymore – organizations can change, people can adapt within a specific range. It is up to the organization where to push the limits.

 

Back to fun

I would not be so passionate about PLM if it were boring. However, looking back, the fun and enthusiasm do not come from PLM. The fun comes from a proactive business approach, knowing that motivating the people and preparing the change come first, before implementing PLM practices.

I believe the future success of PLM technologies will come when we learn to speak about and address real business value, and only then use (PLM) technologies to deliver it.

PLM becomes a logical result, not the start. And don't underestimate it: change is required. What do you think – is it a dream?


Due to some physical inconvenience in the upcoming weeks, I will not be able to write a full blog post at this time. Typing with one finger is not productive.
A video post could be an alternative; however, for me, the disadvantage of a video message is that it requires the audience to follow all the information at a fixed speed – no fast or selective reading possible, and hard to archive and store in the context of other information. Putting pieces of information in a relevant context is a PLM mission.

So this time, my post from December 2008, where I predicted the future for 2050. I think the predictions were not too bad – you will recognize some trends and challenges still ahead. Some newer comments are in italic green. I am curious to learn what you think after reading this post. Enjoy, and I am looking forward to your feedback.

PLM in 2050

As the year ends (December 2008), I decided to take my crystal ball to see what would happen with PLM in the future.

It felt like a virtual experience and this is what I saw:

  • Data is not replicated any more – every piece of information that exists will have a Unique Universal ID; some people might call it the UUID. In 2020 this initiative became mature, thanks to the merger of some big PLM and ERP vendors, who brought this initiative to reality. This initiative reduced the exchange costs in supply chains dramatically and led to bankruptcy for many companies providing translators and exchange software. (still the dream of a digital enterprise)
  • Companies store their data in "the cloud" based on the previous concept. Only some old-fashioned companies still have their own data storage and exchange issues, as they are afraid someone will touch their data. Analysts compare this behavior with the situation in the year 1950, when people kept their money under a mattress, not trusting banks (and they were not always wrong). (we are getting there – still some years to go)
  • After 3D, an entire virtual world, based on holography, became the next step for product development and understanding of products. Thanks to the revolutionary quantum-3D technology, this concept could be even applied to life sciences. Before ordering a product, customers could first experience and describe their needs in a virtual environment (to be replaced by virtual twin / VR / AR)
  • Finally, the cumbersome keyboard and mouse were replaced by voice and eye recognition.
    Initially, voice recognition (Siri, Alexa, please come to the PLM domain – http://www.youtube.com/watch?v=2Y_Jp6PxsSQ) and eye tracking (some time to go still) were cumbersome. Information was captured by talking to the system and capturing eye movement when analyzing holograms. This made the life of engineers so much easier, as while analyzing and talking, their knowledge was stored and tagged for reuse. No need for designers to send old-fashioned emails or type their design decisions for future reuse (now moving towards AI)

  • Due to the hologram technology, the world became greener. People did not need to travel around the world, and the standard became virtual meetings with global teams (airlines discontinued business class). Even holidays could be experienced in the virtual world, thanks to a Dutch initiative based on the experience with coffee. (not sure why I selected this movie. Sorry ….)
    http://www.youtube.com/watch?v=HUqWaOi8lYQ
    The whole IT infrastructure was powered by efficient solar energy, reducing the amount of carbon dioxide dramatically
  • Then, with a shock, I noticed PLM no longer existed. Companies were focusing on their core business processes. Systems/terms like PLM, ERP, and CRM no longer existed. Some older people still remembered the battle between these systems to own the data and the political discomfort this gave inside companies (so true …)
  • As people were working so efficiently, there was no need to work all week. There were community time slots when everyone was active, but 50 percent of the time, people had time to recreate (to re-create or to recreate, that was the question). Some older French and German designers remembered the days when they had only 10 weeks of holiday per year, unimaginable nowadays. (the dream remains)

As we still have more than 40 years to reach this future, I wish you all a successful and excellent 2009.

I am looking forward to being part of the green future next year.

I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the "dark days," and the celebration of new life.

Of course, my PLM-twisted brain never stops associating and looking for an analogy. Last week I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:

"Digital Transformation – it used to be called PLM," said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.

Was this my Easter egg surprise? I thought we were in the middle of the PLM Renaissance, as some other vendors and consultants call this era. Have a look at a recent Engineering.com TV report: Turning PLM on its head

All jokes aside, the speech from Peter Schroer contained some interesting statements, and I want to elaborate on them in this post, as the comment space on LinkedIn is not designed for a long answer.

PLM is Digital Transformation?

In the past few years, there has been a discussion about whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber's 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC's vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.

Also during the PDT conferences, we discussed whether PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.

For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at this moment is the one from caterpillar to butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question is even: will there be a full metamorphosis in the end, or will we keep on working in two different modes of operation?

However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future: PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditional PLM-mapped capabilities.

You can already see these differences by taking an in-depth look at how Oracle, SAP, or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM vendors. So Digital Transformation is not the replacement of PLM.

Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation

Myth #1: DX leverages disruptive tech

Peter Schroer:

 It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers

Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing data that is accessible along the whole lifecycle with the right quality.

This means you need to have a management team that understands the need for data governance and data quality, and that understands the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy, and execution of the strategy – marketing does not make it happen.

 

Myth #2: DX results in increased market share, revenue, and profit

Peter Schroer:

Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.

Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and thereby increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven't transformed their PLM domain, but other parts of their businesses.

Also, it is interesting to read a 2017 McKinsey post, The case for digital reinvention, where you will find confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. It is interesting to see where companies focus their digital strategies – picture below:

While only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners: connecting ecosystems (in real time).

Myth #3: Market leaders are the most successful at DX

Peter Schroer:

If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continuing to shrink below 20 years.

How to Attain Sustainable Digital Transformation

– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.

– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.

– Align people & processes.  Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.

Here I agree with Peter on the business side, be it that some of the current market leaders are already digital – look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.

Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them. Learning to work in a different mode.

And this is extremely difficult for a market leader – as a market leader, the fear of a changing outside world is often not felt. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.

I wrote about the middle management challenge in my last blog post: The Middle Management dilemma. Almost in the same week, there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
The conclusion: it is not (only) about technology, as some of the tech geeks may think.

Conclusion

Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing. Probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insight into all aspects. I hope this post gave you some more ideas. Please read the connected articles – they are quite relevant if you want to discover what's below the surface.
