For me, the joint conference from CIMdata and Eurostep is always a conference to look forward to. The conference is not as massive as the PLM vendor conferences (slick presentations and happy faces); it is more a collection of PLM practitioners (this time 100+) with the intent to discuss and share their understanding and challenges, independent of specific vendor capabilities or features. And because of its size, it is a great place to network with everyone.

Day 1 offered more of a business/methodology view on PLM, while Day 2 went more in-depth, focusing on standards and BIM. In this post, the highlights from the first day.

The State of PLM

 

Peter Bilello, CIMdata's president, kicked off with a review of the current state of the PLM industry. Peter mentioned that the PLM market grew by 9.4 % to $47.8 billion (more than the expected 7 %). Good for the PLM vendors and implementers.

However, Peter also mentioned that despite higher spending, PLM is still considered a solution for engineering, often implemented as PDM/CAD data management. Traditional organizational structures – marketing, engineering, manufacturing, quality – were defined in the previous century and are still measured as such.

This traditional approach blocks the roll-out of PLM across these disciplines. Who is the owner of PLM, and where does the responsibility for a certain dataset lie? These are questions to solve. PLM needs to transform to deliver end-to-end support instead of remaining an engineering silo. Will we still be talking about PLM in the future? See Peter's takeaways below:

 

I do not want to open the discussion on whether the name PLM should change – there have been too many debates, and unfortunately too much framing in the past as well.

The Multi-View BOM

 

Fred Feru from Airbus presented the status of a topic the Aerospace & Defense PLM Action Group is working on: how to improve and standardize a PLM solution for multi-view BOM management, in particular the interaction between the EBOM and MBOM. See below:

 

You might think this topic is already solved when you speak with your PLM vendor. However, all existing solutions at the participating companies rely on customizations and vary per company. The target is to come up with common requirements that need to be addressed in a standard methodology. Initial alignment on terminology was already a required first step: before you standardize, you need a common dictionary. This is, by the way, a typical situation in EVERY PLM implementation.

 

An initial version was shared with the PLM editors for feedback; after iterations and agreement, the aim is to come up with a solution that can be implemented without customization. If you are interested in the details, you can read the current status here, with Appendix A and Appendix B.

 

Enabling the Circular Economy for Long Term Prosperity

Graham Aid gave a fascinating presentation related to the potential and the flaws of creating a circular economy. Graham was not a PLM expert (at least until he left this conference); he is the Strategy and Innovation Coordinator for the Ragn-Sells Group, which performs environmental services and recycling across Sweden, Norway, Denmark, and Estonia. Have a look at their website here.

 

Graham shared with us that despite the logical arguments for a circular economy – it is more profitable in the end – our short-term thinking and biases block us from doing the right things for future generations.

Look at the missing link for a closed resource-lifecycle view below.

Graham shared strange examples where materials that will be scarce in the future are currently getting cheaper, and therefore there is no desire to recycle them. A sound barrier filled with rubble could contain more copper than the copper ore in a mine.

In the PLM domain, there is also an opportunity to support and work on more sustainable products and services. It is a mindset and can be a profitable business model. At the PDT 2014 conference, there was a session on circular product development, with Xerox as the best example. Both circular product development and Product as a Service are activities that can contribute to a more sustainable world. Graham's presentation was inspiring for our PLM community and hopefully planted a few seeds for the future, as it is all about thinking long-term.

 

With the PLM Green Alliance, I hope we will be able to create a larger audience and participation for a sustainable future. More about the PLM Green Alliance next week.

 

The Fundamental Role of PLM in Data-driven Product Portfolio Management

 

Hannu Hannila (Polar) presented his study related to data-driven product portfolio management and why it should be connected to PLM. For many companies, it is a challenge to understand which products are performing well and where to invest. These choices are often supported by "Data Damagement," as Hannu called it.

An example below:

The result of this fragmented approach is that organizations make their decisions on subjective data and emotions. Where the assumption is that 20 % of the products a company sells relate to 80 % of the revenue, Hannu found in his research companies where only 10 % of the products were contributing to the revenue. PPM (Product Portfolio Management) is often based on big emotions – a who-shouts-the-loudest mentality, influenced by the company's pet products and by the HIPPO (HIghest Paid Person in the Office). So how do we get a better rationale?
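As a thought experiment of what such a rationale could look like – my own illustration with invented revenue figures, not Hannu's actual method – a first data-driven step is simply measuring how concentrated revenue really is across the portfolio:

```python
# Minimal sketch: how many products generate 80% of the revenue?
# The figures are invented for illustration; in a real analysis they would
# come from connected ERP/CRM/PLM sources with proper data governance.
revenue_per_product = {
    "P-001": 410_000, "P-002": 250_000, "P-003": 90_000,
    "P-004": 30_000, "P-005": 12_000, "P-006": 8_000,
}

total = sum(revenue_per_product.values())
ranked = sorted(revenue_per_product.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
for count, (product, revenue) in enumerate(ranked, start=1):
    cumulative += revenue
    if cumulative / total >= 0.8:
        share = count / len(revenue_per_product)
        print(f"{count} products ({share:.0%} of the portfolio) "
              f"generate 80% of the revenue")
        break
```

Trivial as it looks, most companies cannot run even this analysis reliably, because the product and revenue data live in disconnected systems.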

 

Hannu explained a data-driven framework that would provide the right analytics at management level, depending on overall data governance across all disciplines and systems. See below:

I liked Hannu's conclusions, as they align with my findings:

  • To be data-driven, you need Master Data Management and Data Governance
  • Product Portfolio Management is the driving discipline for PLM, and in a modern digital enterprise, it should be connected.

Sponsor sessions

Sponsors are always needed to keep a conference affordable for the attendees. The sponsor sessions on day 1 were of good quality. Here is a quick overview and a link if you want to investigate further.

 

Configit – explaining the value of a configurator that connects marketing, technical and sales, introducing CLM (Configuration Lifecycle Management) – a new TLA

 

Aras – explaining their view on what we consider the digital thread

 

Variantum – explaining their CPQ solution as part of a larger suite of cloud offerings

 

Quick Release – bringing common sense to PLM implementations, similar to what I am doing as a PLM coach – focusing on the flow of information

 

SAP – explaining the change in focus when a company moves toward a product as a service model

 

SharePLM – A unique company addressing the importance of PLM training delivered through eLearning

Conclusion

CIMdata Roadmap was an easy-to-digest conference with good-quality presentations. I only shared 50 % of the sessions, as we have already reached 1000+ words. In the evening, I enjoyed the joint dinner, being able to network and discuss in depth with participants, and finished with a social networking event organized by SharePLM. Next week part 2 – the PDT part of the conference.

In recent years, more and more PLM customers have approached me with questions related to the usage of product information for downstream publishing. To be fair, this is not my area of expertise at the moment. However, with the mindset of a connected enterprise, this topic will come up.

For that reason, I have a strategic partnership with Squadra, a Dutch-based company, providing the same coaching model as TacIT; however, they have their roots in PIM and MDM.

Together, we believe we can deliver a meaningful answer to the question: what are the complementary roles of PLM and PIM? In this post, our first joint introduction.

Note: The topic is not new. Already in 2005, Jim Brown from Tech-Clarity published a white paper: The Complementary Roles of PIM and PLM – all this before digitization and connectivity became massive.

Let's start with the abbreviations, the TLAs (Three-Letter Acronyms), and their related domains:

PLM – level 1
(Product Lifecycle Management – push)

For PLM, I want to stay close to the current definitions. It is the strategic approach to provide a governance infrastructure for delivering a product to the market, starting from the early concept phase till manufacturing and, in its extended definition, also covering the operational phase.
The focus of PLM is to reduce time to market while ensuring quality, cost, and delivery through an increasingly virtual product definition, making it possible to decide upfront on the best design choices and the manufacturing options with the lowest cost. In the retail world, own-brand products are creating a need for PLM.

The image above nicely summarizes the expected benefits of a traditional PLM implementation.

 

MDM (Master Data Management)

When product data is shared in an enterprise among multiple systems, there is a need for Master Data Management (MDM). Master Data Management focuses on a governance approach ensuring that information stored in various systems has the same meaning and shares the same values where relevant.

MDM guards and streamlines the way master data is entered, processed, guarded, and changed within the company, resulting in one single version of the truth and enabling different departments and systems to stay synced regarding their crucial data.
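As a minimal illustration of what such governance means in practice – my own sketch with invented attribute names, not a reference implementation – an MDM-style check compares the values of shared product attributes across systems:

```python
# Minimal sketch of an MDM-style consistency check (invented attribute names).
# Each system holds its own copy of shared product attributes; governance
# means detecting where the "single version of the truth" has drifted apart.
plm = {"P-001": {"name": "Pump X200", "weight_kg": 12.4, "status": "Released"}}
erp = {"P-001": {"name": "Pump X200", "weight_kg": 12.7, "status": "Released"}}
pim = {"P-001": {"name": "Pump X-200", "weight_kg": 12.4, "status": "Released"}}

def find_conflicts(product_id, **systems):
    """Report attributes whose values differ between the connected systems."""
    conflicts = {}
    attributes = set().union(*(records[product_id] for records in systems.values()))
    for attr in attributes:
        values = {name: records[product_id].get(attr)
                  for name, records in systems.items()}
        if len(set(values.values())) > 1:
            conflicts[attr] = values
    return conflicts

# 'weight_kg' and 'name' differ between the systems - which one is the truth?
print(find_conflicts("P-001", plm=plm, erp=erp, pim=pim))
```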

Interestingly, in the not-so-digital world of PLM, you do not see PLM vendors working on an MDM approach. They do not care about an end-to-end connected strategy yet. I wrote about this topic in 2017 here: Master Data Management and PLM.

PIM (Product Information Management)

The need for PIM starts to become evident when selling products through various business channels. If you are a specialized machine manufacturer, your product information for potential customers might be very basic and based on a few highlights.

However, due to digitization and global connectivity, it has become crucial that product information is available in real time, wherever your customers are in the world.

In a competitive world, with an omnichannel strategy, you cannot survive without having your PIM streamlined and managed.

 

Product Innovation Platforms (PLM – Level 2 – Pull)

With the introduction of Product Innovation Platforms, as described by CIMdata and Gartner, the borders of PLM, PIM, and MDM might become vague, as they might all be part of the same platform, reducing the immediate need for a separate MDM environment. For example, companies like Propel, Stibo, and Oracle are building a joint PLM-PIM portfolio.

Let's dive deeper into the two scenarios we meet most in business: PLM driving PIM (my comfort zone) and PIM driving the need for PLM (Squadra's area of expertise).

PLM driving PIM

Traditionally PLM (Product Lifecycle Management) has been focusing on several aspects of the product lifecycle. Here is an excellent definition for traditional PLM:

PLM is a collection of best practices, dependent per industry to increase product revenue, reduce product-related costs and maximize the value of the product portfolio  (source 2PLM)

This definition shows that PLM is a business strategy, not necessarily a system, but an infrastructure/approach to:

  • ensure shorter time to market with the right quality (increasing product revenue)
  • work efficiently (reducing product-related costs – resources and scrap)
  • deliver products that bring the best market revenue (maximize the value of the product portfolio)

The information handled by traditional PLM consists mostly of design data, i.e., specifications, manufacturing drawings, 3D models, and Bills of Materials (physical part definitions), combined with version and revision management. In elaborate environments, this is combined with processes supporting configuration management.

PLM data is more focused on internal processes and quality than on targeting the company's customers. Sometimes the 3D design data is used as a base to create lightweight 3D graphics for quotations and catalogs, combining it with relevant sales data. Traditionally, marketing represented the voice of the customer.

PLM implementations are more and more providing an enterprise backbone for product data. As a result of this expansion, there is a wish to support sales and catalogs more efficiently, sharing master data from creation till publishing and combining the product portfolio with sales and service information in a digital way.

In particular, due to globalization, there was a need to make information globally available in different languages, without a significant overhead of resources to manage the data or to manage the disconnect from the real product data.

Companies that have realized the need for connected data understood that product master data management is more than only the engineering/manufacturing view; it is also relevant to the sales and services view. Historically, this was done as a customized extension of the PLM system; now, companies are more and more interfacing with specialized PIM systems. Proprietary PLM-PIM interfaces exist; hopefully, with digital transformation, a more standardized approach will appear.
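To illustrate what such an interface typically carries – a hedged sketch with invented field names, not any vendor's actual API – a PLM-to-PIM handoff could publish the sales-relevant subset of the product master at release time:

```python
import json

# Sketch of a PLM-to-PIM handoff at product release (invented field names,
# not any vendor's actual interface). PLM owns the engineering truth; PIM
# receives the sales-relevant subset, to be enriched with channel content.
plm_record = {
    "item_id": "P-001",
    "revision": "B",
    "status": "Released",
    "name": "Pump X200",
    "weight_kg": 12.4,
    "material": "Stainless steel 316",
    "cad_model": "pump_x200_revB.step",  # engineering-only, not published
}

SALES_RELEVANT = {"item_id", "revision", "name", "weight_kg", "material"}

def to_pim_payload(record):
    """Publish only released data, filtered down to what PIM needs."""
    if record["status"] != "Released":
        raise ValueError("only released data may be published downstream")
    return {key: value for key, value in record.items() if key in SALES_RELEVANT}

print(json.dumps(to_pim_payload(plm_record), indent=2))
```

The interesting design questions are exactly the ones mentioned above: which attributes cross the border, and who governs them once they live in both systems.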

 

PIM driving the need for PLM

Because of changes in the retail market, the need for information in the publishing processes is also changing. Retailers also need to comply with new rules and legislation. The source of the required product information is often in the design process of the product.

In parallel, there is an ongoing market trend towards more and more private-label products in the (wholesale and retail) assortments. This means a growing number of retailers and wholesalers will become producers and will have their own ideation and innovation processes.

A good example is ingredient and recipe information in the food retail sector. This information now needs to be provided by suppliers or by the own-brand department that owns the design process of the product – similar to RoHS or REACH compliance in industry.

Retail and wholesale companies can tackle own brands reasonably well with their PIM systems (or Excel files), making use of workflows and product statuses. However, over the years, the information demands have increased, and a need for more sophisticated lifecycle management has emerged – and therefore the need for PLM (in this case, PLM also stands for Private Label Management).

The image below illustrates a PLM layer and a PIM layer, both leading towards rich product information for the end users (either B2B or B2C).

In the fast-moving consumer goods (FMCG) world, most innovative products are coming from manufacturers. They have pipelines with lots of ideas resulting in a limited number of sellable products. In the Wholesale and Retail business, the Private Label development process usually has a smaller funnel but a high pressure on time to market, therefore, a higher need for efficiency in the product data chain.

Technological changes, like 3D Printing, also change the information requirements in the retail and wholesale sectors. 3D printing can be used for creating spare parts on-demand, therefore changing the information flow in processes dramatically. Technical drawings and models that were created in the design process, used for mass production, are now needed in the retail process closer to the end customer.

These examples make it clear that more and more information is needed for publication in the sales process and therefore needs to be present in PIM systems. This information needs to be collected and made available during the PLM release process. A seamless connection between the product release and sales processes will support the changing requirements and will reduce errors and rework on data.

PLM and PIM are two practices that need to go hand in hand like a relay baton in athletics. Companies that are using both tools must also organize themselves in a way that processes are integrated, and data governance is in place to keep things running smoothly.

 

Conclusion

Market changes and digital transformation force us to work in value streams along the whole product lifecycle, ensuring quality and time to market. PLM and PIM will be connected domains in the future, enabling a smooth product go-to-market. Important is the use of data standards (PLM and PIM should speak a common language) – preferably based on industry standards, so that cross-company communication on product data is possible.

What do you think? Do you see PLM and PIM coming together in your business too?

Please share in the comments.

The usage of standards has been a recurring topic over the past 10 months; it probably came back to the surface at PI PLMx Chicago during the PLM Leaders panel discussion. If you want to refresh the debate, Oleg Shilovitsky posted an overview: What vendors are thinking about PLM standards – Aras, Dassault Systemes, Onshape, Oracle PLM, Propel PLM, SAP, Siemens PLM.

It is clear to vendors that actively supporting standards reduces their competitive advantage; after all, you are opening your systems to connect to other vendors' solutions, reducing the chance to sell adjacent functionality. The closed alternative is what we call vendor lock-in. If you think this approach only applies to PLM, I would suggest you open your Apple (iPhone) and think about vendor lock-in for a moment.

Vendors will only adhere to standards when pushed by their customers, and that is why we have a wide variety of standards in the engineering domain.

Take the example of JT as a standard viewing format, heavily pushed by Siemens so the German automotive industry could work downstream with CATIA and NX models. There was a JT version (v9.5) that reached ISO 14306 alignment, but after that, Siemens changed JT (v10) again to optimize their own exchange scenarios, and the standard was lost.

And as customers did not complain (too much), the divergence continued. So it is clear vendors will not maintain standards out of charity – your business does not run on charity either (or does it?). So I do not blame them if there is no push from their customers to maintain standards.

What about standards?

The discussion related to standards flared up around the IpX ConX19 conference and a debate between Oleg & Hakan Kardan (EuroSTEP), where Hakan suggested that PLCS could be a standard data model for the digital thread – you can read Oleg's view here: Do we need a standard like PLCS to build a digital thread.

Oleg's opening sentence made me immediately stop reading, as more and more I am tired of this type of framing when you want a serious discussion based on arguments. In politics, in particular, we see the bad examples of framing.

Standards are like toothbrushes, a good idea, but no one wants to use anyone else’s. The history of engineering and manufacturing software is full of stories about standards.

This opening sentence says it all about the mindset related to standards – it is a one-liner, not a fact. It could have been a tweet in this society of experts.

Still, later I read the blog post and learned Oleg has no arguments to depreciate PLCS; however, as he does not know the details, he will probably not use it. That is the main challenge of standards: you need to spend time to understand them and agree on following them. Otherwise, you get the same divergence as with JT, or similar examples.

However, I might have been wrong in my conclusion, as Oleg did some thinking on a Sunday and came up with an excellent post: What would happen if PLM Vendors agree about data standards. Here, Oleg makes the comparison with a standard in the digital world, established by Google, Microsoft, Yahoo, and Yandex: Schema.org: Evolution of Structured Data on the Web.
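For readers unfamiliar with schema.org: it defines a shared vocabulary so that any website can describe, for example, a product in a way every search engine understands. A minimal sketch, with invented product values:

```python
import json

# Minimal sketch of schema.org structured data for a product (invented values).
# Because the major search engines agreed on this shared vocabulary, any site
# can publish product data once and be understood everywhere - exactly the
# kind of semantic alignment PLM data exchange still lacks.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Pump X200",
    "sku": "P-001",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "1299.00",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

# On a web page this would be embedded as:
# <script type="application/ld+json"> ... </script>
print(json.dumps(product_markup, indent=2))
```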

There is a need for semantic mapping and understanding in the day-to-day world, and this understanding makes you realize the same is needed for PLM. That was one of the reasons why I wrote, back in 2015, a series of posts related to the importance of a PLM data model.

All these posts aimed to help companies and implementers make the right choices for an item-centric PLM implementation. At that time – 2015 – item-centric was the current PLM best practice. I learned from my engagements in the past 15 years, in particular with flexible modeling tools like SmarTeam or nowadays Aras, that making the right data model decisions is crucial for future growth.

Who needs standards?

First of all, as long as you stay in your controlled environment, you do not need standards. In particular in the Aerospace and Automotive industries, the OEMs defined the software versions to be used, and the supply chain had to adhere to their chosen formats. Even this narrow definition was not complete enough, as a 3D CAD model needed to be exported for simulation or manufacturing purposes, and no single vendor was working on a single CAD model definition for all these purposes at that time. So the need for standards emerged because there was a need to exchange data.

Data exchange is the driving force behind standards.

In a second stage, neutral-format data storage also became an important point – how do you save an aircraft definition for 75 years?

Oil & Gas / Building – Construction

These two industries both had a need for standards. The Oil & Gas industry relies on EPC (Engineering / Procurement / Construction) companies that build plants or platforms. Then the owner/operator takes over the operation and needs a hand-over of all the relevant information. However, if this information were delivered in the application-specific formats the EPC companies have used, the owner/operator would require various software environments and skills just to have access to the data.

Therefore, if the data is delivered in a standard format (ISO 15926) and the exchange follows CFIHOS (Capital Facilities Information Hand Over Specification), this exchange can be automated to a large degree between the EPC and owner/operator environments, leading to a lower overall cost of delivering and maintaining the information, combined with higher quality. For that reason, the Oil & Gas industry has invested in standards for a long time already, as their plants/platforms have a long lifecycle.

And the same is happening in the construction industry. Initially, Autodesk and Bentley were fighting to become the vendor standard; ultimately, the IFC standard has taken a lot from the Autodesk world but has become a neutral standard for all parties involved in a construction project to share and exchange data. In particular for the construction industry, the cloud has been an accelerator for collaboration.

So standards are needed where companies/people exchange information.

For the same reason, in most global companies, English became the standard language. If you needed to learn all the languages spoken in a worldwide organization, you would not have time for business. Therefore, everyone making some effort to communicate in one standard language is the best way to operate.

And the same holds for a future data-driven environment – we cannot afford for every exchange to convert to the native format of the receiver or the source. Common neutral (or winning) standards will ultimately also come up in the world of manufacturing data exchange and IoT.

Companies need to push

This is probably the blocking issue for standards. Developing and using standards requires an effort without immediate ROI. So why not use vendor formats/models and create a custom point-to-point interface, as we only need one or two interfaces? Companies delivering products with a long lifecycle know that the current data formats are not guaranteed for the future, so they push for standards (aerospace/defense, oil & gas, construction, infrastructure).

3D PDF Model

Other companies are looking for short-term results, and standards slow them down. However, as soon as they need to exchange data with their ecosystem (suppliers/customers), an existing standard will make their business more scalable. The lack of standards is one of the inhibitors for Model-Based Definition or the Model-Based Enterprise – see also my post on this topic: Model-Based – Connecting Engineering and Manufacturing.

When we imagine the digital enterprise of the future, information will be connected through data streams and models. In a digital enterprise, file conversions and proprietary formats will impede the flow of data and create non-value-added work. For example, if we look at current "Digital Twin" concepts, the 3D representation of the twin is recreated again instead of building on neutral 3D model continuity. This is because companies currently work in a coordinated manner. In perhaps 10 years from now, we will reach the maturity of a model-based enterprise, which can only exist based on standards. Whether the standards will be based on one dominating platform or on a merger of standards remains the question.

To discuss this question, and how to bridge from the past to the future, I am looking forward to meeting you at the upcoming PLM Roadmap & PDT 2019 EMEA conference on 13-14 November in Paris, France. Download the program here: PLM for Professionals – Product Lifecycle Innovation.

Conclusion

I believe PLM standards will emerge when building and optimizing a digital enterprise. We need to keep pushing and actively working for meaningful standards, as they are crucial to avoid a lock-in of your data, which potentially creates dead ends and massive inefficiencies. The future is about connected ecosystems, and the leanest companies will survive. Standards do not need to be extraordinarily well-defined; they can start from a high-level alignment, as we saw with schema.org. Keep investing in and contributing to standards and the related discussions to create a shared learning path.

Thanks, Oleg Shilovitsky, for keeping the topic alive.

P.S. I did not have time yet to read and process your PLM Data Commoditization post.

 

Last week I read Verdi Ogewell's article PTC puts the Needle to the Digital Thread on Engineering.com, where Verdi raised the question (and concluded) who the most visionary PLM CEO is – Bernard Charles from Dassault Systèmes or Jim Heppelmann from PTC. Unfortunately, it is again an advertorial, creating more haziness around modern PLM than adding value.

People need education, and Engineering.com is/was a respected site for me. As they state in their Engineering.com/about statement:

Valuable Content for Busy Engineers. Engineering.com was founded on the simple mission to help engineers be better.

Unfortunately, this is no longer the case in the PLM domain. In June, we saw an article related to the failed PLM migration at Ericsson – see The PLM migration dilemma. Besides the fact that a big-bang migration had failed at Ericsson, the majority of the article was based on rumors and suggestions, putting the sponsor of this article in a better perspective.

Of course, Engineering.com needs sponsoring to host their content, and vendors are willing to spend marketing money on that. However, it would be fairer to mention in a footnote who sponsored the article – although you can guess it per article. Some more sincere editors or bloggers mention the sponsoring that might have influenced their opinion.

Now, why did the article PTC puts the Needle to the Digital Thread make me react?

Does a visionary CEO pay off?

It can be great to have a visionary CEO; however, do they make the company and its products/services more successful? For every successful visionary CEO, there are perhaps ten failing visionary CEOs whose vision the stock market or their customers did not catch.

There is no lack of PLM vision, as Peter Bilello mapped out in 2014 when imagining the gaps between vision, available technology, and implementations at companies (leaders and followers). See below:

The tremendous gap between vision and implementations is the topic that concerns me the most. Modern PLM is about making data available across the enterprise or even across the company’s ecosystem. It is about data democratization that allows information to flow and to be presented in context, without the need to recreate this information again.

And here the marketing starts. Verdi writes:

PTC’s Internet of Things (IoT), Industrial Internet of Things (IIoT), digital twin and augmented reality (AR) investments, as well as the collaboration with Rockwell Automation in the factory automation arena, have definitely placed the company in a leading position in digital product realization, distribution and aftermarket services

With this marketing sentence, we are eager to learn why:

“With AR, for example, we can improve the quality control of the engines,” added Volvo Group’s Bertrand Felix, during an on-stage interview by Jim Heppelmann. Heppelmann then went down to a Volvo truck with the engine lifted out of its compartment. Using a tablet, he was able to show how the software identified the individual engine, the parts that were included, and he could also pick up the 3D models of each component and at the same time check that everything was included and in the right place.

Impressive – is it real?

The point is that this is the whole chain for digital product realization–development and manufacturing–that the Volvo Group has chosen to focus on. Sub-components have been set up that will build the chain, much is still in the pilot stage, and a lot remains to be done. But there is a plan, and the steps forward are imminent.

OK, so it is a pilot, and a lot remains to be done – but there is a plan. I am curious about the details of that plan, as a little later, we learn from the CAD story:

The Pro/ENGINEER "inheritor" Creo (engine, chassis) is mainly used for CAD and creation of digital twins, but as previously noted, Dassault Systèmes' CATIA is also still used. Just as in many other large industrial organizations, Autodesk's AutoCAD is also represented for simpler design solutions.

There goes the efficient digital dream. Design data coming from CATIA needs to be recreated in Creo for digital twin support. Data conversion or recreation is an expensive exercise and needs to be reliable and affordable as the value of the digital twin is gone once the data is incorrect.

In a digital enterprise, you do not want silos to work with their own formats, you want a digital thread based on (neutral) models that share metadata/parameters from design to service.

So I dropped the article and noticed Oleg had already commented faster than me in his post: Does PLM industry need a visionary pageant? Oleg also refers to CIMdata, who confirmed in 2018 that the concept of a platform for product innovation (PIP), or the "beyond PLM," is far from reality in companies. Most of the time, a PLM implementation is mainly a "beyond PDM" environment, not really delivering product data downstream.

I am wholly aligned with Oleg’s  technical conclusion:

What is my(Oleg’s) conclusion? PLM industry doesn’t need another round of visionary pageants. I’d call democratization, downstream usage and openness as biggest challenges and opportunities in PLM applications. Recent decades of platform development demonstrated the important role network platforms played in the development of global systems and services. PLM paradigm change from isolated vertical platforms to open network services required to bring PLM to the next level. Just my thoughts..

My comments to Oleg’s post:

(Jos) I fully agree we do not need more visionary PLM pageants. It is not about technology and therefore I have to disagree with your point about Aras. You call it democratization and openness of data a crucial point – and here I agree – be it that we probably disagree about how to reach this – through standards or through more technology. My main point to be made (this post ) is that we need visionary companies that implement and rethink their processes and are willing to invest resources in that effort. Most digital transformation projects related to PLM fail because the existing status quo/ middle management has no incentive to change. More thoughts to come

And this is the central part of my argumentation – it is not about technology (only).

Organizational structures are blocking digital transformation

Since 2014, I have been following several larger manufacturing companies on their path from pushing products to the market in a linear mode towards becoming a customer-driven, more agile, fast-responding enterprise. As this is done by taking advantage of digital technologies, we call this process digital transformation.

(image depicting GE’s digital thread)

What I have learned from these larger enterprises – take both Volvo Trucks and GE as examples – is that there is a vision for an end result. For GE, it is the virtual twin of their engines, monitored and improved by their Predix platform. For Volvo Trucks, we saw the vision in the quote from Verdi's article above.

However, these companies are failing to create a horizontal mindset inside their organizations. Data can only be used efficiently downstream if there is a willingness to collect the relevant data upstream and deliver this information in an accessible format, preferably data-driven.

The Middle Management Dilemma

And this leads to my reference to middle management. Middle managers learn about the C-level vision and are pushed to make this vision happen. However, they are measured and driven to solve these demands mainly within their own division or discipline. Yes, they might create goodwill for others, but when it comes to money spent or changing people's responsibilities, the status quo will remain.

I wrote about this challenge in The Middle Management dilemma. Digital transformation, of course, is enabled by digital technologies, but that does not mean the technology creates the transformation. The crucial part lies in making companies more flexible in their operations while establishing better and new contacts with customers.

It is interesting to see that the future of business lies in agile, multidisciplinary teams that can deliver incremental innovations to the company's portfolio – somehow going back to the startup culture inside a larger enterprise. Having worked with several startups, you see the focus on outcomes as a whole in the beginning – everyone contributes. Then, when the company grows, middle management is introduced, and most likely silos are created, as middle management gets its own profit & loss targets.

Digital Transformation myths debunked

This week, Helmut Romer (thanks, Helmut) pointed me to the following HBR article: Digital does not need to be disruptive, where the following myths are debunked:

  1. Myth: Digital requires radical disruption of the value proposition.
    Reality: It usually means using digital tools to better serve the known customer need.
  2. Myth: Digital will replace physical.
    Reality: It is a "both/and."
  3. Myth: Digital involves buying start-ups.
    Reality: It involves protecting start-ups.
  4. Myth: Digital is about technology.
    Reality: It's about the customer.
  5. Myth: Digital requires overhauling legacy systems.
    Reality: It's more often about incremental bridging.

If you want to understand these five debunked myths, take your time to read the full article; it is very much aligned with my argumentation, albeit that my focus is more on the PLM domain.

Conclusions

Vendor sponsoring at Engineering.com has not improved the quality of its PLM articles and creates misleading messages, especially as the sponsor is not mentioned while the sponsor is selling technology – the gap between vision and reality is too big to compete on vision alone.

Transforming companies to take advantage of new technologies requires an end-to-end vision and a mindset based on achievable, incremental learning steps. The way your middle management is managed and measured needs to be reworked, as the focus should be on horizontal flow and an understanding of customer/market-oriented processes.

 

This is the moment of the year when, at least in my region, most people take some time off to disconnect from their day-to-day business. For me, it is never a full disconnect, as PLM became my passion, and you should never switch off your passion.

On August 1st, 1999, I started my company TacIT, the same year the acronym PLM was born. I wanted to focus on knowledge management, therefore the name TacIT. Being dragged into the SmarTeam world, with a unique position interfacing between R&D, implementers, and customers, I found the unique sweet spot, helping me to see all aspects of PLM – the vendor position, the implementer's view, and the customer's end-user and management views.

It has been, and still is, 20 years of learning, most of which I have been sharing through my blog over the past ten years. What I have learned is that the more you know, the more you understand that situations are not black and white. See one of my favorite blog pictures below.

So there is enough to think over during the holidays. Some of my upcoming points:

From coordinated to connected

Instead of using the over-hyped term Digital Transformation, I believe companies should learn to work in a connected mode, which has become the standard in our daily life. Connected means that information needs to be stored in databases somewhere, combined with openness and standards to make the data accessible. For more transactional environments, like CRM, MES, and ERP, the connected mode is not new.

In the domain of product development and selling, we still have a long learning path to go, as the majority of organizations rely on documents, be it Excel files, drawings (PDF), or reports. The fact that these are stored in electronic file formats does not mean they are accessible. Manpower is still needed to create these artifacts or to extract the required information from them.

The challenge for modern PLM is to establish new best practices around a model-based approach for systems engineering (MBSE), for engineering to manufacturing (MBD/MBE), and for operations (digital twins). All these best practices should ultimately be generic and connected. I wrote about these topics in the past – have a look at my earlier posts.

PLM vendors are showing pieces of the puzzle, but it is up to the implementers to assemble the puzzle, without knowing in detail what the end result will be. This is the same journey as Columbus's. He had a boat and a target towards the unknown. He discovered a country with a small population, nowadays a country full of immigrants who call themselves natives.

However, the result was an impressive transformation.

Reading about transformation

Last year I read several books to get more insight into what motivates us and how we can motivate people to change. In one way, it is disappointing to learn that we civilized human beings most of the time do not make rational decisions but act based on our prehistoric brain.

 

Thinking, Fast and Slow by Daniel Kahneman was one of the first books in that direction, a must-read to understand our personal thinking and decision processes.

I read Idiot Brain: What Your Head Is Really Up To by Dean Burnett, where he explains how our brain appears to be sabotaging our life and what on earth it is really up to. Interesting to read, but it could have been a little more comprehensive.

 

I got more excited by Dan Ariely's book Predictably Irrational: The Hidden Forces That Shape Our Decisions, as it is structured around topics where we behave completely irrationally but predictably. And this predictability is used by people (sales/politicians/management) to drive your actions. Useful to realize when you recognize the situation.

 

These three books also illustrate the flaws of our modern time – we communicate fast (preferably through tweets), and we decide fast based on our gut feelings – so you realize what kind of world we are heading towards. Going through a transformation should be considered a slow learning process. Like reading a book – it takes time to digest.

Once you are aiming at a business transformation for your company or supporting a company in its transformation, the following books were insightful:

Leading Digital: Turning Technology into Business Transformation by George Westerman, Didier Bonnet, and Andrew McAfee is maybe not the most inspiring book; however, as it stays close to what we experience in our day-to-day life, it is certainly a book to read to get a foundational understanding of business transformation.

 

The book I liked most recently was Leading Transformation: How to Take Charge of Your Company's Future by Nathan Furr, Kyle Nel, and Thomas Zoega Ramsoy, as it gives examples of transformations addressing parts of the irrational brain to build a transformation story. I believe in storytelling instead of business cases for transformation. I wrote about it in my blog post PLM Measurable or a myth, referring to Yuval Harari's book Homo Sapiens.

Note: I am starting my holidays now with a small basket of e-books. If you have any recommendations for must-read books, please write them in the comments of this blog.

Discussing transformation

After the summer holidays, I plan to have fruitful discussions around topics close to PLM. I am working on a post to start a conversation related to PLM, PIM, and Master Data Management – the borders between these domains are perhaps getting vaguer in a digital enterprise.

Further, I am looking forward to a discussion around the value of PLM assisting companies in developing sustainable products. A sustainable and probably circular economy is required to keep this earth a place to live for everybody. The whole discussion around climate change, however, is worrying as we should be Thinking – not fast and slow – but balanced.

The circular economy has been a topic several times during the joint CIMdata PLM Roadmap and PDT conferences, which brings me to the final point.

On 13-14 November this year, I will again participate in the upcoming PLM Roadmap and PDT conference, this time in La Defense, Paris, France. I will share my experiences from working with companies trying to understand and implement pieces of a digital transformation related to PLM.

There will be inspiring presentations from other speakers, all working on aspects of moving towards a connected enterprise. It is not a marketing event; it is done by professionals, serving professionals. Therefore, if you are passionate about the new aspects of PLM, no matter how you label them, I hope you come and join, discuss, and most of all, learn.

Conclusion

 

Modern life is about continuous learning  – make it a habit. Even a holiday is again a way to learn to disconnect.

You will see after the holidays how disconnected I was.

In my previous post, The PLM Blame Game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities need to be developed on top of a customizable infrastructure, providing more flexibility. I believe there has been a debate about this topic for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly Aerospace/Defense) or home-grown BOM systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, the first OOTB PLM developments appeared. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industries. Instead of document management, they focused on bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB-delivery as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers the other implementers as competitors, and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same currently happens in the Aras channel – it might be called open source; however, probably only the high-level infrastructure is.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted it to be.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: the customer most of the time had different current practices, and secondly, the money came from services. So why say no to a customer if you can say yes?

OOTB and modules

Initially, for the leading PLM vendors, their mid-market templates were not just aiming at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant for the PLM vendors that they had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior; otherwise, the value of the module would not be relevant. And this causes a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management: people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), nor by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified upfront becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, in reality, they can be separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider becomes responsible for the upgradability and extendibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed for the future, for example, digital twins, AR/VR, and model-based ways of working. Skeptics might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages are advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often, modules are developed based on the experiences of some leading customers (the big ones), while the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB situation. SMB implementations are often not visible at the PLM vendor's R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you have the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford supposedly said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and methodology for this innovative way of working need to come from either your company's resources or your implementer's skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others – don't share your IP and best practices in order to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk might be that you are creating a legacy for your company that will slow you down later in time.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.

Experience: when you plan to implement PLM "big" with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years if design decisions were not documented or analyzed. Having experience, or an experienced partner/coach, can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as implementing PLM is not its core business. Experienced PLM implementers vary in size and skills – make the right choice.

 

Conclusion

After writing this post, I still cannot give a final verdict on which is the best approach. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.

After two reposts, I finally have the ability to write at full speed again, and my fingers were itching, having read some postings in the past four weeks. It started with Verdi Ogewell's article on Engineering.com, Telecom Giant Ericsson Halts Its PLM Project with Dassault's 3DEXPERIENCE, followed by the Aras blog post Don't Be a Dinosaur from Mark Reisig, and, of course, Oleg Shilovitsky's post: What to learn from Ericsson PLM failure?

Setting the scene

Verdi's article is quite tendentious, based on outside observations and insinuations. I let you guess who sponsored this article. If I had to write an article about this situation,

I would state: Ericsson and Dassault failed to migrate the old legacy landscape into a new environment – an end-to-end migration appeared to be impossible.

The other topics mentioned are not relevant to the current situation.

Mark chimes in on Verdi's truth with points not relevant to data migration, suggesting PLM is chosen over dinner. Of course, decisions are not that simple. It is not clear from Mark's statement who the dinosaurs are:

Finally, don’t bet your future on a buzzword. Before making a huge PLM investment, take the time to make sure your PLM vendor has an actual platform. Have them show you their spider chart.  And here’s the hard reality: they won’t do it, because they can’t.

Don’t be a dinosaur—be prepared for the unexpected with a truly resilient digital platform.

I would state, “Don’t bet your future on a spider chart” if you do not know what the real problem is.

 

Oleg's post, finally, is more holistic, acknowledging that a full migration might not be the right target, and I like his conclusion:

Flexibility Vs. Out of the box products – which one do you prefer? Over-customize a new PLM to follow old processes? To use a new system as an opportunity to clean existing processes? To move 25,000 people from one database to another is not a simple job. It is time to think about no upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT together with the PLM vendor to run a migration path. The last one is a costly step. But… without this step, the current database oriented single-version of truth PLM paradigm is doomed.

The Migration Problem

I believe migration of data – and sometimes the impossibility of data migration – is the biggest elephant in the room when dealing with PLM projects. In 2015 during the PI PLM conference in Dusseldorf, I addressed this topic for the first time: The Challenge of PLM Upgrades.
You can find the presentation on SlideShare here.

I shared an example similar to the Ericsson case from almost ten years ago. At that time, one of the companies I was working with wanted to replace its mainframe application, which managed the configuration of certain airplanes. The application managed the aircraft configuration structures in tables, where needed pointing to specifications in a document repository. The two systems were not connected; integrity was guaranteed through manual verification procedures.

The application was considered the single version of the truth and had been treated like that for decades. The reason for migration was that the knowledge of the application was disappearing: the tables were documented, but the logic was not. Besides this issue, the maintenance costs for the mainframe were high – vendor lock-in existed at that time too.

The idea was to implement SmarTeam – flexible data model, rapid deployment based on Windows technology – to kill two birds with one stone: the latest Microsoft technology, and meanwhile a direct link to the controlled documents. As they were using CATIA V5, the SmarTeam integration was a huge potential benefit. For the migration of data, the estimate was two months. What could go wrong?

Well, technically, almost nothing went wrong. The challenge was to map the relational tables to the objects in the SmarTeam data model. As the relational tables contained a mix of document and item attributes, splitting these tables was not always easy. Sometimes the same property appeared with different values in the original tables – which one was the truth? The migration took almost two years, also due to the limited availability of the last knowledgeable resource who could explain the logic.
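To illustrate the kind of mapping problem we ran into, here is a minimal sketch in Python – not the actual SmarTeam migration, and all table, attribute and part names are hypothetical. It splits a legacy row that mixes item and document attributes into two target objects, and reports the properties where the source rows disagree:

```python
from collections import defaultdict

# Hypothetical mapping: which legacy columns belong to which target object.
ITEM_FIELDS = {"part_number", "effectivity", "config_position"}
DOC_FIELDS = {"doc_number", "doc_revision", "spec_title"}

def split_row(row: dict) -> tuple[dict, dict]:
    """Split one legacy table row into an item object and a document object."""
    item = {k: v for k, v in row.items() if k in ITEM_FIELDS}
    doc = {k: v for k, v in row.items() if k in DOC_FIELDS}
    return item, doc

def find_conflicts(rows: list[dict], key: str, field: str) -> dict:
    """Report keys whose rows carry different values for the same field -
    the 'which one is the truth?' cases that need a human decision."""
    seen = defaultdict(set)
    for row in rows:
        seen[row[key]].add(row.get(field))
    return {k: vals for k, vals in seen.items() if len(vals) > 1}

legacy_rows = [
    {"part_number": "P-100", "effectivity": "AC-001..AC-050",
     "config_position": "Fuselage", "doc_number": "D-42",
     "doc_revision": "B", "spec_title": "Frame spec"},
    {"part_number": "P-100", "effectivity": "AC-001..AC-050",
     "config_position": "Fuselage", "doc_number": "D-42",
     "doc_revision": "C", "spec_title": "Frame spec"},  # conflicting revision
]

item, doc = split_row(legacy_rows[0])  # the mechanical part of the job
print(find_conflicts(legacy_rows, key="part_number", field="doc_revision"))
# {'P-100': {'B', 'C'}} -> cannot be resolved automatically
```

The splitting is the easy, mechanical part; the conflict report is where the two years went, because only someone who knew the original logic could decide which value was the truth.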

After the conversion, the question remained whether the migrated data was accurate. Perhaps 99 %?
But what if the remaining percent was critical? For this company, the data was significant, but not mission-critical as at Ericsson, where a lot of automation and rules are linked together across loads of systems.

So my point: Dassault failed at Ericsson, and so would Siemens, Aras, or any other PLM vendor, as the migration issue is not in the technology – we should stop thinking about this kind of migration.

Who are the dinosaurs?

Mark is in a way suggesting that when you use PLM software from the “old” PLM vendors, you are a dinosaur. Of course, this is a great marketing message, but the truth is that it is not the PLM vendor who is to blame. Yes, some create more friction than others in some instances, but in my opinion, there is no ultimate single PLM vendor.

Have a look at the well-known Daimler case from some years ago, which made the news because Daimler decided to replace CATIA with NX. Not because NX was superior – it was about maintaining the PLM backbone Smaragd, which would be hard to replace. Even in 2010, there was already the notion that the existing data management infrastructure is hard to replace. See a more neutral article about this topic from Monica Schnitger if you want: Update: Daimler chooses NX for Smaragd. Here too, in the end, it became a complete Siemens account for compatibility reasons.

When you look at the significant wins Aras mentions in their customer base – GM, Schaeffler or Airbus – you will probably discover Aras is more the connection layer between legacy systems, old PLM or PDM systems. They are not the new PLM replacing the old PLM. A connection layer creates a digital thread, connecting various data sources for traceability, but it does not provide digital continuity, as the data in the legacy systems is untouched. Still, it is an intermediate step towards a hybrid environment.
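To make the distinction concrete, here is a minimal sketch, assuming a hypothetical part number resolved across three imaginary legacy systems. The connection layer can trace everything related to the part (the digital thread), but each record still lives and changes in its own silo, so nothing guarantees consistency (no digital continuity):

```python
# Hypothetical legacy sources - in reality these would be three different
# systems with their own databases and APIs.
LEGACY_PDM = {"P-100": {"cad_model": "P-100.CATPart", "revision": "C"}}
LEGACY_ERP = {"P-100": {"material": "Al-7075", "plant": "Plant-2"}}
LEGACY_QMS = {"P-100": {"open_ncr": 2}}

def digital_thread(part_number: str) -> dict:
    """Collect references to the same part from each untouched legacy source."""
    return {
        "pdm": LEGACY_PDM.get(part_number),
        "erp": LEGACY_ERP.get(part_number),
        "qms": LEGACY_QMS.get(part_number),
    }

# Traceability: we can see everything related to P-100 in one view ...
print(digital_thread("P-100"))
# ... but the three records remain independently owned and updated -
# the thread connects them, it does not make them one dataset.
```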

For me, the real dinosaurs are the large enterprises that implemented their proprietary PLM environments in the previous century and built a fully automated infrastructure based on custom data models with a lot of proprietary rules. This was the case at Ericsson, but most traditional automotive and aerospace companies share this problem, as they were the early PLM adopters. And they are not the only ones. Many industrial manufacturing companies suffer from the past, unlike their Asian competitors, who can start with less legacy.

What’s next?

It would be great if the PLM community focused more on the current incompatibility of data between current/past concepts and future digital needs, and discussed solution paths (for sure, standards will pop up).

Incompatibility means: do not talk about migration, but rather focus on a hybrid landscape with legacy data, managed in a coordinated manner, and modern, growing digital PLM processes based on a connected approach.

This is the discussion I would like to see, instead of vendors claiming that their technology is the best. None of the vendors will talk about this topic – the old “rip-and-replace” approach is what brings the most software revenue, combined with the simplification that there is only OnePLM. It is interesting to see how many companies have a kind of OnePLM or OneXXX statement.

The challenge, of course, is to implement a hybrid approach. To have the two different PLM concepts work together, there is a need to create a reliable overlap. The reliable overlap can come from an enterprise data governance approach, if possible based on a normalized PLM data model. So far, all PLM vendors I know have proprietary data models; only ShareAspace from Eurostep is based on the PLCS standard, but their solutions are most of the time part of a larger PLM infrastructure (the future!).

To conclude: I look forward to discussing this topic with other PLM peers that are really in the field, discovering and understanding the chasm between the past and the future. Contact me directly or join us at the PLM Roadmap and PDT Europe, 13-14 November in Paris. Let’s remain fact-based!
(As a matter of fact, you can still contribute – the call for papers is still open.)

 

 

 

This time, a post that has been on the table for a long time already – the importance of having established processes, in particular when implementing PLM. By nature, most people hate processes, as they might give the impression that personal creativity is limited, whereas large organizations love processes, as for them this is the way to guarantee predictable performance. So let’s have a more in-depth look.

Where processes shine

In a transactional world, processes can be implemented like algorithms, assuming the data to be processed has the right quality. That is why MRP (Material Requirements Planning) and ERP (Enterprise Resource Planning) don’t have the mindset of personal creativity. It is about optimized execution driven by financial and quality goals.
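To make the algorithm point concrete, here is a minimal sketch of the classic MRP netting calculation in Python – illustrative figures only, not any vendor’s implementation:

```python
def net_requirement(gross_demand: int, on_hand: int,
                    on_order: int, safety_stock: int) -> int:
    """Classic MRP netting: order whatever demand plus safety stock
    requires beyond current inventory and open orders."""
    shortfall = gross_demand + safety_stock - on_hand - on_order
    return max(shortfall, 0)

# 120 parts needed, 30 in stock, 50 already on order, 10 kept as safety stock
print(net_requirement(gross_demand=120, on_hand=30,
                      on_order=50, safety_stock=10))  # -> 50
```

The outcome is deterministic, as long as the inventory and demand figures are correct – which is exactly why these systems depend on data quality rather than creativity.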

When I started my career in the early days of data management, before it was called PDM/PLM, I learned that there is a need for communication related to product data. Terms like revisions and versions started to pop up, combined with change processes. Some companies began to talk about configuration management.

Companies were not thinking PLM along the whole lifecycle. It was more PDM for engineering and ERP for manufacturing. Where PDM was ultimately a document-control environment, ERP was the execution engine relying on documented content, but not necessarily connected. Unfortunately, this is still the case at many companies, and it has to do with the mindset. Traditionally, a company’s performance has been measured based on financial reporting coming from the ERP system. Engineering was an unmanageable cost in the eyes of the manufacturing company’s management and the ERP software vendors.

In the middle of the nineties (the previous century now!), I had a meeting with an ERP country manager to discuss a potential partnership. The challenge was that he had no clue about the value of and complementary need for PLM. Even after discussing with him the differences between iterative product development (with revisioning) and linear execution (on the released product), his statement was:

“Engineers are just resources that do not want to be managed, but we will get them”

Meanwhile, I can say this company has changed its strategy, giving PLM a space in their portfolio combined with excellent slides about what could be possible.

To conclude: for linear execution, the meaning of processes is more or less close to algorithms, and where there is no algorithm, the individual steps in place are predictable, with their own KPIs.

Process certification

As I mentioned in the introduction, processes were established to guarantee a predictable outcome, in particular when it comes to quality. For that reason, in the previous century, when globalization started, companies were somehow forced to get ISO 900x certified. The idea behind these certifications was that a company had processes in place to guarantee an expected outcome, and, when they failed, procedures in place to fix the gaps. Companies were doing this because there was no social internet yet to name and shame bad companies. Having ISO 900x certification was the guarantee of delivering quality. In the same perspective, we could see configuration management as a system of best practices to guarantee that product information was always correct.

Certification was and is heaven for specialized external auditors and consultants. To get certified, you needed to invest people and time to describe your processes, and once these processes were defined, there were regular external audits to ensure the quality system was being followed. The beauty of this system: the described procedures were more or less “best intentions”, not enforced. When the auditor came, the company had to play some theater that processes were followed, the auditor would find some improvements for next year, and the management was happy that certification was passed.

This changed early this century. In particular, mid-market companies were no longer motivated to keep up this charade. The quality process manual remained a source of inspiration, but external audits were no longer needed. Companies were globally connected and reviewed, so reputation information could be sourced easily.

The result: there are documented quality procedures, and there is reality. The more disconnected employees became in a company due to mergers or growth, the more individual best practices became the way to deliver the right product and quality, combined with accepted errors and fixes downstream or later. The hidden cost of poor quality is still a secret within many companies. Talking with employees, they all have examples where their company lost a lot of money due to quality mistakes. Yet in less regulated industries, there is no standard approach, like CAPA (Corrective And Preventive Actions), APQP or 8D, to solve it.

Configuration Management and Change Management processes

When it comes to managing the exact definition of a product – either an already manufactured product or a product currently being made – there is a need for configuration management. Before there were PLM systems, configuration management was done through procedures defining configurations based on references to documents with revisions and versions. In the aerospace industry, separate systems for configuration management were developed to ensure the exact configuration of an aircraft could be retrieved at any time. Less regulated industries used a document-based procedural approach, as strict as possible. You can read about the history of configuration management and PLM in an earlier blog post: PLM and Configuration Management – a happy marriage?

With the introduction of PDM and PLM systems, more and more companies wanted to implement their configuration management, and in particular their change management, inside the system, as the changes are always related to product information that can reside in a PLM system. A change to a part can be proposed (ECR), analyzed and approved, leading to the implementation of the change (ECO), which is based on changed specifications, designs (3D models/drawings) and more. You can read the basics here: The Issue and ECR/ECO for Dummies (Reprise)

The Challenge (= Problem) of Digital Processes

More and more companies are implementing change processes fully in PLM, and this is the point that creates the most friction in a PLM implementation. The beauty of digital change processes is that they can be foolproof. No change goes unnoticed, as everyone is forced to follow the predefined procedures: either a type of fast track in case of lightweight (= low-risk) changes, or the full change process when the product is already in a mature state.
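A minimal sketch of what such a digital change process looks like as a state machine – the states and the routing rule are hypothetical, not a specific vendor’s workflow:

```python
# Hypothetical change tracks: low-risk changes skip analysis and board
# approval; mature products go through the full process.
FULL_TRACK = ["ECR", "Analysis", "CCB Approval", "ECO", "Implemented"]
FAST_TRACK = ["ECR", "ECO", "Implemented"]

class ChangeProcess:
    def __init__(self, change_id: str, low_risk: bool):
        self.change_id = change_id
        self.track = FAST_TRACK if low_risk else FULL_TRACK
        self.step = 0

    @property
    def state(self) -> str:
        return self.track[self.step]

    def advance(self) -> str:
        """Move to the next state; no state can be skipped - this is what
        makes the digital process foolproof (and feel administrative)."""
        if self.step < len(self.track) - 1:
            self.step += 1
        return self.state

change = ChangeProcess("CHG-0042", low_risk=True)
while change.state != "Implemented":
    print(change.change_id, "->", change.advance())
# CHG-0042 -> ECO
# CHG-0042 -> Implemented
```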

As with the ISO 900x processes, the PLM implementer often plays the role of the consultancy firm that needs to recommend to the company how to implement configuration management and change processes. The challenge here is that the company most of the time does not have a standard view of its change processes, and for sure the standard change management inside PLM is not identical to its processes.

Here the battle starts….

Management believes that digital change processes, preferably out-of-the-box, are crucial to implement, whereas users feel their job becomes more an administrative job than a creative job. Users that create information don’t want to be bothered with decisions about numbering and revisioning.

They expect the system to do that easily for them – which does not happen, as old procedures, responsibilities, and methodologies do not align with the system. Users are not measured or challenged on data quality; they are measured on the work they deliver that is needed now. Let’s first get the work done before we make sure everything is consistently defined in the PLM system.

Digital Transformation allows companies to redefine the responsibilities for users related to the data they produce. It is no longer a 3D Model or a drawing, but a complete data set with properties/attributes that can be shared and used for analysis and automation.
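As a minimal sketch of that shift – with illustrative attribute names, not a prescribed data model – the deliverable becomes a structured data set whose attributes downstream consumers can use directly, without opening a drawing:

```python
from dataclasses import dataclass, asdict

@dataclass
class PartDataset:
    part_number: str
    mass_kg: float
    material: str
    supplier: str
    model_ref: str  # the 3D model is one attribute, not the whole deliverable

part = PartDataset("P-100", 2.35, "Al-7075", "ACME", "P-100.step")

# Automation reads attributes instead of interpreting documents,
# e.g., an automated mass roll-up over a (here trivial) BOM:
total_mass = sum(p.mass_kg for p in [part])
print(asdict(part), total_mass)
```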

Conclusion

Implementing digital processes for PLM is the most painful but required step for a successful implementation. As long as data and processes are not consistent, we can keep on dreaming about automation in PLM. Therefore, digital transformation inside PLM should focus on new methods and responsibilities to create a foundation for the future. Without agreement on the digital processes, there will be growing inefficiency in the future.

 

I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the “dark days,” and the celebration of new life.

Of course, my PLM-twisted brain never stops associating and looking for an analogy. Last week, I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:

“Digital Transformation – it used to be called PLM,” said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.

Was this my Easter egg surprise? I thought we were in the middle of the PLM Renaissance, as some other vendors and consultants call this era. Have a look at a recent Engineering.com TV report: Turning PLM on its head

All jokes aside, the speech from Peter Schroer contained some interesting statements, and I want to elaborate on them in this post, as the space to comment on LinkedIn is not designed for a long answer.

PLM is Digital Transformation?

In the past few years, there has been a discussion about whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber’s 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC’s vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.

Also during the PDT conferences, we discussed whether PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.

For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at this moment is the one from caterpillar to butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question is even: will there be a full metamorphosis at the end, or will we keep on working in two different modes of operation?

However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future: PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditionally mapped PLM capabilities.

You can see these differences already by taking an in-depth look at how Oracle, SAP or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM vendors. So Digital Transformation is not the replacement of PLM.

Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation

Myth #1: DX leverages disruptive tech

Peter Schroer:

 It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers

Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing data accessible along the whole lifecycle with the right quality.

This means you need to have a management team that understands the need for data governance and data quality, and understands the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy and execution of the strategy – marketing does not make it happen.

 

Myth #2: DX results in increased market share, revenue, and profit

Peter Schroer:

Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.

Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and therefore increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven’t transformed their PLM-domain, but other parts of their businesses.

Also, it is interesting to read a 2017 McKinsey post: The case for digital reinvention, where you will get the confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. It is interesting to see where companies focus their digital strategies – picture below:

While only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners: connecting ecosystems (in real time).

Myth #3: Market leaders are the most successful at DX

Peter Schroer:

If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continuing to shrink below 20 years.

How to Attain Sustainable Digital Transformation

– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.

– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.

– Align people & processes.  Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.

Here I agree with Peter on the business side, be it that some of the current market leaders are already digital – look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.

Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them. Learning to work in a different mode.

And this is extremely difficult for a market leader, as for a market leader, the fear of the changing outside world is often not felt. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.

I wrote about the middle management challenge in my last blog post: The Middle Management dilemma. Almost in the same week there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
Conclusion: it is not (only) about technology, as some of the tech geeks may think.

Conclusion

Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing. Probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insights into all aspects. I hope this post gave you some more ideas, and please read the connected articles – they are quite relevant if you want to discover what’s below the surface.

Image:  21stcenturypublicservant.wordpress.com/

I have talked a lot in the past years about Digital Transformation, and in particular its relation to PLM. This time I want to focus a little more on Digital Transformation and my observations related to big enterprises versus small and medium enterprises. I will take you from the top, the C-level, to the work floor, and then try to reconnect through the middle management. As you can imagine from the title of this post, there is a challenge. And I am aware I am generalizing for the sake of simplicity.

Starting from the C-level of a large enterprise

Large and traditional enterprises are having the most significant challenge when aiming at a digital transformation for several reasons:

  • They have shareholders that prefer short-term benefits over long-term, promising but unclear, higher benefits. Shareholders most of the time have no personal interest in these companies; they just want to earn money above average growth.
  • The CEO is the person who defines the strategy, which has to come with a compelling vision to inspire the shareholders, the customers and the employees of the company – most of the time in that order of priority.
  • The role of the CEO is to prioritize investments and stop or sell core components to make the transformation affordable. Every transformation is about deciding what to stop, what to start and what to maintain.
  • After four to seven years (the seven-year itch), it is time for a new CEO to create new momentum, as you cannot keep the excitement up too long.
  • Meanwhile, the stop-activities create fear within the organization – people start fearing for their jobs – and the start-activities are most of the time of such a small scale that their successes are not yet seen. So on the work floor, there will be reservations about what’s next.

Companies like ABB, Ericsson, GE, and Philips – in alphabetical order – are all in several stages of their digital transformation, and in particular I have followed GE, as they were extremely visible and ambitious. Meanwhile, it is fair to say that the initial digital transformation plan from GE has stalled, and a lot of lessons have been learned from that.

If you have time, read this article: The Only Way Manufacturers Can Survive – by Vijay Govindarajan & Jeff Immelt (you need to register). It gives useful insights into what the strategy and planning for digital transformation were. And note: PLM is not even mentioned there 🙂

Starting from the C-level of a small and medium enterprise

In a small or medium enterprise, the distance between the C-level and the work floor is most of the time much shorter, and chances are that the CEO is a long-term company member, in case of a long-standing family-owned business. In this type of company, a long-term vision can exist, and you could expect digital transformation to be more sustainable there.

Unfortunately, most of the time it is not, as the C-level is often more occupied with current business strategies and capabilities close to their understanding, instead of investing energy and time to digest the full impact of a digital transformation. These companies might invest in the buzzwords you hear in the market – IoT, digital twins and Augmented Reality/Virtual Reality – all very visionary topics, however of low value when they are implemented in an isolated way.

In this paragraph, I also need to mention the small and medium enterprises that are in the hands of an investment company. Here I feel sorry, as the investment company is most of the time trying to optimize the current ways of working by simplifying or rationalizing the business, not creating a transformative vision (as they do not have the insights). In this type of company, you will see the same kind of investments as in the other category of small and medium enterprises, be it on a lesser scale.

Do people need to change?

Often you hear that the problem with any change within companies is that people do not want to change. I think this is too much of a generalization. I have worked in the past five years with several companies where we explored the benefits and capabilities of PLM in a modern way, sometimes focusing on an item-centric approach, sometimes on a model-based approach. In all these engagements, there was no reluctance from the users to change.

However, there were two types of users in these discussions, whom I would characterize as evolutionary thinkers (most of them ten years or more in the company) and love-to-change thinkers (most of them five years or less in the company). The difference between these groups was that the evolutionary thinkers responded in the context of the existing business constraints, whereas the love-to-change thinkers were not yet touched by the “knowledge of how good everything was”.

For digital transformation, you need to create the love-to-change attitude while using the existing knowledge as a base to improve. And this is not a people change; it is an organizational change, where you need to enable people to work in their best mode. It needs to be an end-to-end internal change – not changing the people, but changing the organizational parameters: KPIs, divisions, departments, priorities. Have a look at this short movie – you can replace the word ERP with PLM – and you will understand why I like this movie (and the relaxing sound).

The Middle Management dilemma

And here comes my last observation. At the C-level, we can find inspiring, often outcome-based visions, talking about a more agile company, closer to the customer, empowered workers, etc. Then there is the ongoing business that cannot be disrupted and needs to perform – so the business units and departments all get their performance KPIs, merely keeping the status quo in place.

Also, new digital initiatives need to be introduced. They don’t fit in the existing business and are often started separately – like the GE Digital division, and you can read Jeff Immelt’s thoughts and strategy on how this could work (The Only Way Manufacturers Can Survive). However, as the majority of the business ran in the old mode, the Digital business became another silo in the organization, as the middle management could not be motivated to embed digital in their business (no KPIs, or very low significance of the new KPIs).

I talked about the hybrid/bimodal approach several times in my blog posts, most recently in The Challenges of a Connected Ecosystem.  One of the points that I did not address was the fact that probably nobody wants to work in the old mode anymore once the new approach is successful and scaled up.

When the new mode of business is still small, people will not care so much and continue business as usual. Once the new mode becomes the most successful part of the company, people do want to join this success if they can. And here the change effort is needed. An interesting article in this context is The End of Two-Speed IT from the Boston Consulting Group (2016). They already point at the critical role of middle management. Middle management can kill digital transformation or be part of it, by getting motivated and adopting it too.

Conclusion

Perhaps too much text in this post, and even more when you dive deeper into the provided links. Crucial, though, if you want to understand the digital transformation process in an existing company and the critical place of middle management. They are likely the killers of digital transformation if not given the right coaching and incentives. Just an observation – not a thought 😉
