
Last week I read Verdi Ogewell's article PTC puts the Needle to the Digital Thread on Engineering.com, where Verdi raised (and answered) the question of who is the most visionary PLM CEO: Bernard Charles from Dassault Systemes or Jim Heppelmann from PTC. Unfortunately, it is once again an advertorial that creates more haziness around modern PLM than it adds value.

People need education, and Engineering.com is/was a respected site for me, as they state in their about section (Engineering.com/about):

Valuable Content for Busy Engineers. Engineering.com was founded on the simple mission to help engineers be better.

Unfortunately, this is no longer the case in the PLM domain. In June, we saw an article related to the failing PLM migration at Ericsson – see The PLM migration dilemma. Besides the fact that a big-bang migration had failed at Ericsson, the majority of the article was based on rumors and suggestions, putting the sponsor of that article in a more favorable light.

Of course, Engineering.com needs sponsoring to host their content, and vendors are willing to spend marketing money on that. However, it would be fairer to mention in a footnote who sponsored the article – although per article you can guess. More sincere editors or bloggers disclose the sponsoring that might have influenced their opinion.

Now, why did the article PTC puts the Needle to the Digital Thread make me react?

Does a visionary CEO pay off?

It can be great to have a visionary CEO; however, does one make the company and its products/services more successful? For every successful visionary CEO, there are perhaps ten failing visionary CEOs whose vision the stock market or their customers did not catch.

There is no lack of PLM vision, as Peter Bilello mapped in 2014 when he charted the gaps between vision, available technology, and implementations at companies (leaders and followers). See below:

The tremendous gap between vision and implementation is the topic that concerns me the most. Modern PLM is about making data available across the enterprise or even across the company's ecosystem. It is about data democratization, which allows information to flow and to be presented in context, without the need to recreate this information.

And here the marketing starts. Verdi writes:

PTC’s Internet of Things (IoT), Industrial Internet of Things (IIoT), digital twin and augmented reality (AR) investments, as well as the collaboration with Rockwell Automation in the factory automation arena, have definitely placed the company in a leading position in digital product realization, distribution and aftermarket services

With this marketing sentence, we are eager to learn why

“With AR, for example, we can improve the quality control of the engines,” added Volvo Group’s Bertrand Felix, during an on-stage interview by Jim Heppelmann. Heppelmann then went down to a Volvo truck with the engine lifted out of its compartment. Using a tablet, he was able to show how the software identified the individual engine, the parts that were included, and he could also pick up the 3D models of each component and at the same time check that everything was included and in the right place.

Impressive – is it real?

The point is that this is the whole chain for digital product realization–development and manufacturing–that the Volvo Group has chosen to focus on. Sub-components have been set up that will build the chain, much is still in the pilot stage, and a lot remains to be done. But there is a plan, and the steps forward are imminent.

OK, so it is a pilot, and a lot remains to be done – but there is a plan. I am curious about the details of that plan, as a little later, we learn from the CAD story:

The Pro/ENGINEER “inheritor” Creo (engine, chassis) is mainly used for CAD and creation of digital twins, but as previously noted, Dassault Systémes’ CATIA is also still used. Just as in many other large industrial organizations, Autodesk’s AutoCAD is also represented for simpler design solutions.

There goes the efficient digital dream. Design data coming from CATIA needs to be recreated in Creo for digital twin support. Data conversion or recreation is an expensive exercise and needs to be reliable and affordable, as the value of the digital twin is gone once the data is incorrect.

In a digital enterprise, you do not want silos to work with their own formats, you want a digital thread based on (neutral) models that share metadata/parameters from design to service.
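To make this tangible, here is a minimal, hypothetical sketch (in Python, not any vendor's API) of what such a neutral model could look like: one shared record carries the parameters, and both a design view and a service view read the same data instead of recreating it in another format.

```python
# Minimal sketch of a neutral, data-driven record in a digital thread: the same
# part record carries its parameters from design to service, so downstream
# consumers read it in their own context instead of converting or recreating it.
# All names here are hypothetical illustrations, not a specific vendor's API.
from dataclasses import dataclass, field


@dataclass
class NeutralPartRecord:
    part_id: str
    revision: str
    parameters: dict = field(default_factory=dict)  # shared metadata, e.g. mass, material


def design_view(part: NeutralPartRecord) -> str:
    # Engineering consumes the parameters in a design context.
    return f"{part.part_id}/{part.revision}: material={part.parameters.get('material')}"


def service_view(part: NeutralPartRecord) -> str:
    # Service consumes the *same* record, in its own context, without conversion.
    return f"Spare part {part.part_id}, weight {part.parameters.get('mass_kg')} kg"


engine_mount = NeutralPartRecord("P-1042", "B", {"material": "EN-AW 6082", "mass_kg": 2.4})
print(design_view(engine_mount))
print(service_view(engine_mount))
```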

So I put the article aside and noticed that Oleg had already commented faster than me in his post: Does PLM industry need a visionary pageant? Oleg also refers to CIMdata, which confirmed in 2018 that the concept of a platform for product innovation (PIP), or "beyond PLM," is far from reality in companies. Most of the time, a PLM implementation is mainly a "beyond PDM" environment, not really delivering product data downstream.

I am wholly aligned with Oleg's technical conclusion:

What is my(Oleg’s) conclusion? PLM industry doesn’t need another round of visionary pageants. I’d call democratization, downstream usage and openness as biggest challenges and opportunities in PLM applications. Recent decades of platform development demonstrated the important role network platforms played in the development of global systems and services. PLM paradigm change from isolated vertical platforms to open network services required to bring PLM to the next level. Just my thoughts..

My comments to Oleg’s post:

(Jos) I fully agree we do not need more visionary PLM pageants. It is not about technology, and therefore I have to disagree with your point about Aras. You call democratization and openness of data a crucial point – and here I agree – be it that we probably disagree about how to reach this: through standards or through more technology. My main point to be made (this post) is that we need visionary companies that implement and rethink their processes and are willing to invest resources in that effort. Most digital transformation projects related to PLM fail because the existing status quo / middle management has no incentive to change. More thoughts to come

And this is the central part of my argumentation – it is not about technology (only).

Organizational structures are blocking digital transformation

Since 2014, I have been following several larger manufacturing companies on their path from pushing products to the market in a linear mode towards becoming a customer-driven, more agile, fast-responding enterprise. As this is done by taking advantage of digital technologies, we call this process digital transformation.

(image depicting GE’s digital thread)

What I have learned from these larger enterprises, with both Volvo Trucks and GE as examples, is that there is a vision for an end result. For GE, it is the virtual twin of their engines, monitored and improved by their Predix platform. For Volvo Trucks, we saw the vision in the quote from Verdi's article above.

However, these companies are failing to create a horizontal mindset inside their organizations. Data can only be used efficiently downstream if there is a willingness to work on collecting the relevant data upstream and delivering this information in an accessible, preferably data-driven format.

The Middle Management Dilemma

And this leads to my reference to middle management. Middle managers learn about the C-level vision and are pushed to make this vision happen. However, they are measured and driven to solve these demands, mainly within their own division or discipline. Yes, they might create goodwill for others, but when it comes to money spent or changing people’s responsibilities, the status quo will remain.

I wrote about this challenge in The Middle Management dilemma. Digital transformation is, of course, enabled by digital technologies, but that does not mean the technology creates the transformation. The crucial point lies in making companies more flexible in their operations while establishing better and new relations with customers.

It is interesting to see that the future of business lies in agile, multidisciplinary teams that can deliver incremental innovations to the company's portfolio, somehow going back to the startup culture inside a larger enterprise. Having worked with several startups, I have seen that in the beginning everyone is focused on the outcome as a whole – everyone contributes. Then, when the company grows, middle management is introduced, and most likely silos are created as the middle managers get their own profit & loss targets.

Digital Transformation myths debunked

This week Helmut Romer (thanks, Helmut) pointed me to the following HBR article: Digital does not need to be disruptive, where the following myths are debunked:

  1. Myth: Digital requires radical disruption of the value proposition.
    Reality: It usually means using digital tools to better serve the known customer need.
  2. Myth: Digital will replace physical.
    Reality: It is a "both/and."
  3. Myth: Digital involves buying start-ups.
    Reality: It involves protecting start-ups.
  4. Myth: Digital is about technology.
    Reality: It's about the customer.
  5. Myth: Digital requires overhauling legacy systems.
    Reality: It's more often about incremental bridging.

If you want to understand these five debunked myths, take your time to read the full article, which aligns very much with my argumentation, albeit that my focus is more on the PLM domain.

Conclusions

Vendor sponsoring at Engineering.com has not improved the quality of its PLM articles and creates misleading messages, especially as the sponsor is not mentioned and the sponsor is selling technology, while the gap between vision and reality is too big to compete on vision alone.

Transforming companies to take advantage of new technologies requires an end-to-end vision and a mindset based on achievable, incremental learning steps. The way your middle management is managed and measured needs to be reworked, as the focus shifts to horizontal flow and an understanding of customer/market-oriented processes.

 

The past three weeks I had time to observe some PLM vendors' marketing messages (Autodesk being the major newbie). Some of these messages led to discussions in blogs or (LinkedIn) forums. Always a good moment to smile and think about reality.

In addition, the sessions from PLM Innovation 2012 became available to the attendees (thanks MarketKey – good quality). I had the chance to see the sessions I missed. On my wish list was "The future of PLM Business Models," moderated by Oleg, where according to Oleg some interesting viewpoints came up. This relates to my post where I mentioned the various definitions of PLM.

All the above inspired me to write this post and made me realize that we keep on pushing misconceptions around PLM into our customers' minds, with the main goal of differentiating.

I will address the following four misconceptions. The last one is probably not a surprise, hence its position at the end, yet it is still sometimes taken for granted.

  1. PLM = PLM
  2. On the cloud = Open and Upgradeable
  3. Data = Process Support
  4. Marketing = Reality

1. PLM = PLM

It is interesting to observe that the definition of PLM is becoming more and more a marketing term instead of a common definition that applies to all.

Let me try to formulate once again a very generic definition that captures most of what PLM vendors aim to do.

PLM is about connecting and sharing the company’s intellectual property through the whole product lifecycle. This includes knowledge created at the concept phase going through the whole lifecycle till a product is serviced in the field or decommissioned.

Experiences from the field (services / customers / market input) serve again for the other lifecycle phases as input to deliver a better or innovative product.

Innovation is an iterative process. It is not only about storing data; PLM also covers the processes of managing the data, especially the change processes. Sharing data is not easy. It requires a different mindset: data is not created only for personal or departmental usage; it should also be findable and extendable by other roles in the organization. This all makes it a serious implementation, as aligning people is a business change, not an IT-driven approach.

This (too long) high-level PLM definition does not imply that you cannot do PLM without a PLM system. You might also have a collection of tools that together provide complete coverage of the PLM needs.

Oleg talks about DIY (Do It Yourself) PLM, and I have seen examples of Excel spreadsheets managing Excel spreadsheets and email archives. The challenge I see with this type of PLM implementation is that after several years it is extremely difficult for a company to change. Possible reasons: the initial gurus no longer work for the company, and new employees need years of experience to find and interpret the right data.

A quick and simple solution can become a burden in the long term if you analyze the possible risks.

Where in the early years of PLM it was mainly a Dassault Systemes, Siemens and PTC driven approach with deep CAD integrations, in later years other companies like Aras and now Autodesk started to shift the focus from classical PLM towards managing enterprise metadata. SAP PLM offers a similar approach. Deep integrations with CAD are the most complex parts of PLM, and by avoiding them you can claim your system is easier to implement, etc.

A single version of the truth is a fancy PLM expression. It would be nice if this were also valid for the definition of PLM. The PLM Innovation 2012 session on the future of PLM business models demonstrated that the vendors in this panel discussion had completely different opinions about PLM. So how can people inside their company explain to the management and others why they need PLM and which PLM they have in mind?

2. On the cloud = Open and Upgradeable

During the panel discussion, Grant Rochelle from Autodesk mentioned the simplicity of their software and how easily it will be upgradeable in the future. He also referred to Salesforce.com as a proof point: they provide online updates of the software without the customer having to do anything.

The above statement is true as long as you keep your business coverage simple and do not anticipate changes in the future. Let me share an analogy with SmarTeam and how it started in 1995.

At that time, SmarTeam was insanely configurable. The Data Model Wizard contained several PDM templates, and within hours you could create a company-specific data model. A non-IT-skilled person could add attributes, data types, anything they wanted, and build the application, almost the same as Autodesk 360. The only difference: SmarTeam was not on the cloud, but it ran on Windows, a revolution at that time as all serious PDM systems were Unix-based.

The complexity came, however, when SmarTeam started to integrate deeply with CAD systems. These integrations created the need for a more standardized data model per CAD system. And as SmarTeam R&D was not aware of each and every customer's implementation, it became hard to define a common business logic in the data (and to remain easily upgradable).

I foresee similar issues with the new cloud-based PLM systems. They seem to be very easy to implement (add what you want – it is easy). As long as you do not integrate with other systems, it remains safe. Integrating with other and future systems requires either a common data definition (which most vendors do not like) or specific integrations, which come with the cost of upgrading them later.
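As an illustration of the difference, here is a minimal sketch, with hypothetical field names of my own, of what a common data definition could look like: each system maps once to and from the shared definition, instead of every pair of systems maintaining its own point-to-point translation.

```python
# Hypothetical sketch of integrating via a common data definition: each system
# provides one mapping to/from the shared fields, so adding a new system does not
# require touching the existing ones. Field names are illustrative only.

CANONICAL_FIELDS = {"item_id", "description", "lifecycle_state"}


def from_cloud_plm(record: dict) -> dict:
    # Map a (hypothetical) cloud-PLM export onto the common definition.
    return {"item_id": record["number"],
            "description": record["title"],
            "lifecycle_state": record["state"]}


def to_erp(canonical: dict) -> dict:
    # Build a (hypothetical) ERP import from the same common definition.
    assert set(canonical) == CANONICAL_FIELDS, "record does not match the common definition"
    return {"material_number": canonical["item_id"],
            "material_text": canonical["description"],
            "status": canonical["lifecycle_state"]}


plm_record = {"number": "100-4711", "title": "Bracket", "state": "Released"}
print(to_erp(from_cloud_plm(plm_record)))
```

With a common definition, each connected system needs one mapping; with point-to-point integrations, the number of links, and of things to re-test at every upgrade, grows much faster.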

In the beginning, everything is always possible with a well-defined system. But be aware: looking back in history, every ten years a disruptive wave comes in, changing the scope and upgradability.

And to challenge the cloud-based PLM vendors: in the generic definition of PLM that I shared above, PLM also integrates design data.

3. Data = Process Support

Another misconception, which originates from the beginning of PLM, is the idea that once you have support for specific data in your system, you support the process.

First example: items defined in ERP. When engineers started to use a PDM system and began to define a new item, there were challenges. I had many discussions with IT departments, which claimed they did not need or want items in PDM. ERP was the source of an item, and when a designer needed a new item, (s)he had to create it in ERP. That way, there was a single definition of the item.

Or the designer had to request a new item number from the ERP system – and the message was: please do not request numbers too often, as we do not want to waste them.

Ten years later, this looks like a joke, as most companies have an integrated PDM/ERP process and understand that the initial definition of a new item comes from PDM and that at a certain stage the matured item is shared with (and completed by) the ERP system. It is clear that the most efficient way to create a new item is through PLM, as the virtual definition (specs / CAD data) also resides there and the information is handled in that context.
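For readers who like to see that flow spelled out, here is a minimal sketch; the states and names are my own illustration, not a specific vendor's workflow. The item is created and matured in PLM, and only a released item is handed over to ERP.

```python
# Minimal sketch of the integrated PDM/ERP item flow described above: the item is
# created and matured in PLM, and only a released item is handed over to ERP.
# States and function names are illustrative, not any vendor's implementation.

PLM_STATES = ["concept", "in_work", "released"]


class Item:
    def __init__(self, item_id: str, description: str):
        self.item_id = item_id
        self.description = description
        self.state = "concept"          # the definition starts in PLM, not in ERP

    def promote(self) -> None:
        # Move the item one step forward in its PLM lifecycle.
        idx = PLM_STATES.index(self.state)
        self.state = PLM_STATES[min(idx + 1, len(PLM_STATES) - 1)]


def share_with_erp(item: Item) -> dict:
    # ERP receives (and later completes) the item only once it has matured.
    if item.state != "released":
        raise ValueError(f"{item.item_id} is still '{item.state}', not ready for ERP")
    return {"item_id": item.item_id, "description": item.description}


bracket = Item("100-4711", "Mounting bracket")
bracket.promote()   # concept -> in_work
bracket.promote()   # in_work -> released
print(share_with_erp(bracket))
```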

A second, more current example is the fact that compliance is often handled in ERP. It is correct that when you manufacture a product for a specific target market, you need to have the compliance information available.

However, would you do this in your ERP system, where you are late (almost at the end of) the design lifecycle, or is it more logical to verify and check compliance at all times during your design stages? The process works much more efficiently and with a lower cost of change when done in PLM, but most companies still see ERP as their primary IT system and PLM as an engineering tool.

Finally, on this topic, a remark for the simplified PLM vendors. Having the ability to store, for example, requirements in your system does not mean you support a complete requirements management process. It is also about the change and validation of requirements, which should be integrated for the relevant roles during product definition (often CAD) and validation. As long as the data is disconnected, there is not such a big advantage compared to Excel.
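A minimal sketch of what "connected" means here, with hypothetical classes and fields of my own: a requirement that remembers which revision was validated immediately exposes that a change needs re-validation, something a disconnected spreadsheet row does not do.

```python
# Hypothetical sketch: a requirement linked to its validation status. A change
# bumps the revision, so the need for re-validation is visible in the data itself.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Requirement:
    req_id: str
    text: str
    revision: int = 1
    validated_revision: Optional[int] = None   # last revision that passed validation

    def change(self, new_text: str) -> None:
        # Changing the requirement invalidates earlier validation results.
        self.text = new_text
        self.revision += 1

    def validate(self) -> None:
        self.validated_revision = self.revision

    @property
    def needs_validation(self) -> bool:
        return self.validated_revision != self.revision


req = Requirement("REQ-017", "Engine cover shall withstand 85 °C")
req.validate()
req.change("Engine cover shall withstand 95 °C")
print(req.needs_validation)   # True: the change itself signals what must be re-checked
```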

4. Marketing = Reality

In the Future of PLM Business Models session, Oleg showed a slide with the functional architectures of the major PLM vendors. In the diagram, everything seems to be connected as a single system, but in reality this is usually not the case.

As certain components/technologies are acquired, they provide the process coverage, and only in the future can you imagine them working in an integrated way. You cannot blame marketing for doing so, as their role is to position their products in the most appealing way so customers will buy them. Without marketing, perhaps no one would buy a PLM system once they understood the details.

Autodesk, as a newcomer in PLM, has a strong background in marketing. This is understandable: similar to Microsoft, their main revenue comes from selling a large volume of products, whereas the classical PLM vendors often sell a combination of software, services and business change, and therefore at a different price point.

When Autodesk introduced AutoCAD in the eighties, it was a simple, open 2D CAD environment able to run on a PC. Autodesk's statement at that time: "We provide 80 percent of the functionality for 20 percent of the price."
Does this sound familiar nowadays?

As AutoCAD was a basic platform allowing customers and resellers to build their solutions on top of it, it became Autodesk's mid-market success.

The challenge with Autodesk PLM 360 is that, although the same logic seems to make sense, I believe the difficulty is not in the flexible platform. The challenge lies in the future, when people want to do more complex things with the system, like integrations with design and enterprise collaboration.

At that time you need people who can specify the change, guide the change and implement the change. And this is usually not a DIY job.

Autodesk is still learning to find the right PLM messages, I noticed recently. When attending the Autodesk PLM session during PLM Innovation 2012 (end of February), one of their launch customers, ElectronVault, presented their implementation – it took only two weeks!!! Incredible.

However, reading Rob Cohee's blog post at the end of March, he mentions ElectronVault again. Quote:

ElectronVault was searching for something like this for over two years and after 6 weeks they have implemented Project Management, EBOM, MBOM, and starting on their APQP project. Six Weeks!!!

As you see, four weeks later the incredible two weeks have become six weeks, and again everything is implemented. Still incredible, and I am looking forward to meeting ElectronVault in the future, as I believe they are a typical young company and will go through all the maturity phases a company goes through: people, processes and tools (in that order). A tool-driven implementation is more likely to slow down in the long term.

Conclusion: Misconceptions are not new. History can teach us a lot about what we experience now. New technology and new concepts can be a breakthrough. However, implementing them at companies requires organizational change, and that has been the biggest challenge for the past 100 years.
