
In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting out-of-the-box (OOTB) usage as much as possible, which drives toward a certain rigidness. In the other approach, the PLM capabilities need to be developed on top of a customizable infrastructure, providing more flexibility. I believe there has been a debate about this topic for more than 15 years without a decisive conclusion. Therefore I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly Aerospace/Defense) or home-grown BOM systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, there was the first development of OOTB PLM. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industry. Instead of document management, they focused on bringing the BOM from engineering to manufacturing, based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers, whose major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers the other implementers as competitors and is not willing to give up its IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same happens currently in the Aras channel – it might be called Open Source; however, that probably applies only to the high-level infrastructure.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted it to be.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner. For two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say No to a customer if you can say Yes?

OOTB and modules

Initially, for the leading PLM vendors, their mid-market templates were not just aimed at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant for the PLM vendors that they had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations) as a module, or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior. Otherwise, the value of the module would not be relevant. And this creates a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason: the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is the fact that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, they are separate systems with a minimum of integration with the core. The advantage, though, is that the PLM software provider now becomes responsible for the upgradability and extensibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed for the future, for example, digital twins, AR/VR, and model-based ways of working. Some skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial-and-error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages will be advantages in the toolkit approach and are therefore discussed in the next section. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on some leading customer experiences (the big ones), while the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB situation. SMB implementations are often not visible at the PLM vendor R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey. Starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you have the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and the users will have a better relationship in delivering capabilities and, therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford supposedly said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and the methodology to implement this innovative way of working need to come from either your company’s resources or your implementer’s skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices in order to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later in time.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into a situation where the customer blamed SmarTeam because customizations were possible – you can read about this example in an old LinkedIn post: the importance of a PLM data model.
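To make this more tangible, here is a minimal, hypothetical sketch (in Python, with all names invented – this is not taken from any actual SmarTeam implementation) of a classic data modeling trade-off: a generic key/value attribute store is maximally flexible, but every attribute search becomes a full scan, while an explicitly modeled, indexed attribute stays fast at enterprise scale.

```python
# Hypothetical illustration of a data modeling choice in a customizable
# PLM toolkit. The generic key/value approach is flexible, but forces a
# scan for every attribute search; a dedicated index stays fast.

# Flexible but slow: every attribute is an (item_id, name, value) row.
eav_rows = [
    (1, "part_number", "PN-1001"), (1, "status", "Released"),
    (2, "part_number", "PN-1002"), (2, "status", "In Work"),
]

def find_by_attribute(name: str, value: str) -> list[int]:
    # O(n) scan over all attribute rows - painful with millions of items
    return [item for (item, n, v) in eav_rows if n == name and v == value]

# Modeled explicitly: the frequently searched attribute gets its own index.
items_by_part_number = {
    "PN-1001": {"id": 1, "status": "Released"},
    "PN-1002": {"id": 2, "status": "In Work"},
}

print(find_by_attribute("part_number", "PN-1002"))  # linear scan -> [2]
print(items_by_part_number["PN-1002"])              # O(1) dictionary lookup
```

The same reasoning applies to database schemas: deciding which attributes deserve their own columns and indexes is exactly the kind of design decision that is hard to change once an implementation has scaled up.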

Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as it is not their core business to implement PLM. Experienced PLM implementers vary in size and skills – make the right choice.

Conclusion

After writing this post, I still cannot give a final verdict on which approach is the best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.

After my previous post about the PLM migration dilemma, I had several discussions with peers in the field about why this PLM bad news creates so much debate. For every PLM vendor, I could publish a failure story if I wanted to. However, the reality is that the majority of PLM implementations do not fail.

Yes, they can cause discomfort or friction in an organization, as implementing the tools often forces people to work differently. And working differently is often not anticipated by the (middle) management and therefore causes a mismatch in the people, process & tools paradigm.

So we love bad news in real life. We talk about terrorism while, meanwhile, a large number of people are dying through guns, cars, and even the biggest killer: mosquitoes. Fear stories sell better than success stories, and in particular in the world of PLM vendors, every failure of the competition is enlarged. However, there are more actors involved in a PLM implementation, and if PLM systems were that bad, they would not exist anymore and be replaced by ………?

Who to blame – the vendor?

Of course, it is the easiest way to blame the vendor, as their marketing is promising to solve all problems. However, when you look from a distance at the traditional PLM vendor community, you see they are in a rat race to deliver the latest and greatest technology ahead of their competition, often driven by some significant customers.

Their customers are buying the vision and expect it to be ready and industrialized, which is not the case – look at the digital twin hype or AI (Artificial Intelligence). Released PLM software is not at the same maturity level as office applications. Office applications do not innovate so much, have thousands of users during a beta cycle, and have no dependency on processes.

Most PLM vendors are happy when a few customers jump on their latest release, combined with the fact that implementations of the most recent version are not yet a push of a button. This might change in the long term if PLM vendors can deliver cloud-based solutions.

PLM implementations within the same industry might look the same but often vary a lot due to existing practices, which will not change due to the tool – so there is a need for customization or configuration.

PLM systems with strong business rules inside their core might develop more and more towards configuration, where PLM toolkit-like systems might focus on ease of customization. Both approaches have their pros and cons (in another blog post perhaps).

Another topic for blaming the vendor is lack of openness. You hear it in many discussions. If vendor X were open, they would not lock the data – a typical marketing slogan. If PLM vendors were completely open, to which standards should they adhere? Every PLM vendor has its preferred collection of tools – if you stay within their portfolio, you have a minimum of compatibility or interface issues.

This logic started already with SAP in the previous century. For PLM vendors, there is no business model for openness. For example, the SmarTeam APIs for connecting and extracting data are available free of charge, leading to no revenue for the vendor and significant revenue for service providers. Without any license costs, they can build any type of interface/solution. In the end, when the PLM vendor has no sustainable revenue, the vendor will disappear, as we saw between 2000 and 2010, when several stand-alone PLM systems disappeared.

So yes, we can blame PLM vendors for setting impossible expectations – coming to realistic expectations related to capabilities and openness is probably the biggest challenge.

Who to blame – the implementer?

The second partner in a PLM implementation is the implementation partner, often a specialized company related to the PLM vendor. There are two types of implementation partners – the strategic partners and the system integrators.

Let’s see where we can blame them.

Strategic partners, the consultancy firms, often have a good relationship with the management; they help the company to shape the future strategy, including PLM. You can blame this type of company for their lack of connection to the actual business. What is the impact on the organization of implementing a specific strategy, and what does this mean for current or future PLM?

Strategic partners should be the partner to support business change management, as they are likely to have experience with other companies. Unfortunately, this type of company does not have significant skills in PLM, as the PLM domain is just a small subset of the whole potential business strategy.

You can blame them for being useful in building a vision/strategy but failing to create a consistent connection to the field.

Implementation partners, the system integrators, are most of the time specialized in one or two PLM vendors’ software suites, although the smaller the implementation partner, the less broad their implementation skills. These implementation partners sometimes have built their own PLM best practices for a specific vendor and use this as a sales argument. Others just follow blindly what the vendor is promoting or what the customer is asking for.

They will do anything you request, as long as they get paid for it. The larger ones have loads of resources for offshore deliveries – the challenge you see here is that it might look cheap; however, it becomes expensive if there is no apparent convergence of the deliverables.

As I mentioned before, they will never say No to a customer and claim to fill all the “gaps” there are in the PLM environment.

You can blame implementation partners for focusing on making money from services. And they are right; to remain in business, your company needs to be profitable. It is like lawyers; they will invoice you based on their efforts. And the less you take on your plate, the more they will do for you.

The challenge for both consultancy partners and system integrators is to find a balance between experienced people, who really make it happen, and educating juniors to become experts too. Often the customer pays for the education of these juniors.

Who to blame – your company?

If your company is implementing PLM, then probably the perception is that you made all the effort to make it successful. You followed the advice of the strategic consultants, you selected the best PLM vendor and system integrator, you created a budget – so what could go wrong?

This all depends on your company’s ambition and scope for PLM.

Implementing the as-is processes

If your PLM implementation is just there to automate existing practices and store data in a central location, this might work out. And this is most of the time the case when PLM implementations are successful. You know what to expect, and your system integrator knows what to expect.

This type of project can run close to budget, and some system integrators might be tempted to offer a fixed price. I am not a fan of fixed-price projects, as you never know exactly what needs to be done. The system integrator might raise the target price by 20 – 40 % to cover their risk, or you as a company might select the cheapest bid – another guarantee for failure. A PLM implementation is not a one-time project; it is an ongoing journey. Therefore your choice needs to be sustainable.

My experience with this type of implementation is that it is easy to blame the companies here too. Often the implementation becomes an IT project, as business people are too busy running their day-to-day jobs; therefore, they only incidentally support the PLM project. The result is that at a specific moment, users confronted with the system do not feel connected to the new system – it was better in the past. In particular, configuration management and change processes can become waterproof, leaving no freedom for the users. Then the blaming starts – first the software, then the implementer.

But what if you have an ambitious PLM project as part of a business transformation?

In that case, the PLM platform is just one of the elements to consider. It will be the enabler for new ways of working, enabling customer-centric processes, multi-discipline collaboration, and more. All related to a digital transformation of the enterprise. Therefore, I mention PLM platform instead of PLM system. Future enterprises run on data through connected platforms. The better you can connect your disciplines, the more efficiently and faster your company will operate. This, as opposed to the coordinated approach, which I have addressed several times in the past.

A business transformation is a combination of end-to-end understanding of what to change – from management vision connected to the execution in the field. And as there is no out-of-the-box template for business transformation, it is crucial that a company experiments, evaluates, and, when successful, scales up new habits.

Therefore, it is hard to define upfront all the effort for the PLM platform and the implementation resources. What is sure is that your company is responsible for that, not an external party. So if it fails, your company is to blame.

Is everyone to blame?

You might have the feeling that everyone is to blame when a PLM implementation fails. I believe that is indeed the case. If you know in advance where all players have their strengths and weaknesses, a PLM implementation should not fail but be balanced with the right resources. Depending on the scope of your PLM implementation – is it a consolidation or a transformation? – you should take care that all stakeholders are participating in the anti-blame game.

The anti-blame game is an exercise where you make sure that the other parties in the game cannot blame you.

  • If you are a vendor – do not overcommit
  • If you are a consultant or system integrator – learn to say NO
  • If you are the customer – make sure enough resources are assigned – you own the project. It is your project/transformation.

Several times in the past, this was my job: I was asked to mediate in a stalling PLM implementation. Most of the time, it had become a blame game, missing the target of finding a solution that makes sense. Here, coaching from experienced PLM consultants makes sense.

Conclusion

Most of the time, PLM implementations are successful if the scope is well understood and not transformative. You will not hear a lot about these projects in the news as we like bad news.

To avoid bad news, challenging PLM implementations should make sure all parties involved are challenging the others to remain realistic and invest enough. The role of an experienced external coach can help here.

After two reposts, I finally have the ability to write at full speed again, and my fingers were itching, having read some postings in the past four weeks. It started with Verdi Ogewell’s article on Engineering.com, Telecom Giant Ericsson Halts Its PLM Project with Dassault’s 3DEXPERIENCE, followed by an Aras blog post, Don’t Be a Dinosaur, from Mark Reisig, and of course, I would say, Oleg Shilovitsky’s post: What to learn from Ericsson PLM failure?

Setting the scene

Verdi’s article is quite tendentious, based on outside observations and insinuations. I let you guess who sponsored this article. If I had to write an article about this situation,

I would state: Ericsson and Dassault failed to migrate the old legacy landscape into a new environment – an end-to-end migration appeared to be impossible.

The other topics mentioned are not relevant to the current situation.

Mark is chiming in on Verdi’s truth and points not relevant to data migration, suggesting PLM is chosen over dinner. Of course, decisions are not that simple. It is not clear from Mark’s statement who the Dinosaurs are:

Finally, don’t bet your future on a buzzword. Before making a huge PLM investment, take the time to make sure your PLM vendor has an actual platform. Have them show you their spider chart.  And here’s the hard reality: they won’t do it, because they can’t.

Don’t be a dinosaur—be prepared for the unexpected with a truly resilient digital platform.

I would state, “Don’t bet your future on a spider chart” if you do not know what the real problem is.

 

Oleg’s post finally is more holistic, acknowledging that a full migration might not be the right target, and I like his conclusion:

Flexibility Vs. Out of the box products – which one do you prefer? Over-customize a new PLM to follow old processes? To use a new system as an opportunity to clean existing processes? To move 25,000 people from one database to another is not a simple job. It is time to think about no upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT together with the PLM vendor to run a migration path. The last one is a costly step. But… without this step, the current database oriented single-version of truth PLM paradigm is doomed.

The Migration Problem

I believe migration of data – and sometimes the impossibility of data migration – is the biggest elephant in the room when dealing with PLM projects. In 2015 during the PI PLM conference in Dusseldorf, I addressed this topic for the first time: The Challenge of PLM Upgrades.
You can find the presentation on SlideShare here.

I shared an example similar to the Ericsson case from almost 10 years ago. At that time, one of the companies I was working with wanted to replace their mainframe application, which was managing the configuration of certain airplanes. The application managed the aircraft configuration structures in tables, where needed pointing to specifications in a document repository. The two systems were not connected; integrity was guaranteed through manual verification procedures.

The application was considered the single version of the truth and had been treated like that for decades. The reason for migration was that all the knowledge of the application was disappearing: the tables were documented, but the logic was not. And besides this issue, the maintenance costs for the mainframe were also high – vendor lock-in existed at that time too.

The idea was to implement SmarTeam – flexible data model, rapid deployment based on Windows technology – to catch two birds with one stone, i.e., the latest Microsoft technology and meanwhile a direct link to the controlled documents. As they were using CATIA V5, the SmarTeam integration was a huge potential benefit. For the migration of data, the estimate was two months. What could go wrong?

Well, technically, almost nothing went wrong. The challenge was to map the relational tables to the objects in the SmarTeam data model. And as the relational tables contained a mix of document and item attributes, splitting these tables was not always easy. Sometimes the same properties appeared with different values in the original tables – which one was the truth? The migration took almost two years, also due to the limited availability of the last knowledgeable resource who could explain the logic.
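To give a feeling for this kind of work, here is a minimal, hypothetical sketch (all table, field, and part names are invented – this is not the actual data model) of one migration step: splitting a legacy row that mixes item and document attributes into two target objects, and flagging conflicting values for human review.

```python
# Hypothetical sketch of one migration step: a legacy row mixes item and
# document attributes; split it into two target objects and flag the
# conflicts that only a knowledgeable human can resolve.

ITEM_FIELDS = {"part_number", "revision", "weight"}
DOC_FIELDS = {"doc_number", "doc_title", "file_path"}

def split_row(row: dict) -> tuple[dict, dict]:
    item = {k: v for k, v in row.items() if k in ITEM_FIELDS}
    doc = {k: v for k, v in row.items() if k in DOC_FIELDS}
    return item, doc

def find_conflicts(rows: list[dict], key: str, field: str) -> dict:
    """Collect keys where the same key carries different values for a field."""
    seen: dict = {}
    conflicts: dict = {}
    for row in rows:
        k, v = row[key], row.get(field)
        if k in seen and seen[k] != v:
            conflicts.setdefault(k, {seen[k]}).add(v)
        seen.setdefault(k, v)
    return conflicts

legacy = [
    {"part_number": "PN-1001", "revision": "B", "weight": "2.5",
     "doc_number": "D-77", "doc_title": "Spec", "file_path": "/vault/d77.pdf"},
    {"part_number": "PN-1001", "revision": "B", "weight": "2.7",
     "doc_number": "D-78", "doc_title": "Drawing", "file_path": "/vault/d78.tif"},
]
print(find_conflicts(legacy, "part_number", "weight"))
# {'PN-1001': {'2.5', '2.7'}} -> which weight is the truth?
```

Multiply this by hundreds of tables and decades of undocumented logic, and the two-year duration becomes understandable.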

After the conversion, the question still remained: was the migrated data accurate? Perhaps 99%?
But what if it was critical? For this company, it was significant, but not mission-critical like at Ericsson, where a lot of automation and rules are linked together between loads of systems.

So my point: Dassault has failed at Ericsson, and so will Siemens or Aras or any other PLM vendor, as the migration issue is not in the technology – we should stop thinking about this kind of migration.

Who are the dinosaurs?

Mark is in a way suggesting that when you use PLM software from the “old” PLM vendors, you are a dinosaur. Of course, this is a great marketing message, but the truth is that it is not the PLM vendor who is to blame. Yes, some create more friction than others in some instances, but in my opinion, there is no ultimate single PLM vendor.

Have a look at the well-known Daimler case from some years ago, which made the news because Daimler decided to replace CATIA with NX. Not because NX was superior – it was about maintaining the PLM backbone Smaragd, which would be hard to replace. Even in 2010, there was already the notion that an existing data management infrastructure is hard to replace. See a more neutral article about this topic from Monica Schnitger if you want: Update: Daimler chooses NX for Smaragd. Also here, in the end, it became a complete Siemens account for compatibility reasons.

When you look at the significant wins Aras is mentioning in their customer base – GM, Schaeffler or Airbus – you will probably discover Aras is more the connection layer between legacy systems, old PLM or PDM systems. They are not the new PLM replacing the old PLM. A connection layer creates a digital thread, connecting various data sources for traceability, but it does not provide digital continuity, as the data in the legacy systems is untouched. Still, it is an intermediate step towards a hybrid environment.
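To make the distinction concrete, here is a minimal, hypothetical sketch (all system and record names invented) of what such a connection layer stores: typed links between records that stay untouched in their source systems – traceability across systems, not one continuous data model.

```python
# Hypothetical sketch: a connection layer stores only links between records
# that remain in their source systems - traceability, not digital continuity.
from dataclasses import dataclass

@dataclass(frozen=True)
class ThreadLink:
    source_system: str   # e.g. "legacy_pdm"
    source_id: str
    target_system: str   # e.g. "erp"
    target_id: str
    relation: str

digital_thread = [
    ThreadLink("legacy_pdm", "DOC-4711", "old_plm", "ITEM-100", "specifies"),
    ThreadLink("old_plm", "ITEM-100", "erp", "MAT-900100", "implemented_by"),
]

def trace(system: str, record_id: str):
    """Follow links downstream from one record; the data itself is untouched."""
    for link in digital_thread:
        if (link.source_system, link.source_id) == (system, record_id):
            yield link
            yield from trace(link.target_system, link.target_id)

for link in trace("legacy_pdm", "DOC-4711"):
    print(link)
```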

For me, the real dinosaurs are those large enterprises that have been implementing their proprietary PLM environments in the previous century and have built a fully automated infrastructure based on custom data models with a lot of proprietary rules. This was the case at Ericsson, but most traditional automotive and aerospace companies share this problem, as they were the early PLM adopters. And they are not the only ones. Many industrial manufacturing companies suffer from the past, opposite to their Asian competitors, who can start with less legacy.

What’s next?

It would be great if the PLM community focused more on the current incompatibility of data between current/past concepts and future digital needs, and discussed solution paths (for sure, standards will pop up).

Incompatibility means: Do not talk about migration but probably focus on a hybrid landscape with legacy data, managed in a coordinated manner, and modern, growing digital PLM processes based on a connected approach.

This is the discussion I would like to see, instead of vendors claiming that their technology is the best. None of the vendors will talk about this topic – the old “rip-and-replace” approach is what brings the most software revenue, combined with the simplification that there is only OnePLM. It is interesting to see how many companies have a kind of OnePLM or OneXXX statement.

The challenge, of course, is to implement a hybrid approach. To have the two different PLM concepts work together, there is a need to create a reliable overlap. The reliable overlap can come from an enterprise data governance approach, if possible based on a normalized PLM data model. So far, all PLM vendors that I know have proprietary data models; only ShareAspace from Eurostep is based on the PLCS standard, but their solutions are most of the time part of a larger PLM infrastructure (the future!).

To conclude: I look forward to discussing this topic with other PLM peers that are really in the field, discovering and understanding the chasm between the past and the future. Contact me directly or join us at the PLM Roadmap and PDT Europe, 13-14 November in Paris. Let’s remain fact-based!
(as a matter of fact, you can still contribute – the call for papers is still open)

Unfortunately, one more time an old post, with some new comments in green, as I am not yet able to type at regular speed. I promise this will be the last reprise, as I am sure that one week from now I will be double-handed again. The reason I chose this six-year-old post is that the topic is still actual; however, at that time, digital transformation was not yet in fashion for PLM.

If you look at the comments to the article at that time (Feb 2013), you will see some well-known names and behaviors. What I can state for the moment: there are still people doubting there is a need for PLM, there are still people blaming technology for the lousy perception of PLM, and there is a large group of silent companies out there that have implemented the basics of PLM, perhaps not as advanced as vendors/consultants have suggested, and they are reaping the benefits.

The main question in upcoming blog posts: “Is this enough?” Happy rereading!

How come PLM is boring? – Feb 2013

PLM is a popular discussion topic in various blogs, LinkedIn discussion groups, PLM vendor websites, and for the upcoming Product Innovation Congress in Berlin. I look forward to the event to meet attendees and discuss their experience and struggle to improve their businesses using PLM. (Meanwhile, PI PLMx London has passed – for a review look here: The weekend after PI PLMx London 2019)

From the other side, talking about pure PLM becomes boring. Sometimes it looks like PLM is a monotheistic topic:

  • “What is the right definition of PLM ?” (I will give you the right one)
  • “We are the leading PLM vendor” (and they all are)
  • A PLM system should be using technology XYZ (etc., etc.)
  • Digital Transformation and IoT have come into the picture now

Some meetings with customers in the past three weeks and two different blog posts I read recently made me aware of this ambiguity between boring and fun.

PLM dictating Business is boring

Oleg Shilovitsky´s sequence of posts (and comments) starting with A single bill of materials in 6 steps was an example of the boring part. (Sorry Oleg, as you publish so many posts, there are many that I like and some I can use as an example.) When reading the BOM-related posts, I noticed they are a typical example of an IT or academic view on PLM, in particular on the BOM topic.

Will these posts help you after reading them? Do they apply to your business? Alternatively, do you feel more confused as a prolific PLM blogger makes you aware of all the different options and makes you think you should use a single bill of materials?

I learned from my customers and from coaching and mediating hundreds of PLM implementations that the single BOM discussion is one of the most confusing and complicated topics – for sure if you address it from the IT perspective.

The customer might say:
“Our BOM is already in ERP – so if it is a single BOM, you know where it is – goodbye !”.

A different approach is to start looking for the optimal process for this customer, addressing the bottlenecks and pains they currently face. It will be no surprise that PLM best practices and technology are often the building blocks for the considered solution. Whether it will be a single BOM or a collection of structures evolving through time depends on the situation, not on the ultimate PLM system.
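As a minimal, hypothetical illustration (all part numbers invented) of why the single BOM discussion is so confusing: the same product typically carries at least an engineering view and a manufacturing view of its structure, and these views evolve separately.

```python
# Hypothetical sketch: two views on the "same" BOM for one product.

ebom = {  # engineering view: the designed structure
    "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-30"],
}

mbom = {  # manufacturing view: the buildable structure
    "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-KIT-35", "PACKAGING-90"],
}

def view_diff(part: str) -> tuple[set, set]:
    """What manufacturing adds and drops compared to engineering."""
    e, m = set(ebom[part]), set(mbom[part])
    return m - e, e - m

added, dropped = view_diff("PUMP-100")
print("MBOM adds:", added, "| MBOM drops:", dropped)
```

Whether you model this as one structure with views or as linked separate structures is a business decision, not an IT one.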

Note: meanwhile, Oleg has further materialized his thinking through OpenBOM, and he has not lost his speed of publishing.

Business dictating PLM is fun

Therefore I was happy to read Stephen Porter´s opinion and comments in The PLM state: Penny-wise Pound Foolish Pricing and PLM (unfortunately this post has disappeared), where he passes a message similar to mine from a different starting point: the pricing models of PLM vendors. My favorite part is in his conclusion:

A PLM decision is typically a long term choice so make sure the vendor and partners have the staying power to grow with your company. Also make sure you are identifying the value drivers that are necessary for your company’s success and do not allow yourself to be swayed by the trendy short term technology

Management in companies can be confused, starting to think they just need PLM because they hear from the analysts that it improves business. They need to think first about solving their business challenges and changing the way they currently work to improve. Only then should they look for the way to implement this change.

Note: Stephen wrote at that time an interesting series of posts and promised a revival. However, I haven’t seen new posts. Did any of my readers see new material that I missed?

Changing the way to work is the problem, not PLM.

It is not the friendly user interface of PLM system XYZ or the advanced technical capabilities of PLM system ABC that will make a PLM implementation easier. Nothing is solved on the cloud or by using a mobile device. If there is no change when implementing PLM, why implement and build a system to lock yourself in even more?

This is what Thomas Schmidt (VP Head of Operational Excellence and IS at ABB’s Power Products Division) told us last year at PLM Innovation 2012 in Munich. He was one of the keynote speakers and surprised the audience by stating he did not need PLM!

He explained this by describing the business challenges ABB has to solve: Being a global company but acting around the world as a local company. He needed product simplification, part reduction among product lines around the world, compliance, and more.

Note: Thomas Schmidt meanwhile moved forward in his career, identifying himself as an experienced “Change Leader” in digital transformation, mentor and coach.

Another customer in a whole different industry mentioned they were looking to improve global instant collaboration, as the current information exchange is too slow and error-prone. Besides, they want to capitalize on the work done and make it accessible and reusable in the future, independent of the authoring tool. However, they do not call it PLM, as in their business nobody uses PLM!

Both cases should make a PLM reseller´s mouth water (watertanden in Dutch), as these companies are looking for critical capabilities available in most of the PLM systems. However, none of these companies asked for a single BOM or a service-oriented architecture. They wanted to solve their business issues. And for sure, it will lead to implementing PLM capabilities when business and IT people together define and decide on the right balance.

Unfortunately, here we still see a function-feature approach – if it is not there, we will build it.

Management take responsibility

Combining PLM and new business needs is the responsibility of management in these companies. It is crucial that a business issue (or a new strategy) is the driving force for a PLM implementation. PLM is not about automating what we have.

In too many situations, the management decides that a new strategy is required. One or more bright business leaders decide they need PLM (note: the strategy has now changed into buying and implementing a system). Together with IT, and after doing an extensive selection process, the selected PLM system (disconnected from the strategy) will be implemented.

I believe we read something about such a case recently

Moreover, this is the place where all PLM discussions come together:

  • why PLM projects are difficult
  • why it is unclear what PLM does.

PLM Vendors and Implementers are not connected anymore at this stage to the strategy or business. They implement technology and do what the customer project team tells them to do (or what they think is best for their business model).

Successful implementations are those where the business and management are actively involved during the whole process and the change.  Involvement requires a significant contribution from their side, often delegated to business and change consultants.

PLM Implementations usually lead to a crisis at some moment in time, when the business is not leading, and the focus is on IT and User Acceptance. In the optimal situation, business is driving IT. However, in most cases, due to lack of time and priorities from the business people, they delegate this activity to IT and the implementation team. So here it is a matter of luck if they will be successful: how experienced is the team?

Will they implement a new business strategy or just automate and implement the way the customer worked before, but now in a digital manner? Do we blame the software when people do not change?

Some notes here: I believe the disconnect between management/PLM vendors on one side and the people in the business on the other side has become more prominent due to the digital transformation hype. The hype is moving faster than the organization. Second point: I will not talk about people change anymore – organizations can change; people can adapt within a specific range. It is up to the organization where to push the limits.

Back to fun

I would not be so passionate about PLM if it was boring. However, looking back, the fun and enthusiasm do not come from PLM. The fun comes from a proactive business approach, knowing that motivating the people and preparing the change are defined first, before implementing PLM practices.

I believe the future success for PLM technologies will come when we know how to speak about and address real business value, and only then use (PLM) technologies to solve the business challenges.

PLM becomes a logical result, not the start. And don´t underestimate: change is required. What do you think – is it a dream?


Due to some physical inconvenience in the upcoming weeks, I will not be able to write a full blog post at this time. Typing with one finger is not productive.
A video post could be an alternative; however, for me, the disadvantage of a video message is that it requires the audience to follow all the information at a fixed speed – no fast or selective reading possible – and it is hard to archive and store in the context of other information. Putting pieces of information in a relevant context is a PLM mission.

So this time my post from December 2008, where I predicted the future for 2050. I think the predictions were not too bad – you will recognize some trends and challenges still ahead. Some newer comments are in italic green. I am curious to learn what you think after reading this post. Enjoy, and I am looking forward to your feedback.

PLM in 2050

As the year ends (December 2008), I decided to take my crystal ball to see what would happen with PLM in the future.

It felt like a virtual experience and this is what I saw:

  • Data is not replicated any more – every piece of information that exists will have a Unique Universal ID; some people might call it the UUID. In 2020 this initiative became mature, thanks to the merger of some big PLM and ERP vendors, who brought this initiative to reality. This initiative reduced the exchange costs in supply chains dramatically and led to bankruptcy for many companies providing translators and exchange software. (still the dream of a digital enterprise – see the small sketch after this list)
  • Companies store their data in ‘the cloud’ based on the previous concept. Only some old-fashioned companies still have their own data storage and exchange issues, as they are afraid someone will touch their data. Analysts compare this behavior with the situation in the year 1950, when people kept their money under a mattress, not trusting banks (and they were not always wrong) (we are getting there – still some years to go)
  • After 3D, an entire virtual world, based on holography, became the next step for product development and understanding of products. Thanks to the revolutionary quantum-3D technology, this concept could even be applied to life sciences. Before ordering a product, customers could first experience and describe their needs in a virtual environment (to be replaced by virtual twin / VR / AR)
  • Finally, the cumbersome keyboard and mouse were replaced by voice and eye-recognition. Initially, voice recognition (Siri, Alexa, please come to the PLM domain)
    http://www.youtube.com/watch?v=2Y_Jp6PxsSQ
    and eye tracking (some time to go still) were cumbersome. Information was captured by talking to the system and capturing eye movement when analyzing holograms. This made the life of engineers so much easier, as while analyzing and talking, their knowledge was stored and tagged for reuse. No need for designers to send old-fashioned emails or type their design decisions for future reuse (now moving towards AI)

  • Due to the hologram technology, the world became greener. People did not need to travel around the world, and virtual meetings with global teams became the standard (airlines discontinued business class). Even holidays could be experienced in the virtual world, thanks to a Dutch initiative based on the experience with coffee. (not sure why I selected this movie. Sorry ….)
    http://www.youtube.com/watch?v=HUqWaOi8lYQ
    The whole IT infrastructure was powered by efficient solar energy, reducing the amount of carbon dioxide dramatically
  • Then, with a shock, I noticed PLM no longer existed. Companies were focusing on their core business processes. Systems/terms like PLM, ERP, and CRM no longer existed. Some older people still remembered the battle between these systems to own the data and the political discomfort this gave inside companies (so true …)
  • As people were working so efficiently, there was no need to work all week. There were community time slots, when everyone was active, but 50 percent of the time, people had the time to recreate (to re-create or recreate was the question). Some older French and German designers remembered the days when they had only 10 weeks of holiday per year, unimaginable nowadays. (the dream remains)
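Coming back to the first bullet, a small sketch of the UUID idea (using Python’s standard uuid module; the record layout is invented for illustration): every piece of information gets a globally unique identifier at creation, so other systems can link to it instead of replicating it.

```python
# Minimal sketch: give every record a universally unique ID at creation,
# so it can be referenced across systems instead of copied and re-keyed.
import uuid

def new_record(payload: dict) -> dict:
    return {"uuid": str(uuid.uuid4()), **payload}

spec = new_record({"type": "specification", "title": "Pump housing"})
# Any other system now links to spec["uuid"] instead of replicating the data.
print(spec)
```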

As we still have more than 40 years to reach this future, I wish you all a successful and excellent 2009.

I am looking forward to being part of the green future next year.

I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the “dark days,” and the celebration of new life.

Of course, my PLM-twisted brain never stops associating and looking for an analogy. Last week I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:

“Digital Transformation – it used to be called PLM,” said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.

Was this my Easter Egg surprise? I thought we were in the middle of the PLM Renaissance as some other vendors and consultants talk about this era. Have a look at a recent Engineering.com TV-report: Turning PLM on its head

All jokes aside, the speech from Peter Schroer contained some interesting statements and I want to elaborate on them in this post as the space to comment in LinkedIn is not designed for a long answer.

PLM is Digital Transformation?

In the past few years, there has been a discussion about whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber’s 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC’s vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.

Also during the PDT conferences, we discussed whether PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.

For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at this moment is the one from caterpillar to butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question is even: will there be a full metamorphosis in the end, or will we keep on working in two different modes of operations?

However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future: PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditional PLM-mapped capabilities.

You can see these differences already by taking an in-depth look at how Oracle, SAP or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM Vendors. So Digital transformation is not the replacement of PLM.

Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation

Myth #1: DX leverages disruptive tech

Peter Schroer:

 It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers

Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing accessible data along the whole lifecycle with the right quality.

This means you need to have a management team that understands the need for data governance and data quality, and understands the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy and execution of the strategy – marketing does not make it happen.

Myth #2: DX results in increased market share, revenue, and profit

Peter Schroer:

Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.

Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and therefore increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven’t transformed their PLM-domain, but other parts of their businesses.

Also, it is interesting to read a 2017 McKinsey post: The case for digital reinvention, where you will get the confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. It is interesting to see where companies focus their digital strategies – see the picture in the McKinsey article.

While only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners: connecting ecosystems (in real-time).

Myth #3: Market leaders are the most successful at DX

Peter Schroer:

If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continuing to shrink below 20 years.

How to Attain Sustainable Digital Transformation

– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.

– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.

– Align people & processes.  Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.

Here I agree with Peter on the business side, be it that some of the current market leaders are already digital – look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.

Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them. Learning to work in a different mode.

And this is extremely difficult for a market leader, as a market leader often does not feel the fear of the changing outside world. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.

I wrote about the middle management challenge in my last blog post: The Middle Management Dilemma. Almost in the same week, there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
Conclusion: it is not (only) about technology, as some of the tech geeks may think.

Conclusion

Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing. Probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insights into all aspects. I hope this post gave you some more ideas, and please read the connected articles – they are quite relevant if you want to discover what’s below the surface.

Image:  21stcenturypublicservant.wordpress.com/

I have talked a lot in the past years about Digital Transformation and in particular its relation to PLM. This time I want to focus a little more on Digital Transformation and my observations related to big enterprises and to small and medium enterprises. I will take you from the top, the C-level, to the work floor, and then try to reconnect through the middle management. As you can imagine from the title of this post, there is a challenge. And I am aware I am generalizing for the sake of simplicity.

Starting from the C-level of a large enterprise

Large and traditional enterprises are having the most significant challenge when aiming at a digital transformation for several reasons:

  • They have shareholders that prefer short-term benefits above long-term, promising but unclear, higher benefits. Shareholders most of the time have no personal interest in these companies; they just want to earn money above the average growth.
  • The CEO is the person to define the strategy which has to come with a compelling vision to inspire the shareholders, the customers and the employees in the company – most of the time in that order of priority.
  • The role of the CEO is to prioritize investments and stop or sell core components to make the transformation affordable. Every transformation is about deciding what to stop, what to start and what to maintain.
  • After four to seven years (the seven-year itch), it is time for a new CEO to create a new momentum, as you cannot keep the excitement up too long.
  • Meanwhile, the Stop-activities are creating fear within the organization – people start fearing for their jobs – and the Start-activities are most of the time on such a small scale that their successes are not yet seen. So on the work floor, there will be reservations about what’s next.

Companies like ABB, Ericsson, GE, Philips – in alphabetical order – are all in several stages of their digital transformation, and in particular I have followed GE, as they were extremely visible and ambitious. Meanwhile, it is fair to say that the initial Digital Transformation plan from GE has stalled, and a lot of lessons have been learned from that.

If you have time – read this article: The Only Way Manufacturers Can Survive – by Vijay Govindarajan & Jeff Immelt (you need to register). It gives useful insights about what the strategy and planning were for digital transformation. And note: PLM is not even mentioned there 🙂

Starting from the C-level of a small and medium enterprise

In a small or medium enterprise, the distance between the C-level and the work floor is most of the time much shorter, and chances are that the CEO is a long-term company member in the case of a long-standing family-owned business. In this type of company, a long-term vision can exist, and you could expect digital transformation to be more sustainable there.

Unfortunately, most of the time it is not, as the C-level is often more active in current business strategies and capabilities close to their understanding, instead of investing energy and time to digest the full impact of a digital transformation. These companies might invest in the buzzwords you hear in the market – IoT, Digital Twins and Augmented Reality/Virtual Reality – all very visionary topics, however of low value when they are implemented in an isolated way.

In this paragraph, I also need to mention the small and medium enterprises that are in the hands of an investment company. Here I feel sorry, as the investment company is most of the time trying to optimize the current ways of working by simplifying or rationalizing the business, not creating a transformative vision (as they do not have the insights). In this type of company, you will see the same investments done as in the other category of small and medium enterprises, be it on a lesser scale.

Do people need to change?

Often you hear that any change within a company fails because people do not want to change. I think this is too much of a generalization. In the past five years, I have worked with several companies where we explored the benefits and capabilities of PLM in a modern way, sometimes focusing on an item-centric approach, sometimes on a model-based approach. In all these engagements there was no reluctance from the users to change.

However, there were two types of users in these discussions, which I would characterize as evolutionary thinkers (most of the time ten years or more in the company) and love-to-change thinkers (most of them five years or less in the company). The difference between these groups was that the evolutionary thinkers responded in the context of the existing business constraints, whereas the love-to-change thinkers were not yet touched by the “knowledge of how good everything was”.

For digital transformation, you need to create the love-to-change attitude while using the existing knowledge as a base to improve. And this is not a people change; it is an organizational change, where you need to enable people to work in their best mode. It needs to be an end-to-end internal change – not changing the people, but changing the organizational parameters: KPIs, divisions, departments, priorities. Have a look at this short movie; you can replace the word ERP by PLM, and you will understand why I like this movie (and the relaxing sound).

The Middle Management dilemma

And here comes my last observation. At the C-level, we find inspiring, often outcome-based visions, talking about a more agile company, closer to the customer, empowered workers, etc. Then there is the ongoing business that cannot be disrupted and needs to perform – so the business units and departments all get their performance KPIs, merely keeping the status quo in place.

Meanwhile, new digital initiatives need to be introduced. They don't fit in the existing business and are often started in isolation – like the GE Digital division; you can read Jeff Immelt's thoughts and strategy on how this could work in the article mentioned above (The Only Way Manufacturers Can Survive). However, as the majority of the business ran in the old mode, the digital business became another silo in the organization, as the middle management could not be motivated to embed digital in their business (no KPIs, or new KPIs of very low significance).

I talked about the hybrid/bimodal approach several times in my blog posts, most recently in The Challenges of a Connected Ecosystem.  One of the points that I did not address was the fact that probably nobody wants to work in the old mode anymore once the new approach is successful and scaled up.

When the new mode of business is still small, people will not care so much and continue business as usual. Once the new mode becomes the most successful part of the company, people do want to join this success if they can. And here the change effort is needed. An interesting article in this context is The End of Two-Speed IT from the Boston Consulting Group (2016). They already point at the critical role of middle management. Middle management can kill digital transformation or become part of it, by getting motivated and adapting too.

Conclusion

Perhaps too much text in this post, and even more content when you dive deeper into the material provided. Crucial, though, if you want to understand the digital transformation process in an existing company and the critical position of middle management. They are likely the killers of digital transformation if not given the right coaching and incentives. Just an observation – not a thought 😉

Image: waitbutwhy.com

Two weeks ago I wrote about the simplification discussion around PLM – Why PLM never will be simple. There I focused on the fact that even sharing information in a consistent, future-proof way of working is already challenging, despite easy-to-use communication tools like email or social communities.

I mentioned that sharing PLM data is even more challenging due to its potential revision, version, status, and context. This brings us to the topic of configuration management, needed to manage the consistency of information – a challenge that grows with increasingly sophisticated products and systems. Simple tools will never fix this complexity.
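
To make this tangible, here is a minimal sketch – my own illustration, not any specific PLM system's data model – of the baggage a single piece of PLM information carries when it is shared, compared to simply attaching a file to an email:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SharedItem:
        """One shareable piece of product information, as a PLM system sees it."""
        number: str     # identifier, e.g. a part or document number
        revision: str   # the released iteration the consumer may rely on
        version: int    # work-in-progress iteration within that revision
        status: str     # e.g. "In Work", "Under Review", "Released", "Obsolete"
        context: str    # the product/BOM context in which this data is valid

    def safe_to_consume(item: SharedItem) -> bool:
        # An email attachment carries none of these checks - the receiver
        # cannot tell whether the data is released or already superseded.
        return item.status == "Released"

    design = SharedItem("DOC-001234", "B", 3, "Under Review", "ProjectX / Top-Assembly")
    print(safe_to_consume(design))  # False - sharing this would spread unreleased data

Email happily transports the file; it is the surrounding revision, status and context information that makes sharing PLM data hard.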

To manage the consistency of a product,  configuration management (CM) is required. Two weeks ago I read the following interesting post from CMstat: A Brief History of Configuration Management Software.

An excellent introduction if you want to know more about the roots of CM, be it that towards the end the post starts to list all the disadvantages and reasons why you should not attempt CM using PLM systems.

The following part amused me:

 The Reality of Enterprise PLM

It is no secret that PLM solutions were often sold based in good part on their promise to provide full-lifecycle change control and systems-level configuration management across all functions of the enterprise for the OEM as well as their supply and service chain partners. The appeal of this sales stick was financial; the cost and liability to the corporation from product failures or disasters due to a lack of effective change control was already a chief concern of the executive suite. The sales carrot was the imaginary ROI projected once full-lifecycle, system-level configuration control was in effect for the OEM and supply chain.

Less widely known is that for many PLM deployments, millions of budget dollars and months of calendar time were exhausted before reaching the point in the deployment road map where CM could be implemented. It was not uncommon that before the CM stage gate was reached in the schedule, customer requirements, budget allocations, management priorities, or executive sponsors would change. Or if not these disruptions within the customer’s organization, then the PLM solution provider, their software products or system integrators had been changed, acquired, merged, replaced, or obsoleted. Worse yet for users who just had a job to do was when solutions were “reimagined” halfway through a deployment with the promise (or threat) of “transforming” their workflow processes.

Many project managers were silently thankful for all this as it avoided anyone being blamed for enterprise PLM deployment failures that were over budget, over schedule, overweight, and woefully underwhelming. Regrettably, users once again had to settle for basic change control instead of comprehensive configuration management.

I believe the CMstat writer is generalizing too much and preaching to their own parish. Although my focus is on PLM, I have also learned the importance of CM, and for that reason I will share a view on CM from the PLM side:

Configuration Management is not a target for every company

The origins of Configuration Management lie in the Aerospace and Defense (A&D) industries. These industries have high quality, reliability and traceability constraints. In simple words, you need to prove your product works correctly, as specified, in all described circumstances – and keep this consistent along the lifecycle of the product.

Moreover, imagine you delivered the perfect product; implementing changes then requires a full understanding of the impact of each change. What is the impact of the change on behavior or performance? In A&D, the question is: is it still safe and reliable?
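
In PLM terms, answering that question usually starts with a where-used analysis. A minimal sketch – my own illustration, with made-up part numbers – of why one changed part fans out into verification work up the product structure:

    from collections import deque

    # Hypothetical where-used relations: part -> assemblies that use it.
    WHERE_USED = {
        "BOLT-M6":   ["BRACKET-A", "HINGE-2"],
        "BRACKET-A": ["DOOR-ASSY"],
        "HINGE-2":   ["DOOR-ASSY"],
        "DOOR-ASSY": ["CABIN"],
        "CABIN":     [],
    }

    def impact_of_change(part: str) -> list[str]:
        """Everything that must be re-verified when `part` changes."""
        impacted, seen, queue = [], {part}, deque([part])
        while queue:
            for parent in WHERE_USED.get(queue.popleft(), []):
                if parent not in seen:
                    seen.add(parent)
                    impacted.append(parent)
                    queue.append(parent)
        return impacted

    print(impact_of_change("BOLT-M6"))
    # ['BRACKET-A', 'HINGE-2', 'DOOR-ASSY', 'CABIN'] - each level needs re-validation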

Somehow, PLM and CM are enemies. The main reason why PLM-systems are bought is Time to Market – bringing a product as fast as possible to the market with acceptable quality. Being first is sometimes more important than high quality. CM is considered a process that slows down Time to Market, as managing consistency and continuously validating take time and effort.

Configuration Management in aviation is crucial, as everyone understands that you cannot afford to discover a severe problem during a flight. All the required verification and validation efforts make CM a costly process along the product lifecycle. Airplane parts are 2 – 3 times more expensive than potentially the same parts used in other industries. The main reason: airplane parts are tested and validated for all expected conditions along their lifecycle. Other industries do not spend so much time on validation; they validate only where issues can hurt the company, either in liability or in costs.

Time to Market impacts even the aviation industry, as we can see from the commercial aircraft battle(s) between Boeing and Airbus. Who delivers the best airplane (size/performance) at the right moment in the global economy? The Airbus A380 seems to have missed its future market – too big, not flexible enough. The Boeing 737 MAX appeared to target a market sweet spot (fuel economy); however, the recent tragic accidents with this plane seem to be related to Time to Market pressure to certify the aircraft too early. Or is the complexity of a modern airplane becoming unmanageable?

CM based on PLM-systems

Most companies had their configuration management practices long before they started to implement PLM. These practices were most of the time documented in procedures, leading to all kinds of coding systems for the related documents. Drawing numbers (the specification of a part/product), specifications, parts lists – all had a meaningful identifier combined with a version/revision and a status. For example, the Philips 12NC coding system is famous in the Netherlands and is still used among spin-offs of Philips and their suppliers, as it offers a consistent framework to manage configurations.

Storing these documents in a PDM/PLM-system to provide centralized access was not a big problem; however, companies also expected the PLM-system to provide automation and functionality supporting their configuration management procedures.

This was a challenge for many implementers, for several reasons:

  • PLM-systems do not offer a standard way of working – if they did, they could only serve a small niche market – so they need to be “configured/customized.”
  • Company configuration management rules sometimes cannot be mapped to the provided PLM data-model and its internal business logic. This has led to costly customizations where, in the best case, implementer and company agreed somewhere in the middle; worst case, as the writer of the CM blog mentions, it became an expensive, painful project. A simplified sketch of what such rule configuration boils down to follows below this list.
  • Companies do not have a consistent configuration management framework, as Time to Market is leading – “we will fix CM later” is the idea – and they let their PLM-implementer configure the PLM-system as well as possible. Still, at the management level, the value of CM is not recognized.
    (see also: PLM-CM-ALM – not sexy?)
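
To illustrate the second bullet: in essence, configuring a PLM-system means expressing the company's CM rules – for example, the allowed lifecycle transitions – as data the system can enforce. A simplified, hypothetical sketch (status names and rules are invented for illustration):

    # Hypothetical example - every company has its own release procedure,
    # so the allowed status transitions become configuration, not code.
    COMPANY_CM_RULES: dict[str, set[str]] = {
        "In Work":      {"Under Review"},
        "Under Review": {"In Work", "Released"},  # rejected or approved
        "Released":     {"Obsolete"},             # changes require a new revision
        "Obsolete":     set(),
    }

    def change_status(current: str, requested: str) -> str:
        if requested not in COMPANY_CM_RULES.get(current, set()):
            raise ValueError(f"CM rule violation: {current} -> {requested}")
        return requested

    status = change_status("Under Review", "Released")  # allowed
    # change_status("In Work", "Released")  # raises - it would skip the review step

The pain starts where a company's real procedure – parallel approvals, rules per document class, legacy numbering – does not fit the shape the PLM-system expects; that is where configuration ends and costly customization begins.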

In the companies that I worked with, those interested in a standardized configuration management approach were trained in CMII. CMII (or CM2) is a framework supported by most PLM-systems, sometimes even as a pre-configured template to speed up the implementation. Still, as PLM-systems serve multiple industries, I would not expect any generic PLM-vendor to offer Commercial Off-The-Shelf (COTS) CM-capabilities – there are too many legacy approaches. You can find a good and more in-depth article related to CMII here: Towards Integrated Configuration Change Management (CMII) by Lionel Grealou.


What’s next?

Current configuration management practices are very much based on the concept of managing documents. However, products are more and more described in a data-driven, model-based approach. You can find all the reasons why we are moving to a model-based approach in my blog post from last year. It is important to realize that current CM practices in PLM were designed with mechanical products and their lifecycles as a base. With the combination of hardware and software, integrated yet with different lifecycles, CM has to be reconsidered in a new, holistic concept. The Institute of Process Excellence provides CM2 training but is also active in developing concepts for the digital enterprise.

Martijn Dullaart, Lead Architect Configuration Management @ ASML & Chair @ IPE/CM2 Global Congress, has published several posts related to CM and a model-based approach – you can find them via his LinkedIn profile. As you can read from his articles, organizations are still searching for a new, consistent approach.

Perhaps CM as a service to a Product Innovation Platform, as the CMstat blog post suggests? (quote from the post below)

In Part 2 of this CMsights series on the future of CM software we will examine the emerging strategy of “Platform PLM” where functional services like CM are delivered via an open, federated architecture comprised of rapidly-deployable industry-configured applications.

I am looking forward to Part 2 of CMsights. The approach makes sense to me, as system boundaries will disappear in a digital enterprise. It will be ever more critical in the future to create consistent data flows, in the right context and based on data of the right quality.

Conclusion

Simple tools and complexity need to be addressed in the right order. Aligning people and processes efficiently to support a profitable enterprise remains the primary challenge for every business. Complex products, more dependent on software than hardware, require new ways of working to stay competitive. Digitization can help to implement these new ways of working. Experienced PLM/CM experts know the document-driven past. Now it is time for a new generation of PLM and CM experts to start from a digital concept and build consistent and workable frameworks. Then the simple tools can follow.


Every so often the message pops up that there is a problem with PLM because it is not simple. Most of these messages come from new software vendors challenging the incumbents; others come from frustrated people who tried to implement PLM directly and failed. The first group believes that technology will kill the complexity; the second group usually blames the vendors or implementers for the complexity – never themselves. Let's zoom in on both types:

The Vendor pitch

Two weeks ago Oleg Shilovitsky published: Why complexity is killing PLM and what are future trajectories and opportunities?. His post contained some quotes from my earlier posts (thanks for that, Oleg 🙂). Read the article, and you will understand Oleg believes in technology. He ends with the following conclusion/thoughts:

What is my conclusion? It is a time for new PLM leadership that can be built of transparency, trust and new technologies helping to build new intelligent and affordable systems for future manufacturing business. The old mantra about complex PLM should go away. Otherwise, PLM will shrink, retire and die. Just my thoughts…

It is a heartbreaking statement. I would claim every business uses these words to the outside world. Transparency as far as possible, as you do not want to throw your strategy on the table unless you are a philanthropist (or too wealthy to care).

Without trust, no long-term relationship can exist, and yes new technology can make some difference, but is current technology making PLM complex?

Vendors like Aras, Arena, FusePLM, Propel PLM all claim the new PLM space with modern technology – without strong support for 3D CAD/Model-Based approaches, as this creates complications. Other companies like OpenBOM, OnShape, and more are providing a piece of the contemporary PLM-puzzle.

Companies using their capabilities have to solve their PLM strategy/architecture themselves. Having worked for SmarTeam, the market leader in easy client-server PLM, I learned that an excellent first impression helps to sell PLM, but only to departments; it does not scale to the larger enterprise. Why?

PLM is about sharing (and connecting)

Let’s start with the most simplistic view of PLM.  PLM is about sharing information along all the lifecycle phases. Current practices are based on a coordinated approach using documents/files. The future is about sharing information through connected data. My recent post: The Challenges of a connected ecosystem for PLM zooms in on these details.

Can sharing be solved by technology? Let's look at the most common way of information sharing we currently use: email. Initially, everyone was excited: fast and easy! Long live technology!

Email and communities

Now companies start to realize that email did not solve the problem of sharing. Messages with half the company in CC, long unstructured stories, and hidden local archives with crucial information have all led to unproductive situations. Every person shares based on guidelines, personal (best) practices or instinct. And this is only office communication.

Product lifecycle management data and practices are many times more complicated – in particular if we talk about a modern connected product, based on hardware and software, managed through the whole lifecycle. Here customers expect quality.

I will change my opinion about PLM simplicity as soon as a reasonable, scalable solution for the email problem exists that solves the chaos.

Some companies thought that moving email to (social) communities would be the logical next step – see Why Atos Origin Is Striving To Be A Zero-Email Company. This was in 2011, and digital communities have indeed reduced the number of emails.

Communities on LinkedIn flourished in the beginning; however, now they too are filled with a large amount of ambiguous content and irrelevant puzzles – these platforms became polluted as well. The main reason: the concept of communities is again implemented as technology – easy to publish anything (read my blog 🙂 ) – but not necessarily combined with a change in attitude.

Learning to share – business transformation

Traditional PLM and modern data-driven PLM share the challenge of offering an infrastructure that will be accepted by the end-users, supporting sharing and collaboration, while guaranteeing in the end that products have the right quality and the company remains profitable.

Versions, revisions, configuration management and change management are a must to control cost and quality – all practices that the end-user, who “just wants to do his/her job,” hates.
(2010 post: PLM, CM and ALM – not sexy 😦 )

And this is precisely the challenge of PLM. The job to be done is not an isolated activity. If you want your data to be reused, or to be discoverable five or ten years from now, there is extra work and thinking to do. Engineers are often under pressure to deliver their designs with enough quality for the next step. Investing time in enriching the information for downstream or future use is considered a waste of time, as the engineering department is not rewarded for that. Actually, the feeling is that productivity drops due to the “extra” work.

The critical mindset needed for PLM is to redefine the job of individuals. Instead of optimizing the work for individuals, a company needs to focus on optimized value streams. What is the best moment to spend time on data quality and enrichment: when the data is created, or further downstream when it is required?

Changing this behavior is called business transformation, as it requires a redesign of processes and responsibilities. PLM implementations always have a transformational aspect if done right.

The tool will perform the business transformation

At the PLM Innovation Munich 2012 conference, Autodesk announced their cloud-based PLM 360 solution. One of their lead customers explained to the audience that they were up and running within two weeks. The audience was in shock – see the image to the left. You can find the full presentation here on SlideShare: The PLM Identity Crisis.

Easy configuration, even sitting at the airport, is a typical PLM Vendor-marketing sentence.

Too many PLM implementations have created frustration as the management believed the PLM-tools would transform the business. However, without a proper top-down redesign of the business, this is asking for failure.

The good news is that many past PLM implementations haven't entirely failed, because they were implemented close to the existing processes – not really creating the value PLM could offer. They maintained the silos in a coordinated way. Similar to email: the PLM-system may give a technology boost, but five to ten years later comes the conclusion that the data quality is fundamentally too poor for downstream usage, as that was never part of the easy scope.

Who does Change Management?

It is clear PLM-vendors make a living from selling software. They will not talk about the required Change Management, as it complicates the deal. Change Management is a complex topic, as it requires the combination of a vision and a restructuring of (a part of) the organization. It is impossible to outsource change management – as a company, you need to own your strategy and your change. You can hire strategy consultants and coaches, but it is a costly exercise if you do not own your transformation.

Therefore, it remains a “soft” topic, depending on your company's management and culture. The longer your company exists, the more challenging change management will be, as we can see in big American/European enterprises, where the individual opinion is strongest, compared to upcoming Asian companies (with less legacy).

Change Management in the context of digital transformation becomes even more critical as, for sure, existing processes and ways of working no longer apply for a digital and connected enterprise.

There is so much learning and rethinking to do for businesses before we can reap all the benefits PLM-vendors are showing. Go to the upcoming Hannover Messe in Germany and you will be impressed by what is (technically) possible – Artificial Intelligence, Virtual Twins and VR/AR. Next, ask around and look for companies that have been able to implement these capabilities and have transformed their business. I will be happy to visit them all.

Conclusions

PLM will never be simple, as it requires individuals to work in a sharing mode, which does not come naturally. Digital transformation, where sharing information becomes connecting information, requires even more orchestration and less individualism. Culture, and the ability to create a vision and strategy related to sharing, will be crucial – the technology is there.


“Technology for its own sake is a common trap. Don’t build your roadmap as a series of technology projects. Technology is only part of the story in digital transformation and often the least challenging one.”

(from “Leading Digital: Turning Technology into Business Transformation” by George Westerman, Didier Bonnet and Andrew McAfee)


In this post, I will explain the story behind my presentation at PI PLMx London. You can read my review of the event here: “The weekend after ……” and you can find my slides on SlideShare: HERE.

For me, this presentation is the conclusion of a thought process and a collection of experiences built up over the past three to five years, related to the challenges digital transformation creates for PLM and to what makes it harder to go through compared to other enterprise business domains. So here we go:

Digital transformation or disruption?

Slides 2 (top image) until 5 deal with the common challenges of business transformation. In nature, the transformation from a caterpillar (old linear business) to a butterfly (modern, agile, flexible) has a cocoon stage, where the transformation happens. In business, unfortunately, companies cannot afford a cocoon phase; the change needs to happen in parallel with the running business.

Human beings are not good at change (slides 3 & 4), and the risk is that a new technology or a new business model will disrupt your business if you are too confident – see the examples from the past. The disruption theory introduced by Clayton Christensen in his book The Innovator's Dilemma describes excellently how this can happen. Some of my thoughts are in The Innovator's dilemma and generation change (2015).

Although I know some PLM vendors consider themselves disruptors, I give them no chance in the PLM domain. The main reason: the existing PLM systems are so closely tied to the data they manage that switching from one PLM system to a more modern one does not pay off. The data models are so diverse that it is better to stay with the existing environment.

What is clear for modern digital businesses is that if you can start from scratch, or with almost no legacy, you can move forward faster than the rest – but only if supported by strong leadership, an (understandable) vision and relentless execution.

The impression of evolution

Marc Halpern’s slide presented at PDT 2015 is one of my favorite slides, as it maps business maturity to various characteristics of an organization, including the technologies used.


Slides 7 till 18 zoom in on the terms Coordinated and Connected and the implications they have for data, people and business. I have written about Coordinated and Connected recently: Coordinated or Connected (2018).

A coordinated approach – delivering the right information at the right moment in the proper context – is what current PLM implementations try to achieve: allowing people to use their own tools/systems, as long as they deliver their information (documents/files) at the right moment as part of the lifecycle/delivery process. Very linear and not too complicated to implement, you would expect. However, it is difficult! Here we already see the challenge of aligning a company just to implement a horizontal flow of data. Usability of the PLM backbone and optimized silo thinking are the main inhibitors.

A connected approach – providing actual information for anyone connected, in any context – is the mental picture we need to have for a digital enterprise, as the slide on the left shows. Information coming from various platforms needs to be shareable and connected in real-time, leading, in particular for PLM, to a switch from document-based deliverables to connected models and parameters.

Slide 15 has examples of some of these models. A data-driven approach creates different responsibilities, as it is no longer about ownership but about accountability.
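
As a thought experiment – my own simplification, not a feature of any platform – the difference between a document deliverable and a connected parameter can be sketched as follows: in the coordinated world every document holds its own copy of a value, in the connected world every consumer references the single source:

    # Coordinated (document-based): the value is copied into each deliverable.
    spec_doc    = {"max_weight_kg": 12.5}
    test_report = {"max_weight_kg": 12.5}  # a silent copy - the two can diverge

    # Connected (data-driven): one parameter, referenced by all consumers.
    parameters = {"max_weight_kg": 12.5}   # single source, e.g. a platform service

    def max_weight() -> float:
        return parameters["max_weight_kg"]  # every consumer reads the same value

    parameters["max_weight_kg"] = 11.8     # one change, immediately valid everywhere
    assert max_weight() == 11.8            # no stale copies to hunt down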

The image above gives my PLM-twisted vision of the five core platforms for an enterprise. The number FIVE is interesting, as David Sherburne just published his Five Platforms that Enable Digital Transformation, and in 2016 Gartner identified Five domains for the digital platform – more IT-twisted? But remember, the purpose of digital transformation is: FIVE!

From Coordinated to Connected is Digital Transformation

Slides 19 till 27 further elaborate on the fact that for PLM no evolutionary approach is possible when going from Coordinated technology towards Connected technology.

For three reasons: a different type of data (documents vs. database elements), different people (working in a connected environment requires modern digital skills) and different processes (the standard methods for mechanical-oriented PLM practices do not match the processes needed to deliver systems (hardware & software) through an incremental delivery process).

Due to the incompatibility of the data, more and more companies discover that a single PLM-instance cannot support both modes – staying with your existing document-oriented PLM-system does not give you the capabilities needed for a model-driven approach. Migrating the data from a traditional PLM-environment towards a modern data-driven environment does not bring any value either: the majority of the coordinated data is neither complete nor of the right quality to be used in a data-driven environment. Note: in a data-driven environment you do not have people interpreting the data – the data should be correct for automation/algorithms.

The overlay approach, mentioned several times in various PLM-blogs, is an intermediate solution. It provides traceability and visibility between different data sources (PLM, ALM, ERP, SCM, …). However, it does not make the information in these systems any more accessible.
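
A minimal sketch of what such an overlay typically manages – a hypothetical structure of my own, not a specific product – is a set of link records relating identifiers across systems: traceability on top, while the data itself stays where (and as good or bad as) it is:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TraceLink:
        """A cross-system link: pure traceability, no access to the payload."""
        source: tuple[str, str]  # (system, record id), e.g. ("PLM", "PART-123")
        target: tuple[str, str]
        relation: str            # e.g. "implements", "manufactured-as"

    links = [
        TraceLink(("ALM", "REQ-42"), ("PLM", "PART-123"), "implements"),
        TraceLink(("PLM", "PART-123"), ("ERP", "MAT-9001"), "manufactured-as"),
    ]

    def trace(system: str, record_id: str) -> list[TraceLink]:
        # Answers "where is this record referenced?" - the actual content
        # still lives, good or bad, in the source systems.
        key = (system, record_id)
        return [lnk for lnk in links if key in (lnk.source, lnk.target)]

    print(trace("PLM", "PART-123"))  # both links - visibility, not better data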

So the ultimate conclusion is: you need both approaches, and you need to learn to work in a hybrid environment!

What can various stakeholders do?

For the management of your company, it is crucial that they understand the full impact of digital transformation. It is not about a sexy customer website, a service platform, or a Virtual/Augmented Reality showcase for the shop floor or services. When these capabilities are created disconnected from the source (PLM), they will deliver inconsistencies in the long term. The new digital baby becomes another silo in the organization. Real digital transformation comes from an end-to-end vision and implementation. The result of this end-to-end vision will be the understanding that there is a duality in data, in particular for the PLM domain.

Besides the technicalities, when going through a digital transformation, it is crucial for the management to share their vision in a way that it becomes a motivational story, a myth, for all employees. As Yuval Harari, writer of the book Sapiens, suggested, we (Homo Sapiens) need an abstract story, a myth, to align a larger group of people toward a common abstract goal. I discussed this topic in my posts: PLM as a myth? (2017) and PLM – measurable or a myth?

Finally, the beauty of new digital businesses is that they are connected and can be monitored in real-time. That implies you can check the results continuously and adjust – scale or fail!

Consultants and strategists in a company should also take responsibility to educate the management, and when advising on less transformational steps, like efficiency improvements: make sure you learn and understand model-based approaches and push for data-governance initiatives. This will at least narrow the gap between coordinated and connected environments.

This was about strategy – now about execution:

For PLM vendors and implementers, understanding the incompatibility of data between the current PLM modes – coordinated and connected – will lead to different business models. Where traditionally a new PLM vendor started with a rip-and-replace of the earlier environment – no added value – now it is about starting a new, parallel environment. This implies no more big replacement deals, but a long-term, strategic and parallel journey. For PLM vendors it is crucial to be able to offer both modes in parallel; this will allow them to keep their customer base and grow. If they choose coordinated or connected only, a competitor will for sure work on the other mode in parallel.

For PLM users: an organization should understand that its people are its most valuable resources, and realize these people cannot make a drastic change in their behavior. People will adapt within their capabilities, but do not expect a person who grew up in the traditional way of working (linear/analogue) to become a successful worker in the new mode (agile/digital). Their value lies in transferring their skills and coaching new employees – but do not let them work in two modes. And when it comes to education: permanent education is crucial and should be scheduled; it is not about one or two training sessions per year. If the perfect training existed, why would students go to school for several years? Why not give them the perfect PowerPoint twice a year?

Conclusions

I believe that after three years of blogging about this theme I have made my point. Let's observe and learn from what is happening in the field – I remain curious about, and focused on, proof points and new insights. This year I hope to share with you new ideas related to digital practices in all industries, of course all associated with the human side of what we once started to call PLM.

Note: Oleg Shilovitsky just published an interesting post this weekend: Why complexity is killing PLM and what are future trajectories and opportunities? Enough food for discussion. One point: the fact that consumers want simplicity does not mean PLM will become simple – working in the context of other information is the challenge. It is human behavior: team players are good at anticipating; big egos are not. To be continued…….
