It’s the beginning of the year. Companies are starting new initiatives, and one of them is potentially the next PLM-project. There is a common understanding that implementing PLM requires a business case with ROI and measurable results. Let me explain why this understanding is a myth – and why, instead, you need a myth.
I was triggered by a re-post from Lionel Grealou, titled Defining the PLM Business Case. Knowing Lionel is quite active in PLM and digital transformation, I was a little surprised by the content of the post. Then I noticed the post was from January 2015, already five years old. Clearly, the world has changed (perhaps the leadership has not).
So I took this post as a starting point to make my case.
In 2015, we were in the early days of digital transformation. Many PLM-projects were treated as traditional linear projects: there is the AS-IS situation, there is the TO-BE situation; next, we know the (linear) path to the solution, and we can describe the project and its expected benefits.
This only works if you understand and measure the AS-IS situation exactly and know the TO-BE situation almost entirely (misperception #1).
However, implementing PLM is not about installing a new transactional system. PLM implementations deal with changing ways-of-working and therefore implementing PLM takes time as it is not just a switch of systems. Lionel was addressing this point:
“The inherent risks associated with any long term business benefit driven projects include the capability of the organization to maintain a valid business case with a benefit realization forecast that remains above the initial baseline. The more rework is required or if the program delivery slips, the more the business case gets eroded and the longer the payback period.”
Interesting here is the phrase “..the business case gets eroded” – this is most of the time the case. Lionel proposes to track business benefits. He also mentions that the justification of the PLM-project could be done by considering PLM as a business transformation tool (misperception #2) or as a way to mitigate risks due to unsupported IT-solutions (misperception #3).
Let’s dive into these misperceptions.
#1 Compare the TO-BE and the AS-IS situation
Two points here.
- Does your company measure the AS-IS situation? Do you know how your company performs when it comes to PLM-related processes? The percentage of time engineers spend searching for data has been investigated – however, PLM goes beyond engineering. What about product management, marketing, manufacturing, and service? Typical performance indicators mentioned are:
- Do you know the exact TO-BE situation? In particular, when you implement PLM, it is likely to be in the scope of a digital transformation. If you implement PLM to automate and consolidate existing processes, you might be able to calculate the expected benefits. However, you do not want to freeze your organization’s processes. You need to implement a reliable product data infrastructure that allows you to enhance, change, or add new processes when required. In particular for PLM, digital transformation does not have a clear target picture and scope yet. We are all learning.
#2 PLM is a business transformation tool
Imagine you install the best product innovation platform relevant for your business and selected by your favorite consultancy firm. It might be a serious investment; however, we are talking about the future of the company, and the future is in digital platforms. So nothing can go wrong now.
Does this read like a joke? Yes, it does; however, this is how many companies have justified their PLM investment. First, they select the best tool (according to their criteria, according to their perception), and then the business transformation can start. If, later, the implementation is not so successful, the vendor and/or implementer will be blamed. Read: The PLM blame game
When you go to PLM conferences, you will often hear the same mantras: have a vision, have C-level sponsoring/involvement, no Big Bang, it is a business project, not an IT-project, and more. And vendor-sponsored sessions always talk about amazingly fast implementations (or did they mean installing the POC?).
However, most of the time, the C-level approves the budget without understanding the full implications (expecting the tool will do the work); the business is too busy or does not get enough allocated time to support the implementation (expecting the tool will do the work). So often the PLM-project becomes an IT-project executed mainly by the cheapest implementation partner (expecting the tool will do the work). Again, this is not a joke!
A business transformation can only be successful if you agree on a vision and a learning path. The learning path will expose the fact that future value streams require horizontal thinking and reallocation of responsibilities – breaking the silos, creating streams.
Small teams can demonstrate these benefits without disrupting the current organization. However, over time the new ways of working should become the standard, therefore requiring different types of skills (people), different ways of working (different KPIs and P&L for departments), and ultimately different tools.
As mentioned before, many PLM-projects start from the tools – a guarantee for discomfort and/or failure.
#3 Mitigate risks due to unsupported IT-solutions
Often PLM projects are started because the legacy environment becomes outdated. Either because the hardware infrastructure is no longer supported/affordable or the software code dependencies on the latest operating systems are no longer guaranteed.
A typical approach to solve this is a big-bang project – the new PLM system needs to contain all the old data, and meanwhile, to justify the project, the new PLM system needs to bring additional business value. The latter is usually not difficult to identify, as traditional PLM implementations were in reality mostly cPDM environments with a focus on engineering only.
However, the legacy migration can have such a significant impact on the new PLM-system that it destroys the potential for the future. I wrote about this issue in The PLM Migration Dilemma
How to approach PLM ROI?
A PLM-project will never get a budget or approval from the board when there is no financial business case. Building the right financial business case for PLM is a skill that is often overlooked. During the upcoming PI PLMx London conference (3 – 4 February), I will moderate a Focus Group where we will discuss how to get PLM on the Exec’s agenda.
Two of my main experiences:
- Connect your PLM-project to the business strategy. As mentioned before, isolated PLM fails most of the time because business transformation, organizational change, and the targeted outcome are not included. If PLM is not linked to an actual business strategy, it will be considered a costly IT-project with all its bad connotations. Have a look at my older post: PLM, ROI and disappearing jobs
- Create a Myth. Perhaps the word Myth is exaggerated – it is about an understandable vision. Myth connects nicely to the observation from behavioral experts that our brain does not decide based on numbers but on emotion. Big decisions and big themes, in the world or in a company, need a myth: “Make our company great again” could be the tagline. In such a case, people get aligned without a deep understanding of the impact or the business case; the myth will do the work – no need for a detailed business case. Typical human behavior; see also my post: PLM as a myth.
Conclusion
There should never be a business case uniquely for PLM – it should always be in the context of a business strategy requiring new ways of working and new tools. In business, we believe that having a solid business case is the foundation for success. Sometimes an overwhelming set of details and numbers can give the impression that the business case is solid. Consultancy firms are experts at building a business case based on emotion. They know how to combine numbers with a myth. So look at their approach – don’t be too technical / too financial. Whether the myth will hold in the end depends on the people and the organization, not on the investments in tools and services.
In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting the use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities need to be developed on top of a customizable infrastructure, providing more flexibility. I believe there has been a debate about this topic for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.
PLM started as a toolkit
The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD-data. Bill of Materials handling in PLM was often at a basic level, as either the ERP-system (mostly Aerospace/Defense) or home-grown BOM-systems (Automotive) were in place for manufacturing.
Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations had to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.
The birth of OOTB
Around the year 2000, there was the first development of OOTB PLM. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industry. Instead of document management, they focused on bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.
At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.
This was not a benefit for OOTB-delivery, as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your own unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).
The result is that each implementer considers the other implementers as competitors, and none of them is willing to give up their IP to the software company.
SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same happens currently in the Aras channel – it might be called Open Source; however, it is probably only the high-level infrastructure.
Around 2006, many of the main PLM-vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted to.
Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner. For two reasons: the customer most of the time had different current practices, and secondly, the money came from services. So why say No to a customer if you can say Yes?
OOTB and modules
Initially, the leading PLM Vendors’ mid-market templates were not just aiming at the mid-market. All companies wanted to have a standardized PLM-system with as few customizations as possible. This meant the PLM vendors had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.
The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior. Otherwise, the value of the module would not be relevant. And this creates a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.
OOTB modularity advantages
The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.
The second potential advantage of PLM modularity is the fact that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, there are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extensibility of the provided functionality, which is a serious point to consider.
The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and future needed capabilities, for example, digital twins, AR/VR, and model-based ways of working. Skeptics might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial-and-error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.
OOTB modularity disadvantages
Most of the OOTB modularity disadvantages will be advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on some leading customer experiences (the big ones), whereas the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB approach. SMB implementations are often not visible at the PLM Vendor R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.
Toolkit advantages
The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.
However, if the development of additional functionality becomes massive, you run the risk that low license costs are nullified by development costs.
The second advantage of a PLM toolkit approach is that the implementer and the users will have a better relationship in delivering capabilities and therefore a higher chance of acceptance. The implementer builds what the customer is asking for.
However, as Henry Ford said: if I had asked my customers what they wanted, they would have asked for faster horses.
Toolkit considerations
There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:
Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and the methodology to implement this innovative way of working need to come from either your company’s resources or your implementer’s skills.
Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices in order to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later in time.
Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.
Sometimes I came across situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.
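To give a flavor of what such a choice means in practice, here is a minimal sketch with made-up data (not SmarTeam’s data model or API): resolving BOM lines with a linear scan versus an id-indexed lookup gives the same answer, but only one of them survives enterprise-scale data volumes.

```python
# Made-up part master and BOM lines, large enough to show the point.
parts = [{"id": f"P{i}", "name": f"Part {i}"} for i in range(100_000)]
bom_lines = [("P0", f"P{i}") for i in range(1, 1_000)]  # (parent, child) pairs

def resolve_naive(part_id):
    # Linear scan: O(n) per lookup - fine in a demo, painful at scale.
    return next(p for p in parts if p["id"] == part_id)

# Build an index once; every lookup afterwards is a hash access, O(1).
part_index = {p["id"]: p for p in parts}

def resolve_indexed(part_id):
    return part_index[part_id]

# Same result, very different cost profile when the BOM grows.
assert [resolve_naive(c) for _, c in bom_lines] == \
       [resolve_indexed(c) for _, c in bom_lines]
```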
Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years because design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as it is not their core business to implement PLM. Experienced PLM implementers vary in size and skills – make the right choice.
Conclusion
After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.
After my previous post about the PLM migration dilemma, I had several discussions with peers in the field about why this PLM bad news creates so much debate. For every PLM vendor, I could publish a failure story if I wanted to. However, the reality is that the majority of PLM implementations do not fail.
Yes, they can cause discomfort or friction in an organization, as implementing the tools often forces people to work differently. And working differently is often not anticipated by the (middle) management and therefore causes a mismatch in the people, process & tools paradigm.
So we love bad news in real life. We talk about terrorism while, meanwhile, far more people are dying through guns, cars, and even the biggest killer: mosquitos. Fear stories sell better than success stories, and in particular in the world of PLM Vendors, every failure of the competition is enlarged. However, there are more actors involved in a PLM implementation, and if PLM systems were that bad, they would not exist anymore – replaced by ………?
Who to blame – the vendor?
Of course, it is the easiest way to blame the vendor, as their marketing promises to solve all problems. However, when you look from a distance at the traditional PLM vendor community, you see they are in a rat-race to deliver the latest and greatest technology ahead of their competition, often driven by some significant customers.
Their customers are buying the vision and expect it to be ready and industrialized, which is not the case – look at the digital twin hype or AI (Artificial Intelligence). Released PLM software is not at the same maturity level as office applications. Office applications do not innovate so much, have thousands of users during a beta-cycle, and have no dependency on processes.
Most PLM vendors are happy when a few customers jump on their latest release, combined with the fact that implementing the most recent version is not yet a push of a button. This might change in the long term if PLM Vendors can deliver cloud-based solutions.
PLM implementations within the same industry might look the same but often vary a lot due to existing practices, which will not change due to the tool – so there is a need for customization or configuration.
PLM systems with strong business rules inside their core might develop more and more towards configuration, where PLM toolkit-like systems might focus on ease of customization. Both approaches have their pros and cons (in another blog post perhaps).
Another topic for blaming the vendor is the lack of openness. You hear it in many discussions: if vendor X were open, they would not lock the data – a typical marketing slogan. If PLM vendors were completely open, to which standards should they adhere? Every PLM vendor has its preferred collection of tools; if you stay within their portfolio, you have a minimum of compatibility or interface issues.
This logic started already with SAP in the previous century. For PLM vendors, there is no business model for openness. For example, the SmarTeam APIs for connecting and extracting data are available free of charge, leading to no revenue for the vendor and significant revenue for service providers. Without any license costs, they can build any type of interface/solution. In the end, when the PLM vendor has no sustainable revenue, the vendor will disappear as we have seen between 2000 and 2010, where several stand-alone PLM systems disappeared.
So yes, we can blame PLM vendors for creating impossible expectations – coming to realistic expectations related to capabilities and openness is probably the biggest challenge.
Who to blame – the implementer?
The second partner in a PLM implementation is the implementation partner, often a specialized company related to the PLM vendor. There are two types of implementation partners – the strategic partners and the system integrators.
Let’s see where we can blame them.
Strategic partners, the consultancy firms, often have a good relationship with the management; they help the company shape the future strategy, including PLM. You can blame this type of company for their lack of connection to the actual business. What is the impact on the organization of implementing a specific strategy, and what does this mean for current or future PLM?
Strategic partners should be the partner to support business change management, as they are likely to have experience with other companies. Unfortunately, this type of company does not have significant skills in PLM, as the PLM domain is just a small subset of the whole potential business strategy.
You can blame them for being useful in building a vision/strategy while failing to create a consistent connection to the field.
Implementation partners, the system integrators, are most of the time specialized in one or two PLM vendors’ software suites, although the smaller the implementation partner, the less broad their implementation skills. These implementation partners sometimes have built their own PLM best practices for a specific vendor and use this as a sales argument. Others just blindly follow what the vendor is promoting or what the customer is asking for.
They will do anything you request, as long as they get paid for it. The larger ones have loads of resources for offshore deliveries – the challenge you see here is that it might look cheap; however, it becomes expensive if there is no apparent convergence of the deliverables.
As I mentioned before, they will never say No to a customer and claim to fill all the “gaps” there are in the PLM environment.
You can blame implementation partners for focusing on making money from services. And they are right; to remain in business, your company needs to be profitable. It is like lawyers: they will invoice you based on their efforts. And the less you take on your plate, the more they will do for you.
The challenge for both consultancy partners and system integrators is to find a balance between experienced people, who really make it happen, and educating juniors to become experts too. Often the customer pays for the education of these juniors.
Who to blame – your company?
If your company is implementing PLM, then probably the perception is that you made all the effort to make it successful. You followed the advice of the strategic consultants, you selected the best PLM Vendor and system integrator, you created a budget – so what could go wrong?
This all depends on your company’s ambition and scope for PLM.
Implementing the as-is processes
If your PLM implementation is just there to automate existing practices and store data in a central location, this might work out. And these are, most of the time, the PLM implementations that are successful. You know what to expect, and your system integrator knows what to expect.
This type of project can run close to budget, and some system integrators might be tempted to offer a fixed price. I am not a fan of fixed-price projects, as you never know exactly what needs to be done. The system integrator might raise the target price by 20 – 40 % to cover their risk, or you as a company might select the cheapest bid – another guarantee for failure. A PLM implementation is not a one-time project; it is an ongoing journey. Therefore, your choice needs to be sustainable.
My experience with this type of implementation is that it is easy to blame the companies here too. Often the implementation becomes an IT-project, as business people are too busy running their day-to-day jobs and therefore only incidentally support the PLM project. The result is that at a certain moment, users confronted with the system do not feel connected to it – it was better in the past. In particular, configuration management and change processes can become watertight, leaving no freedom for the users. Then the blaming starts – first the software, then the implementer.
But what if you have an ambitious PLM project as part of a business transformation?
In that case, the PLM platform is just one of the elements to consider. It will be the enabler for new ways of working, enabling customer-centric processes, multi-discipline collaboration, and more – all related to a digital transformation of the enterprise. Therefore, I mention PLM platform instead of PLM system. Future enterprises run on data through connected platforms. The better you can connect your disciplines, the more efficiently and faster your company will operate. This, as opposed to the coordinated approach, which I have addressed several times in the past.
A business transformation is a combination of an end-to-end understanding of what to change – from management vision connected to the execution in the field. And as there is no out-of-the-box template for business transformation, it is crucial that a company experiments, evaluates, and, when successful, scales up new habits.
Therefore, it is hard to define upfront all the effort for the PLM platform and the implementation resources. What is sure is that your company is responsible for that, not an external party. So if it fails, your company is to blame.
Is everyone to blame?
You might have the feeling that everyone is to blame when a PLM implementation fails. I believe that is indeed the case. If you know in advance where all players have their strengths and weaknesses, a PLM implementation should not fail but be balanced with the right resources. Depending on the scope of your PLM implementation – is it a consolidation or a transformation? – you should make sure all stakeholders are participating in the anti-blame game.
The anti-blame game is an exercise where you make sure that the other parties in the game cannot blame you:
- If you are a vendor – do not overcommit
- If you are a consultant or system integrator – learn to say NO
- If you are the customer – make sure enough resources are assigned – you own the project. It is your project/transformation.
Several times in the past, this was my job: I was asked to mediate in a stalling PLM implementation. Most of the time, by then, it had become a blame game, missing the target of finding a solution that makes sense. Here, coaching from experienced PLM consultants helps.
Conclusion
Most of the time, PLM implementations are successful if the scope is well understood and not transformative. You will not hear a lot about these projects in the news, as we like bad news.
To avoid bad news, challenging PLM implementations should make sure all parties involved challenge each other to remain realistic and invest enough. The role of an experienced external coach can help here.
After two reposts, I finally have the ability to write at full speed, and my fingers were aching, having read some postings in the past four weeks. It started with Verdi Ogewell’s article on Engineering.com, Telecom Giant Ericsson Halts Its PLM Project with Dassault’s 3DEXPERIENCE, followed by an Aras blog post, Don’t Be a Dinosaur, from Mark Reisig, and of course, I would say, Oleg Shilovitsky’s post: What to learn from Ericsson PLM failure?
Setting the scene
Verdi’s article is quite tendentious, based on outside observations and insinuations. I let you guess who sponsored this article. If I had to write an article about this situation, I would state: Ericsson and Dassault failed to migrate the old legacy landscape into a new environment – an end-to-end migration appeared to be impossible. The other topics mentioned are not relevant to the current situation.
Mark chimes in on Verdi’s truths and points not relevant to the data migration, suggesting PLM is chosen over dinner. Of course, decisions are not that simple. It is not clear from Mark’s statement who the dinosaurs are:
Finally, don’t bet your future on a buzzword. Before making a huge PLM investment, take the time to make sure your PLM vendor has an actual platform. Have them show you their spider chart. And here’s the hard reality: they won’t do it, because they can’t.
Don’t be a dinosaur—be prepared for the unexpected with a truly resilient digital platform.
I would state, “Don’t bet your future on a spider chart,” if you do not know what the real problem is.
Finally, Oleg’s post is more holistic, acknowledging that a full migration might not be the right target, and I like his conclusion:
Flexibility Vs. Out of the box products – which one do you prefer? Over-customize a new PLM to follow old processes? To use a new system as an opportunity to clean existing processes? To move 25,000 people from one database to another is not a simple job. It is time to think about no upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT together with the PLM vendor to run a migration path. The last one is a costly step. But… without this step, the current database oriented single-version of truth PLM paradigm is doomed.
The Migration Problem
I believe migration of data – and sometimes the impossibility of data migration – is the biggest elephant in the room when dealing with PLM projects. In 2015, during the PI PLM conference in Düsseldorf, I addressed this topic for the first time: The Challenge of PLM Upgrades.
You can find the presentation on SlideShare here.
I shared a similar example to the Ericsson case from almost ten years ago. At that time, one of the companies I was working with wanted to replace their mainframe application, which was managing the configuration of certain airplanes. The application managed the aircraft configuration structures in tables and, where needed, pointed to specifications in a document repository. The two systems were not connected; integrity was guaranteed through manual verification procedures.
The application was considered the single version of the truth and had been treated like that for decades. The reason for migration was that all knowledge of the application was disappearing; the tables were documented, but the logic was not. Besides this issue, the maintenance costs for the mainframe were also high – vendor lock-in existed at that time too.
The idea was to implement SmarTeam – flexible data model, rapid deployment based on Windows technology – to kill two birds with one stone: the latest Microsoft technology and, meanwhile, a direct link to the controlled documents. As they were using CATIA V5, the SmarTeam-integration was a huge potential benefit. For the migration of data, the estimate was two months. What could go wrong?
Well, technically, almost nothing went wrong. The challenge was to map the relational tables to the objects in the SmarTeam data model. And as the relational tables contained a mix of document and item attributes, splitting these tables was not always easy. Sometimes the same property appeared with different values in the original tables – which one was the truth? The migration took almost two years, also due to the limited availability of the last knowledgeable resource who could explain the logic.
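To make this mapping challenge concrete, below is a minimal sketch – using hypothetical table and attribute names, not the actual SmarTeam data model or APIs – of how a migration script might split mixed legacy rows into item and document records and flag the conflicting values that required a human decision.

```python
from collections import defaultdict

# Hypothetical legacy rows: each row mixes item attributes and document
# attributes, just like the mainframe configuration tables described above.
LEGACY_ROWS = [
    {"part_no": "A-100", "revision": "B", "weight_kg": 1.2,
     "doc_id": "SPEC-1", "doc_title": "Installation spec"},
    {"part_no": "A-100", "revision": "B", "weight_kg": 1.3,  # conflicting weight!
     "doc_id": "SPEC-2", "doc_title": "Test spec"},
]

ITEM_ATTRS = {"part_no", "revision", "weight_kg"}   # mapped to an Item class
DOC_ATTRS = {"doc_id", "doc_title"}                 # mapped to a Document class

def split_row(row):
    """Split one legacy row into an item record and a document record."""
    item = {k: v for k, v in row.items() if k in ITEM_ATTRS}
    doc = {k: v for k, v in row.items() if k in DOC_ATTRS}
    return item, doc

def find_conflicts(rows):
    """Detect the 'which one was the truth?' problem: the same item key
    carrying different values for the same attribute in different rows."""
    values_seen = defaultdict(set)
    for row in rows:
        key = (row["part_no"], row["revision"])
        for attr in ITEM_ATTRS - {"part_no", "revision"}:
            values_seen[(key, attr)].add(row[attr])
    return {k: v for k, v in values_seen.items() if len(v) > 1}

item, doc = split_row(LEGACY_ROWS[0])
print("Item record:", item)
print("Document record:", doc)
for (key, attr), values in find_conflicts(LEGACY_ROWS).items():
    print(f"Conflict for {key} on '{attr}': needs human review {sorted(values)}")
```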
After the conversion, the question remained: was the migrated data accurate? Perhaps 99%?
But what if it was critical? For this company, it was significant, but not mission-critical as at Ericsson, where a lot of automation and rules are linked together between loads of systems.
So my point: Dassault has failed at Ericsson, and so will Siemens or Aras or any other PLM vendor, as the migration issue is not in the technology – we should stop thinking about this kind of migration.
Who are the dinosaurs?
Mark is in a way suggesting that when you use PLM software from the “old” PLM vendors, you are a dinosaur. Of course, this is a great marketing message, but the truth is that it is not the PLM vendor who is to blame. Yes, some have more friction than others in some instances, but in my opinion, there is no ultimate single PLM vendor.
Have a look at the well-known Daimler case from some years ago, which made the news because Daimler decided to replace CATIA with NX. Not because NX was superior – it was about maintaining the PLM backbone Smaragd, which would be hard to replace. Even in 2010, there was already the notion that the existing data management infrastructure is hard to replace. See a more neutral article about this topic from Monica Schnitger if you want: Update: Daimler chooses NX for Smaragd. Here too, in the end, it became a complete Siemens account for compatibility reasons.
When you look at the significant wins Aras mentions in their customer base – GM, Schaeffler, or Airbus – you will probably discover Aras is more the connection layer between legacy systems, old PLM or PDM systems. They are not the new PLM replacing the old PLM. A connection layer creates a digital thread, connecting various data sources for traceability, but it does not provide digital continuity, as the data in the legacy systems is untouched. Still, it is an intermediate step towards a hybrid environment.
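To illustrate that distinction – and only as a sketch with hypothetical system and identifier names, not how Aras or any other vendor actually implements it – a connection layer essentially stores links between records that stay in their source systems; the linked data itself is never moved or harmonized.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ThreadLink:
    """One link in a digital-thread overlay: only references are stored.

    The attribute values stay untouched in the source systems, which is
    why you get traceability (you can follow the link) but not digital
    continuity (there is no shared, connected data model underneath)."""
    source_system: str
    source_id: str
    target_system: str
    target_id: str
    link_type: str

# Hypothetical example: trace a requirement in a legacy PDM system to the
# part implementing it in ERP and to a test report in a document system.
thread = [
    ThreadLink("LegacyPDM", "REQ-0042", "ERP", "PART-100", "implemented by"),
    ThreadLink("ERP", "PART-100", "DMS", "RPT-7", "verified by"),
]

for link in thread:
    print(f"{link.source_system}:{link.source_id} --{link.link_type}--> "
          f"{link.target_system}:{link.target_id}")
```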
For me, the real dinosaurs are those large enterprises that implemented their proprietary PLM environments in the previous century and built a fully automated infrastructure based on custom data models with a lot of proprietary rules. This was the case at Ericsson, but most traditional automotive and aerospace companies share this problem, as they were the early PLM adopters. And they are not the only ones. Many industrial manufacturing companies suffer from the past, unlike their Asian competitors, who can start with less legacy.
What’s next?
It would be great if the PLM community focused more on the current incompatibility of data between current/past concepts and future digital needs and discussed solution paths (for sure, standards will pop up).
Incompatibility means: do not talk about migration, but rather focus on a hybrid landscape with legacy data, managed in a coordinated manner, and modern, growing digital PLM processes based on a connected approach.
This is the discussion I would like to see, instead of vendors claiming that their technology is the best. None of the vendors will talk about this topic – the old “rip-and-replace” approach is what brings the most software revenue, combined with the simplification that there is only OnePLM. It is interesting to see how many companies have a kind of OnePLM or OneXXX statement.
The challenge, of course, is to implement a hybrid approach. To have the two different PLM-concepts work together, there is a need to create a reliable overlap. The reliable overlap can come from an enterprise data governance approach, if possible based on a normalized PLM data model. So far, all PLM vendors that I know have proprietary data models; only ShareAspace from Eurostep is based on the PLCS standard, but their solutions are most of the time part of a larger PLM-infrastructure (the future!).
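To give an idea of what such a reliable overlap could look like – a minimal sketch with invented attribute names, not the PLCS model or any vendor’s actual schema – the governance layer would translate records from each proprietary data model into one normalized part representation:

```python
# Invented attribute names for two proprietary PLM data models and a
# neutral, normalized target model (PLCS-like in spirit only).
NEUTRAL_KEYS = ("part_id", "revision", "description")

VENDOR_A_MAP = {"ItemNumber": "part_id", "Rev": "revision", "Title": "description"}
VENDOR_B_MAP = {"number": "part_id", "version": "revision", "name": "description"}

def normalize(record, mapping):
    """Translate one proprietary record into the neutral model, failing
    loudly when a mandatory neutral attribute cannot be filled."""
    neutral = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = set(NEUTRAL_KEYS) - neutral.keys()
    if missing:
        raise ValueError(f"Cannot normalize record, missing: {missing}")
    return neutral

# The same part, seen through two different proprietary models:
print(normalize({"ItemNumber": "A-100", "Rev": "B", "Title": "Bracket"}, VENDOR_A_MAP))
print(normalize({"number": "A-100", "version": "02", "name": "Bracket"}, VENDOR_B_MAP))
```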
To conclude: I look forward to discussing this topic with other PLM peers who are really in the field, discovering and understanding the chasm between the past and the future. Contact me directly or join us at the PLM Roadmap and PDT Europe, 13-14 November in Paris. Let’s remain fact-based!
(As a matter of fact, you can still contribute – the call for papers is still open.)
I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the “dark days,” and the celebration of new life.
Of course, my PLM-twisted brain never stops associating and looking for an analogy. Last week, I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:
“Digital Transformation – it used to be called PLM,” said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.
Was this my Easter Egg surprise? I thought we were in the middle of the PLM Renaissance, as some other vendors and consultants call this era. Have a look at a recent Engineering.com TV-report: Turning PLM on its head.
All jokes aside, the speech from Peter Schroer contained some interesting statements, and I want to elaborate on them in this post, as the space to comment on LinkedIn is not designed for a long answer.
PLM is Digital Transformation?
In the past few years, there has been a discussion about whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber’s 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC’s vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.
Also during the PDT conferences, we discussed whether PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.
For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at this moment is the one from caterpillar to butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question is even: will there be a full metamorphosis in the end, or will we keep on working in two different modes of operation?
However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future is that PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditional PLM-mapped capabilities.
You can already see these differences by taking an in-depth look at how Oracle, SAP, or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM Vendors. So Digital Transformation is not the replacement of PLM.
Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation
Myth #1: DX leverages disruptive tech
Peter Schroer:
It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers
Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing data accessible along the whole lifecycle with the right quality.
This means you need to have a management team that understands the need for data governance, data quality, and the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy, and execution of the strategy – marketing does not make it happen.
Myth #2: DX results in increased market share, revenue, and profit
Peter Schroer:
Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.
Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and thereby increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven’t transformed their PLM-domain, but other parts of their businesses.
Also, it is interesting to read a 2017 McKinsey post, The case for digital reinvention, where you will get the confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. It is interesting to see where companies focus their digital strategies – picture below:
Where only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners – connecting ecosystems (in real-time).
Myth #3: Market leaders are the most successful at DX
Peter Schroer:
If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continues to shrink below 20 years.
How to Attain Sustainable Digital Transformation
– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.
– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.
– Align people & processes. Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.
Here I agree with Peter on the business side, be it that some of the current market leaders are already digital. Look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.
Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them. Learning to work in a different mode.
And this is extremely difficult for a market leader – as a market leader, the fear of the changing outside world is often not felt. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.
I wrote about the middle management challenge in my last blog post: The Middle Management Dilemma. Almost in the same week, there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
Conclusion: It is not (only) about technology as some of the tech geeks may think.
Conclusion
Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing. Probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insights into all aspects. I hope this post gave you some more ideas, and please read the connected articles – they are quite relevant if you want to discover what’s below the surface.
Image: 21stcenturypublicservant.wordpress.com/
I have talked a lot in the past years about Digital Transformation and in particular its relation to PLM. This time I want to focus a little more on Digital Transformation and my observations related to big enterprises versus small and medium enterprises. I will take you from the top, the C-level, to the work floor and then try to reconnect through the middle management. As you can imagine from the title of this post, there is a challenge. And I am aware I am generalizing for the sake of simplicity.
Starting from the C-level of a large enterprise
Large and traditional enterprises are having the most significant challenge when aiming at a digital transformation for several reasons:
- They have shareholders that prefer short-term benefits over promising but unclear long-term benefits. Shareholders most of the time have no personal interest in these companies; they just want to earn money above average growth.
- The CEO is the person to define the strategy, which has to come with a compelling vision to inspire the shareholders, the customers, and the employees of the company – most of the time in that order of priority.
- The role of the CEO is to prioritize investments and stop or sell core components to make the transformation affordable. Every transformation is about deciding what to stop, what to start, and what to maintain.
- After four to seven years (the seven years’ itch), it is time for a new CEO to create new momentum, as you cannot keep the excitement up too long.
- Meanwhile, the Stop-activities create fear within the organization – people start fearing for their jobs – and the Start-activities are most of the time on such a small scale that their successes are not yet seen. So on the work floor, there will be reservations about what’s next.
Companies like ABB, Ericsson, GE, and Philips – in alphabetical order – are all in several stages of their digital transformation, and I have followed GE in particular, as they were extremely visible and ambitious. Meanwhile, it is fair to say that the initial Digital Transformation plan from GE has stalled, and a lot of lessons were learned from that.
If you have time, read this article: The Only Way Manufacturers Can Survive, by Vijay Govindarajan & Jeff Immelt (you need to register). It gives useful insights into what the strategy and planning for digital transformation were. And note: PLM is not even mentioned there 🙂
Starting from the C-level of a small and medium enterprise
In a small or medium enterprise, the distance between the C-level and the work floor is most of the time much shorter, and chances are that the CEO is a long-term company member in the case of a long-standing family-owned business. In this type of company, a long-term vision can exist, and you could expect digital transformation to be more sustainable there.
Unfortunately, most of the time it is not, as the C-level is often more active in current business strategies and capabilities close to their understanding instead of investing energy and time to digest the full impact of a digital transformation. These companies might invest in the buzzwords you hear in the market – IoT, Digital Twins, and Augmented Reality/Virtual Reality – all very visionary topics, however of low value when they are implemented in an isolated way.
In this paragraph, I also need to mention the small and medium enterprises that are in the hands of an investment company. Here I feel sorry, as the investment company is most of the time trying to optimize the current ways of working by simplifying or rationalizing the business, not creating a transformative vision (as they do not have the insights). In these companies, you will see the same type of investments as in the other category of small and medium enterprises, be it on a lesser scale.
Do people need to change?
Often you hear that the problem with any change within companies is that people do not want to change. I think this is too much of a generalization. I have worked in the past five years with several companies where we explored the benefits and capabilities of PLM in a modern way, sometimes focusing on an item-centric approach, sometimes focusing on a model-based approach. In all these engagements, there was no reluctance from the users to change.
However, there were two types of users in these discussions, whom I would characterize as evolutionary thinkers (most of the time ten years or more in the company) and love-to-change thinkers (most of them five years or less in the company). The difference between these groups was that the evolutionary thinkers responded in the context of the existing business constraints, where the love-to-change thinkers were not yet touched by the knowledge of “how good everything was.”
For digital transformation, you need to create the love-to-change attitude while using the existing knowledge as a base to improve. And this is not a people change; it is an organizational change where you need to enable people to work in their best mode. It needs to be an end-to-end internal change – not changing the people, but changing the organizational parameters: KPIs, divisions, departments, priorities. Have a look at this short movie; you can replace the word ERP with PLM, and you will understand why I like this movie (and the relaxing sound).
The Middle Management dilemma
And here comes my last observation. At the C-level, we can find inspiring visions, often outcome-based, talking about a more agile company, closer to the customer, empowered workers, etc. Then there is the ongoing business that cannot be disrupted and needs to perform – so the business units and the departments all get their performance KPIs, merely keeping the status quo in place.
Meanwhile, new digital initiatives need to be introduced. They don’t fit in the existing business and are often started in separation – like the GE Digital division, and you can read Jeff Immelt’s thoughts and strategy on how this could work (The Only Way Manufacturers Can Survive). However, as the majority of the business ran in the old mode, the Digital Business became another silo in the organization, as the middle management could not be motivated to embed digital in their business (no KPIs, or very low significance of new KPIs).
I have talked about the hybrid/bimodal approach several times in my blog posts, most recently in The Challenges of a Connected Ecosystem. One of the points I did not address was the fact that probably nobody wants to work in the old mode anymore once the new approach is successful and scaled up.
When the new mode of business is still small, people will not care so much and continue business as usual. Once the new mode becomes the most successful part of the company, people do want to join this success if they can. And here the change effort is needed. An interesting article in this context is The End of Two-Speed IT from the Boston Consulting Group (2016). They already point at the critical role of middle management. Middle management can kill digital transformation or be part of it, by getting motivated and adopting it too.
Conclusion
Perhaps too much text in this post, and even more content when you dive deeper into the provided links. Crucial, though, if you want to understand the digital transformation process in an existing company and the critical place of middle management. They are likely the killers of digital transformation if not given the right coaching and incentives. Just an observation – not a thought 😉
I was happy to take part in the PI PLMx London event last week. It was here, and in the same hotel, that this conference saw the light in 2011 – you can see my blog post from that event here: PLM and Innovation @ PLMINNOVATION 2011.
At that time, it was the first vendor-independent PLM conference in a long time, and it brought a lot of new people together to discuss their experience with PLM. Looking at the audience of that time, many of the companies that were there came back over the years, confirming the value this conference has brought to their PLM journey.
Similar to the PDT conference(s) – this year’s edition was just announced last week, here – the number of participants is diminishing.
Main hypotheses:
- the PLM-definition has become too vague. Going to a PLM conference no longer guarantees you will find the type of PLM discussions you expect?
- the average person is now much better informed about PLM thanks to the internet and social media (blogs, webinars, etc.). Therefore, the value retrieved from a PLM conference is not big enough anymore?
- Digital Transformation is absorbing all the budget and attention downstream in the organization, no longer bringing the need for and awareness of modern PLM to the attention of the management. E.g., a digital twin is sexier to discuss than PLM?
What do you think about the above three hypotheses – 1, 2, and/or 3?
Back to the conference. The discussion related to PLM has changed over the past nine years. As I have presented at PI since the beginning in 2011, here are the nine titles of my sessions:
2011 PLM – The missing link
2012 Making the case for PLM
2013 PLM loves Innovation
2014 PLM is changing
2015 The challenge of PLM upgrades
2016 The PLM identity crisis
2017 Digital Transformation affects PLM
2018 PLM transformation alongside Digitization
2019 The challenges of a connected Ecosystem for PLM
Where the focus started with justifying PLM, as well as a supporting infrastructure, to bring Innovation to the market, the first changes became visible in 2014. PLM was changing as more data-driven vendors appeared with new and modern (metadata) concepts and cloud, creating the discussion about what would be the next upgrade challenge.
The identity crisis reflected the introduction of software development / management combined with traditional (mechanical) PLM – how to deal with systems? Where are the best practices?
Then, from 2017 until now, Digital Transformation and its impact on PLM and the organization became the themes to discuss – and we are not done yet!
Now some of the highlights from the conference. As there were parallel sessions, I had to divide my attention – you can see the full agenda here:
How to Build Critical Architecture Models for the New Digital Economy
The conference started with a refreshing presentation from David Sherburne (Carestream) explaining their journey towards a digital economy. According to David, the main reason behind digitization is to save time, as he quoted Harvey Mackay, an American businessman and journalist:
Time is free, but it is priceless. You cannot own it, but you can use it. You can’t keep it, but you can spend it. Once you have lost it, you never can get it back
I tend to agree with this simplification, as it makes the story easy to explain to everyone in your company. I would probably add that saving time also means less money spent on intermediate resources, therefore creating a two-sided competitive advantage.
David stated that today’s digital transformation is more about business change than technology and here I wholeheartedly agree. Once you can master the flow of data in your company, you can change and adapt your company’s business processes to be better connected to the customer and therefore deliver the value they expect (increases your competitive advantage).
Having new technology in place does not help you unless you change the way you work.
David introduced a new acronym ILM (Integrated Lifecycle Management) and I am sure some people will jump on this acronym.
David’s presentation contained an interesting view from the business-architectural point of view. An excellent start for the conference where various dimensions of digital transformation and PLM were explored.
Integrated PLM in the Chemical industry
Another interesting session was from Susanna Mäentausta (Kemira oy) with the title: “Increased speed to market, decreased risk of non-compliance through integrated PLM in Chemical industry.” I selected her session because, from my past involvement with the process industry, I know that PLM adoption there is very low. Understanding why and how they implemented PLM was interesting for me. Her PLM vision slide says it all:
There were two points that I liked a lot from her presentation, as I can confirm they are crucial.
- Although there was a justification for the implementation of PLM, there was no ROI calculation done upfront. I think this is crucial: as a company, you know you need to invest in PLM to stay competitive. Making an ROI story just consoles people with artificial numbers – success and the numbers depend on the implementation, and Susanna confirmed that step 1 delivered enough value to be confident.
- There was end-to-end governance and a communication plan in place. Compared to the PLM projects I know, this was done very extensively – full engagement of key users and ongoing feedback – communicate, communicate, communicate. How often do we forget this in PLM projects?
Extracting More Value of PLM in an Engineer-to-Order Business
Sami Grönstrand & Helena Gutierrez presented as an experienced duo (they were active at PI PLMx Hamburg/Berlin before) their current status and mission for PLM @ Outotec. As the title suggests, it was about how to extract more value from PLM in an Engineer-to-Order business.
What I liked is how they simplified their PLM targets from a complex landscape into three story-lines.
If you jump into all the details where PLM is contributing to your business, it might get too complicated for the audience involved. Therefore, they aligned their work around three value messages:
- Boosting sales, by focusing on modularization and encouraging the use of a product configurator instead of developing a customer-specific solution every time (a simplified configurator sketch follows after this list)
- Accelerating project deliverables, again reaping the benefits of modularization, creating libraries and training the workforce in using this new environment (otherwise no use of new capabilities). The resulting reduction in engineering hours was quite significant.
- Creating New Business Models, by connecting all data using a joint plant structure with related equipment. By linking these data elements, an end-to-end digital continuity was established to support advanced service and support business models.
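For the more technically inclined reader, here is a deliberately simplified sketch of what such a configurator fundamentally does: it only allows combinations of pre-approved modules, so every quote stays within the engineered solution space. All module names, attributes and rules below are invented for illustration – this is not Outotec’s implementation.

```python
# A minimal configure-to-order sketch: quotes are assembled from
# pre-approved modules instead of being engineered from scratch.
# Module ids, attributes and rules are hypothetical.

MODULES = {
    "pump":  {"PU-S": {"max_flow": 50}, "PU-L": {"max_flow": 200}},
    "motor": {"MO-15": {"kw": 15}, "MO-75": {"kw": 75}},
}

# Pre-approved compatibility: which motors may drive which pumps.
COMPATIBLE = {
    "PU-S": {"MO-15", "MO-75"},
    "PU-L": {"MO-75"},  # the large pump requires the large motor
}

def configure(pump_id: str, motor_id: str) -> list[str]:
    """Return a valid module selection, or raise if the combination
    is not pre-approved - no ad-hoc custom engineering allowed here."""
    if motor_id not in COMPATIBLE.get(pump_id, set()):
        raise ValueError(f"{pump_id} + {motor_id} is not a pre-approved variant")
    return [pump_id, motor_id]

print(configure("PU-S", "MO-15"))   # ['PU-S', 'MO-15']
print(configure("PU-L", "MO-75"))   # ['PU-L', 'MO-75']
# configure("PU-L", "MO-15") would raise: not a pre-approved variant
```

The point of the sketch: the engineering effort moves upfront, into defining the modules and rules once, after which sales can safely generate valid variants without engineering hours per quote.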
My conclusion from this session was again that if you want to motivate people on a PLM-journey it is not about the technical details, it is about the business benefits that drive these new ways of working.
Managing Product Variation in a Configure-To-Order Business
In the context of the previous session from Outotec, Björn Wilhemsson’s session addressed somewhat the same topic: how to create as much variation as possible in your customer offering, while internally keeping the number of variants and parts manageable.
Björn, Alfa Laval’s OnePLM Programme Director, explained in detail the strategy they implemented to address these challenges. His presentation was very educational and could serve as a lesson for many of us related to product portfolio management and modularization.
Björn explained in detail the six measures to control variation, starting from a model strategy / roadmap (thinking first), followed by building a modularized product architecture and controlling and limiting the number of variants during the New Product Development process. Next, as Alfa Laval is in a Configure-To-Order business, Björn described the implementation of order-based, automated addition of pre-approved variants (not every variant needs to exist in detail before it is sold), followed by the controlled introduction of additional variants and continuous analysis of quoted and sold variants (the power of a digital portfolio), as his summary slide shows below:
Day 1 closed with an inspirational keynote by Jonathan Gupta: Lessons Learnt from the Mountaineering Experience, 8848 Meter Above Sea Level – a mission to climb the highest mountain on each of the continents in 107 days and 9 hours, setting a new world record.
There are some analogies to discover between his mission and a PLM implementation. It is all about having the total picture in mind: plan, prepare step-by-step in detail, and rely on teamwork – it is not a solo journey – and it is about reaching the top (the deliverable phase) in the most efficient way.
The differences: PLM does not need world records; you need to go at the pace an organization can digest and understand. Although the initial PLM climate during implementation might be chilly too, I do not believe you have to suffer temperatures below minus 50 degrees Celsius.
During the morning, I was involved in several meetings and therefore unfortunately unable to see some of the interesting sessions at that time. Hopefully they will become available on PI.TV later, as slides alone do not tell the full story. Although there are experts who can conclude and comment after seeing a single slide – you can read my blog buddy Oleg Shilovitsky’s post: PLM Buzzword Detox. I think oversimplification is exactly what is creating the current problem we have in this world – people without knowledge become louder and surer of their opinion than knowledgeable people who have spent time understanding the matter.
Have a look at the Dunning-Kruger effect here (if you take the time to understand).
PLM: Enabling the Future of a Smart and Connected Ecosystem
Peter Bilello from CIMdata shared his observations and guidance related to the ongoing digital business revolution that is taking place thanks to the internet and IoT technologies. It will fundamentally transform how people work and interact, among themselves and with machines. Survival in business will depend on how companies create smart and connected ecosystems. Peter showed a slide from the 2015 World Economic Forum (below) which is still relevant:
Depending on your business, some of these waves have probably touched your organization already. What is clear is that the market leaders will benefit the most – the ones owning a smart and connected ecosystem will be the winners shortly.
Next, Peter explained why PLM, and in particular the Product Innovation Platform, is crucial for a smart and connected enterprise. Shiny capabilities like a digital twin (the link between virtual and real) or virtual & augmented reality can only be achieved affordably and competitively if you invest in making the source data digital and connected. The scope of a product innovation platform is much broader than traditional PLM. The way information is stored also differs – moving from documents (files) towards data (elements in a database). I fully agree with Peter’s opinion that PLM is conceptually the killer app for a Smart & Connected Ecosystem, and this notion is spreading.
A recent article from Forbes in the Leadership category, Is Your Company Ready For Digital Product Life Cycle Management?, shows there is awareness. It is still very basic, though, and people remain confused about the difference between an electronic file (digital too?) and a digital definition of information.
The main point to remember here: Digital information can be accessed directly through a programming interface (API/Service) without the need to open a container (document) and search for this piece of information.
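To make that point tangible, here is a minimal sketch of what direct access looks like. The endpoint, part number and field names are invented for illustration – this is not any specific vendor’s API:

```python
import json
import urllib.request

# Hypothetical PLM service endpoint and part number, for illustration only.
url = "https://plm.example.com/api/v1/parts/P-100?fields=mass,revision"

with urllib.request.urlopen(url) as response:
    part = json.load(response)

# One property, retrieved directly - no document was downloaded,
# opened, or searched to find this value.
print(part["mass"], part["revision"])
```

Contrast this with the document world, where the same question means locating the right file, downloading it, opening it in the authoring tool and searching for the value – every consumer repeating that work.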
Peter then zoomed in on some topics that companies need to investigate to reach a smart & connected ecosystem: Security (still a question hardly addressed in IoT/digital twin demos) and Standards and Interoperability (you cannot connect all proprietary formats economically and sustainably). There are a lot of points to consider, and I want to close with Peter’s slide illustrating where most companies are in reality:
The Challenges of a Connected Ecosystem for PLM
I was happy to present after Peter Bilello and David Sherburne (on day 1), as they both gave a perspective on digital transformation complementary to mine. My presentation focused on the incompatibility of current coordinated business systems with the concept of a connected ecosystem.
You can already download my slides from SlideShare here: The Challenges of a Connected Ecosystem for PLM. I will explain my presentation in an upcoming blog post, as slides without a story might lead to the wrong interpretation – and we have already reached 2000 words. A few more words to come.
How to Run a PLM Project Using the Agile Manifesto
Andrew Lodge, head of Engineering Systems at JCB, explained how applying an agile mindset to a PLM project can lead to faster and more accurate results for the business. I am a full supporter of this approach: having worked in long, waterfall-type PLM implementations, I saw there was always a big crash and user dissatisfaction at the final delivery. Keeping the business involved at every step seems to be the solution. The issue I discovered here is that an agile implementation requires a lot of people, in particular from the business, to be heavily involved. Some companies do not understand this need and drop or reduce the business contribution to the minimum, killing the value of an agile approach.
Concluding
For me, coming back to London for the PI PLMx event was very motivating. Where the previous two or three conferences in Germany might have shown little progress per year, this year, thanks to new attendees and inspiration, it became a vivid event for me, hopefully growing again soon. Networking and listening to your peers in business remains crucial to digest it all.
This is the moment of the year to switch off from the details. No more talking and writing about digital transformation or model-based approaches; it is time to sit back and relax. Two years ago I shared the PLM Songbook; now it is time to watch one or more movies. Here are my top five favorite PLM movies:
Bruce Almighty
Bruce Nolan, an engineer in Buffalo, N.Y., is discontented with almost everything in the company despite his popularity and the love of his draftswoman Grace. At the end of the worst day of his life, Bruce angrily ridicules and rages against PLM – and PLM responds. PLM appears in human form and, endowing Bruce with divine powers of collaboration, challenges Bruce to take on the big job to see if he can do it any better.
A movie that makes you modest and realize there is more to the world than your small ecosystem.
The good, the bad and the ugly
Blondie (The Good PLM consultant) is a professional who is out trying to earn a few dollars. Angel Eyes (The Bad PLM Vendor) is a PLM salesman who always commits to a task and sees it through, as long as he is paid to do so. And Tuco (The Ugly PLM Implementer) is a wanted outlaw trying to take care of his own hide. Tuco and Blondie share a partnership, making money off Tuco’s bounty, but when Blondie unties the partnership, Tuco tries to hunt down Blondie. When Blondie and Tuco come across a PLM implementation loaded with dead bodies, they soon learn from the only survivor (Bill Carson – the PLM admin) that he and a few other men have buried a stash of value on a file server. Unfortunately, Carson dies, and Tuco only finds out the name of the file server, while Blondie finds out the name on the hard disk. Now the two must keep each other alive in order to find the value. Angel Eyes (who had been looking for Bill Carson) discovers that Tuco and Blondie met with Carson and knows they know the location of the value. All he needs is for the two to …
A movie that makes you realize that it is a challenging journey to find the value out of PLM. It is not only about execution – but it is also about all the politics of people involved – and there are good, bad and ugly people on a PLM journey.
The Grump
The Grump is a draftsman in Finland from the past, a man who knows that everything used to be so much better in the old days. Pretty much everything that’s been done after 1953 has managed to ruin The Grump’s day. Our story unfolds as The Grump opens a 3D model on his computer, hurting his brain. He has to spend a weekend in Helsinki to attend model-based therapy. Then the drama unfolds …….
A movie that makes you realize that progress and innovation do not come from grumps. In every environment where you want to change the status quo, grumps will appear. With its exciting Finnish atmosphere, a perfect film for Christmas.
Deliverance
The Cahulawassee River Valley company in Northern Georgia is one of the last analog companies in the state, which will soon change with the imminent implementation of a PLM system in the company, breaking down silos everywhere. As such, four Atlanta city slickers – alpha male Lewis Medlock, generally even-keeled Ed Gentry, slightly condescending Bobby Trippe, and wide-eyed Drew Ballinger – decide to implement PLM in one trip, with only Lewis and Ed having experience in CAD. They know going in that the area is ethnoculturally homogeneous and isolated, but don’t understand the full extent of it until they arrive and see what they believe is the result of generations of inbreeding. Their relatively peaceful trip takes a turn for the worse when, halfway through, they encounter a couple of hillbilly moonshiners. That encounter not only makes the four battle their way out of the PLM project intact and alive, but also threatens their relationships as they do.
This movie, from 1972, makes you realize that in the early days of PLM starting a big-bang implementation journey into an area that is not ready for it, can be deadly, for your career and friendship. Not suitable for small children!
Diamonds Are Forever or Tron (legacy)
James Bond’s mission is to find out who has been drawing diamonds, which are appearing on blogs. He adopts another identity in the form of Don Farr. He joins up with CIMdata and acts as if he is developing diamonds, but everyone is hungry for these diamonds. He also has to avoid Mr. Brouwer and Mr. Kidd, the dangerous couple who do not leave anyone in their way when it comes to model-based. And Ernst Stavro Blofeld isn’t out of the question. He may have changed his looks, but is he linked with the V-shape? And if he is, can Bond finally defeat his ultimate enemy?
Sam Flynn, the tech-savvy 27-year-old son of Kevin Flynn, looks into his father’s disappearance and finds himself pulled into the same world of virtual twins and augmented reality where his father has been living for 20 years. Along with Kevin’s loyal confidant Quorra, father and son embark on a life-and-death journey across a visually-stunning cyber universe that has become far more advanced and exceedingly dangerous. Meanwhile, the malevolent program IoT, who dominates the digital world, plans to invade the real world and will stop at nothing to prevent their escape.
I could not decide about number five. The future is bright with Boeing’s new representation of Systems Engineering, see my post on CIMdata’s PLM Europe roadmap event where Don Farr presented his diamond(s). However, the future is also becoming a mix of real with virtual and here Tron (legacy) will help my readers to understand the beauty of a mixed virtual and real world. You can decide – or send me your favorite PLM movies.
Note: All movie reviews are based on IMDb.com storylines, and I thank the authors of those storylines for their contribution and hope they agree with the PLM-related twist. Click on the image to find the full details and original review.
Conclusion
2018 has been an exciting year with a lot of buzzwords, combined with the realization that the current PLM approach is incompatible with the future. We can address this issue further in 2019 – first at PI PLMx 2019 in London (be there – the last chance to meet people in the UK while they are still Europeans – and share/discuss plans for the upcoming year).
Wishing you all the best during the break and a happy and prosperous 2019!
Last week I attended the long-awaited joint conference from CIMdata and Eurostep in Stuttgart. As I mentioned in earlier blog posts, I like this conference because it is a relatively small conference with a focused audience related to a chosen theme.
Instead of parallel sessions, all attendees follow the same track, and after two days there is a common understanding for all. This time there were about 70 people discussing the themes Digitalizing Reality – PLM’s role in enabling the digital revolution (CIMdata) and Collaboration in the Engineering and Manufacturing Supply Chain – the Extended Digital Thread and Smart Manufacturing (Eurostep).
As you can see, it was all about Digital. Here are my comments:
The State of the PLM Industry: The Digital Revolution
Peter Bilello kicked off with an overview of the PLM industry. The PLM market showed an overall growth of 7.3 % towards 43.6 billion dollars. Zooming in on the details, cPDM grew by 2.9 %, while the significant growth came from PLM tools (7.7 %); the digital manufacturing sector grew by 6.2 %. In my opinion, these numbers show that managing collaboration in particular remains the challenging part of PLM. It is easier to buy tools than to invest in cPDM.
Peter mentioned that at board level you cannot sell PLM, as the acronym is too much framed as an engineering tool. Also, people at the board have been trained to interpret transactional data and build strategies on that; they might embrace Digital Transformation, but the product innovation domain is hard to capture in numbers. What is the value of collaboration? How do you measure and value innovation coming from R&D? Recently, we have seen more simplified approaches to getting more value from PLM. I agree with Peter: we need to avoid the PLM framing and find better consumable value statements.
Nothing to add to Peter’s closing remarks:
An Alternative View of the Systems Engineering “V”
For me, the most interesting presentation of Day 1 was Don Farr’s presentation. Don and his Boeing team worked on depicting the Systems Engineering process for a Model-Based environment. The original “V” looks like a linear process and does not reflect the multi-dimensional iterations at various stages, the concept of a virtual twin and the various business domains that need to be supported.
The result was the diamond symbol above. Don and his team have created a consistent story related to the depicted diamond, which goes too far for this blog post. Currently, the diamond concept is copyrighted by Boeing, but I expect we will see more of it in the future, as the classical systems engineering “V” was not designed for our model-based view of the virtual and physical products to design AND maintain.
Sponsor vignette sessions
The vignette sponsors of the conference – Aras, ESI Group, Granta Design, HCL, Oracle and TCS – all got a ten-minute slot to introduce themselves and the topics they believed were relevant for the audience. These slots served as teasers to visit their booths during a break. Interesting for me was Granta Design, who bring a complementary data service for materials along the product lifecycle, providing digital continuity for material information. See below.
The PLM – CLM Axis vital for Digitalization of Product Process
Mikko Jokela, Head of Engineering Applications CoE at ABB, completed the morning sessions and left me with a lot of questions. Mikko’s mission is to provide the ABB companies with an information infrastructure that delivers end-to-end digital services for the future, based on apps and platform thinking.
Apparently, the digital continuity will be provided by all kinds of BOM structures, as you can see below. In my post Coordinated or Connected, related to a model-based enterprise, I call this a coordinated approach, which is a current best practice, not an approach for the future. There, we want a model-based enterprise instead of a BOM-centric approach to ensure a digital thread – see also Don Farr’s diamond. When I asked Mikko which data standard(s) ABB will use to implement their enterprise data model, it became clear there was no concept in place yet. Perhaps an excellent opportunity to look at PLCS for the product-related schema.
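For readers who like to see the coordinated/connected difference in its smallest possible form, here is a toy sketch (identifiers and attributes invented, no vendor implied): in the coordinated world each BOM owns a copy of the part data, while in the connected world every view references the same data element.

```python
# Coordinated (BOM-centric): each discipline keeps its own copy of the
# part data; consistency depends on synchronizing the copies.
ebom_part = {"id": "P-100", "rev": "B", "mass_kg": 4.2}
mbom_part = {"id": "P-100", "rev": "B", "mass_kg": 4.2}  # duplicate - can drift

# Connected (model-based): one shared data element, referenced by every view.
part = {"id": "P-100", "rev": "B", "mass_kg": 4.2}
ebom_view = {"structure": "engineering", "part": part}    # reference, not copy
mbom_view = {"structure": "manufacturing", "part": part}  # same element

part["mass_kg"] = 4.5                 # a single change ...
print(mbom_view["part"]["mass_kg"])   # ... is visible everywhere: 4.5
```

The coordinated approach needs interfaces and synchronization jobs to keep the copies aligned; the connected approach needs a shared data model (and that is exactly where standards like PLCS come in).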
A general comment: many companies are thinking about building their own platform, though not all will build it from scratch. For those who do start from scratch: have a look at the existing standards for your industry. And to manage the quality of data, you will need to implement Master Data Management, where for the product side the PLM system can play a significant role. See Master Data Management and PLM.
Systems of Systems Approach to Product Design
Professor Martin Eigner’s keynote presentation was about how new products and markets need a Systems of Systems approach combined with Model-Based Systems Engineering (MBSE) and Product Line Engineering (PLE), where the PLM system can be the backbone supporting the MBSE artifacts in context. All these concepts require new ways of working, as stated below:
And this is a challenge. A quick survey in the room (consistent with my observations from the field) showed that most companies (95 %) have not even achieved integrated ways of working for mechatronics products. You can imagine the challenge of also incorporating software, simulation, and other business disciplines. Martin’s presentations are always an excellent conceptual framework and, for those who want to dive deeper, a starting point for discussion and learning.
Additive Manufacturing (Enabled Supply) at Moog
Moog Inc., a manufacturer of precision motion controls for various industries, has made a strategic move towards Additive Manufacturing. Peter Kerl, Moog’s Engineering Systems Manager, gave a good introduction to what Additive Manufacturing means and how Moog is introducing it in their organization to create more value for their customer base and attract new customers in a less commoditized domain. As you can imagine, delivering products through Additive Manufacturing requires new skills (design/materials), new processes and a new organizational structure. And of course, a new PLM infrastructure.
Jim van Oss, Moog’s PLM Architect and Strategist, explained how they have been involved in a technology solution for digitally-enabled parts leveraging blockchain technology – have a look at their VeriPart trademark. It was interesting to learn from Peter and Jim that they are actively working in a space that, according to Gartner’s hype curve, is in the early transform phase. Peter and Jim’s presentations were very educational for the audience.
For me, it was also interesting to learn from Jim that Moog is really practicing the two modes for PLM in their company: two PLM implementations, one with the legacy data – the wrong data for the future – and one with the new data model for the future, both built on the same PLM vendor’s release. A great illustration that past and future data for PLM are not compatible.
Value Creation through Synergies between PLM & Digital Transformation
Daniel Dubreuil, Safran’s CDO for Products and Services, gave an entertaining lecture on Safran’s PLM journey and the introduction of new digital capabilities, moving from an inward-facing PLM system towards a digital infrastructure supporting internal collaboration (model-based systems engineering / multiple BOMs) and external collaboration with their customers and suppliers, introducing new business capabilities. Daniel gave a very precise walk-through with examples from the real world. The concluding slide, KEY SUCCESS FACTORS, was one we have seen so many times at PLM events.
Apparently, the key success factors are known. However, most of the time, one or more of these points cannot be addressed for various reasons. Then the question becomes: how do you mitigate this risk, as there will be issues ahead?
Bringing all the digital trends together. What’s next?
The day ended with a virtual fireplace session between Peter Bilello and Martin Eigner. The audience did not see a fireplace; however, my augmented Twitter feed provided one for me:
Some interesting observations from this dialogue:
Peter: “Having studied physics is a good base for understanding PLM as you have to model things you cannot see” – as I studied physics myself, I can agree.
Martin: “Germany is the center of knowledge for mechanical, the US for electronics, and now China is becoming the center for electronics and software” – an interesting observation illustrating where innovation will come from.
Both Peter and Martin spent serious time on the importance of multidisciplinary education. We teach people in silos; faculties work in silos. We all believe these silos must be broken down, but it is hard to learn and experiment with skills for the future. Where to start, and who will lead?
Conclusion:
The PLM roadmap day offered some exciting presentations which, combined with CIMdata’s PLM update, made it an excellent opportunity to learn and discuss reality, in particular new methodologies and technologies beyond the hype. I want to thank CIMdata for the superb organization and for allowing me to take part. Next week I will follow up with a review of the PDT Europe part of the conference (Day 2).