“Confused? You won’t be after this episode of Soap.”
Who does not remember this tagline from the first official Soap series, which started in 1977 and was released in the Netherlands in 1979?
Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, alien abduction, crime, homosexuality and more.
Each episode ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.
For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.
What does this have to do with PLM?
Soap is about strange people doing weird things (I do not want to be more specific). Recently I noticed that this is happening even in the PLM bloggers’ world. Two of my favorite blogs demonstrated something of this weird behavior.
First, Steve Ammann, in his Zero Wait-State blog post A PLM junkie at sea – point-solutions versus comprehensive, mentioned sailing from Ventura, CA to Cabo San Lucas, Mexico on a 35-foot sailboat, and how he started thinking about PLM during his night shift. My favorite quote:
Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information.
Yes, only a PLM “junkie” would think about PLM on a sailing trip – and maybe this is why the Mexican coast guard was suspicious.
Second, Oleg, in his doomsday blog post The End of PLM Communism, was thinking about PLM all weekend. My favorite quote:
I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.
It demonstrates that the best thinking is done outside office hours and in casual locations. I know this is true from my long weekend cycling tours – I must confess that I have PLM thoughts during cycling too. Perhaps the best thinking happens outside the office?
I leave the follow-up on this observation to my favorite Dutch psychologist, Diederik Stapel, who apparently is out of office too.
Now back to serious PLM
Both posts touch on the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post. He believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance for making the decision between a point solution and an integrated solution.
And I am aligned with what Steve is proposing.
Oleg is coming from a different background and in his current position he believes more in a distributed or network approach. He looks at PLM vendors/implementations and their centralized approach through the eyes of someone who knows the former Soviet Union way of thinking: “Centralize and control”.
The association with communism was probably not the best choice, as you can tell from the comments. Still, it makes you think: as the former Soviet Union does not exist anymore, what about former PLM implementations and the future? According to Oleg, PLM implementations should focus more on distributed systems (on the cloud?), working and interacting together, connecting data and processes.
And I am aligned with what Oleg is proposing.
Confused? You won’t be after reading my recent experience.
I have been involved in the discussion around the best possible solution for an EPC (Engineering Procurement Construction) contractor in the Oil & Gas industry. The characteristics of their business differ from those of standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected for their knowledge, their price, their price, their price, quality and time to deliver.
This means an EPC contractor focuses on execution, making sure they have the best tools for each discipline, and this is the way they are organized and used to working. The downside of this approach is that everyone works on their own island and there is no knowledge capitalization or sharing of information. As a result each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people ….. and they retire at a certain moment.
So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard. This standard is relatively mature and can serve as the neutral standard for exchanging information between disciplines. The ideal world of best-in-class tools communicating with each other – or not?
Imagine there are 6 discipline tools, among them an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages, and finally a knowledge environment capitalizing lessons learned, standards and best practices.
This results in 6 tools and 12 interfaces to a common service bus connecting them: 12 interfaces, as information needs to be sent to and received from the service bus per application. Each tool will hold redundant data for its own execution.
What happens if a PLM provider could offer three of these tools on a common platform? This would result in 4 tools to install and only 8 interfaces, as the arithmetic below makes explicit. The functionality in the common PLM system does not require data redundancy but shares common information, and will therefore provide better performance in a cross-discipline scenario.
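To make the interface counting explicit, here is a minimal sketch in Python. The two-interfaces-per-tool assumption comes from the scenario above; the point-to-point comparison at the end is my own addition, to show what the service bus avoids.

```python
# Each tool connected to the service bus needs 2 interfaces:
# one to send information to the bus, one to receive from it.
def bus_interfaces(tool_count: int) -> int:
    """Interfaces needed when every tool talks to a common service bus."""
    return 2 * tool_count

print(bus_interfaces(6))  # 12 -> six separate discipline tools

# Three tools consolidated on one PLM platform leaves 4 systems
# on the bus (1 platform + 3 remaining tools).
print(bus_interfaces(4))  # 8

# For comparison: direct point-to-point integration without a bus
# would need n * (n - 1) one-way links between n tools.
def point_to_point(tool_count: int) -> int:
    return tool_count * (tool_count - 1)

print(point_to_point(6))  # 30
```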
In an ideal world all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100% optimized system for a group of independent companies working together. Suppliers will not give up their environment and own IP to embed it in a customer’s ideal environment. So there is always a compromise to be found between the best integrated platform (optimal performance, reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).
And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM you might even reach 100% coverage for your own purposes, although modern society is more about connecting information where possible.
MY CONCLUSION after reading both posts:
- Oleg tries to provoke, and like a soap, you might end up confused after each episode.
- Steve in his post gives common-sense guidance, useful if you spend time digesting it – not a soap.
Now I hope you are no longer confused, and I wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:
- Will Aras survive 21-12-2012 and support the Next generation?
- Will Autodesk get out of the cloud or have a coming-out?
- Will Dassault get more Experienced?
- Will Oracle PLM customers understand it is not a database?
- Will PTC get out of the CAD jail and receive $200?
- Will SAP PLM be really 3D and user friendly?
- Will Siemens PLM become a DIN or ISO standard?
See the next episodes of my PLM blog in 2013.
Early this year Dassault Systèmes (DS) announced their future strategy, called 3DExperience, and it is only now, after their two major events – the 3DExperience forums in the US and Europe – that the discussion has started around the meaning of 3DExperience.
In February, I thought 3DExperience was just a new marketing approach from DS to differentiate themselves from other vendors. A little more 3D; and as PLM has a bad connotation and some of the newcomers have redefined what PLM is, it made sense to be different again.
One of my fellow European bloggers, Yoann Maingon, started a discussion with his provoking blog post: Different Marketing Strategies And Naming in PLM. Read the post and especially the comments from Jim Brown and Joe Barkai, who bring perspective to it. In addition, this post got traction in some closed LinkedIn PLM groups, and it was interesting to observe that the different interpretations of PLM somehow created the same feeling that I have with religion.
Stay with the book, the definition of PLM, and complete your portfolio – that was the message. But which book, and what is PLM? Even if I were to write the book “The Truth about PLM”, who would consider my book the authority?
We have learned from religion that concepts based on a book can lead to wars.
I am sure PLM will not go in that direction, and it remains important not to focus on the definition of PLM. In the end you want your customers (current and future) to be more efficient, more innovative and more profitable. And in order to achieve that, you need to look at the whole process, starting with market interaction and ending with delivery to the customer.
And for that there are many tasks to perform in a company. During your sales process, you need to make sure you address the demands of the customer or market in the best manner, so you can differentiate yourself from others. It can be based on your track record (best in class since 1845), your price (always the cheapest as you manage the process efficiently) or your experience (the price plus the good feeling it gives justifies the decision).
What has become clear in the past ten years is that we are in a global, changing market, and especially traditional companies struggle to make a future-oriented change. Customer loyalty was in the past based on the fact that you were in the same region, later the same country, but now everyone is shopping or sourcing around the world. Traditional markets and businesses are no longer secure.
So companies have to change, and in my opinion one of the most important changes they have to go through is managing all information in their company (and from outside) in a shared manner. Products can no longer be defined without taking into account feedback and interaction from the market. Trends (positive or negative) related to your company or products need to be followed, as they can kill or hype a product or your company.
To realize this change a company needs to start working differently, and this leads in the end to the need for different tools to support your changing processes. Here I see PLM systems coming into the picture. And here there are the two approaches: will you select a single vendor with the richest PLM platform, or will you integrate a set of best-in-class applications? As we saw in the Tech4PD discussion, there is no ultimate answer here.
I see the 3DExperience strategy from DS in this light. The classical scope of PLM tools and practices does not provide a base for current and future markets. The solution is bigger than tools; it is the focus on the total experience (I could not find another name either).
It is a way to become attractive to your customers: not focusing only on the product, but also on the way you can influence your potential customers to choose your product or service above others. DS calls this the new era of 3DExperience; others will market it differently.

In a consumer market we select products based on experience. Has anyone ever tried to justify the purchase of an iPad as an affordable device they need for their work?
It is the experience.
There is one thing I dislike about the 3DExperience approach. Blogging becomes expensive, as writing down the word 3DExperience – a mix of numbers and characters – slows down my efficiency. I prefer 3DE or 3DX as the most efficient set of keystrokes related to a TLA.
In my opinion PLM is not dead at all for DS. They just market the bigger picture differently, to be different from the classical PLM platforms. All PLM vendors have their unique marketing approach. Companies need to define their next step to remain in business, and they are afraid of the old PLM, due to the horror stories – complexity, expense, etc.
Is it selling experiences, or is it perhaps making sure a new generation of workers will be motivated to work for your company? It remains a mix of classical PLM functionality, but big data, social media and more interactive, friendly interfaces are also expected.
Finally, one observation from the 3DExperience forum in Brussels, where I believe they could have done a better job. Usually when customers and prospects go to this kind of event, they want to hear that they have chosen the right software provider. So it should be a mix of assuring them they are not alone (many others have chosen our solutions) and exciting them with the future vision your vendor has. Here, the message that all companies need to sell experiences in the future or otherwise become a commodity created a bad mood. Fear does not push people to change; it paralyzes them.
Conclusion: Dassault Systèmes’ new 3DExperience is understandable as a way to introduce a bigger picture than PLM alone. Whether every company needs THE EXPERIENCE approach remains to be seen. In addition, I believe DS still needs to work on more understandable examples where the 3DE approach is a differentiator. For sure there is PLM inside.
A week after the PLM Innovation conference in the US, I have time to write down my impressions. It was the first time this event was organized in the US, after successful events in Europe in past years. For me it was a pleasure to meet some of my PLM friends in real life, as most of my activities are in Europe.
With an audience of approximately 300 people, there were a lot of interesting sessions. Some of them ran in parallel, but as all sessions are recorded I will soon catch up with the sessions I missed.
My overall impression of the event: Loud and Positive, which is perhaps a typical difference between the US and Old Europe.
Here are some impressions from sessions that caught my attention.
Kevin Fowler, Chief Architect Commercial Airplanes Processes and Tools at The Boeing Company, presented the PLM journey BCA went through. Their evolution path is very similar to the one Siemens and Dassault Systèmes went through (driven by Boeing’s challenges).
Impressive was the number of parts that need to be managed per aircraft (up to a billion), with all its related information. Interesting to see that the number of parts for the 787 has strongly decreased.
Looking back at PLM Generation 1, based on Teamcenter, and Generation 2, based on Dassault, Kevin demonstrated that functionality and cost of ownership increased due to more complexity, while usability evidently decreased.
And this will be a serious point of attention for Generation 3, the PLM system BCA will be selecting for 2015 and beyond: usability has to increase.
And as we were among all the PLM vendors and customers, during the breaks there were discussions about which PLM vendor would be the preferred next PLM partner. I had a discussion related to PLM vision and visibility with one of the SAP partners (DSC Software Inc.). He is convinced that SAP provides one of the best PLM platforms. I am not convinced, as I still see SAP as a company that wants to do everything, starting from ERP. And as long as their management and websites do not reflect a PLM spirit, I remain unconvinced. In 2015 I might be proven wrong in my impression that PLM, usability and SAP are not connected.
Note: browse to this SAP PLM rapid-deployment solution page and view the Step by Step guide. The heading then becomes “SAP CRM rapid-deployment solution”. A missing link, marketing, or do they know the difference between PLM and CRM?
Next, Nathan Hartman from Purdue University described his view on future PLM, which will be model-based, and presented how PLM tools could work together, describing a generic architecture and interfaces. This is somewhat the way the big PLM vendors describe their platforms too, only in their case in a more proprietary environment.
Nathan gave an interesting anecdote related to data sharing. As an example he mentioned a 3D model that was built by one student, where he asked another student to make modifications to it. This was already a challenge: even working with the same software led to knowledge issues, trying to understand the way the model was built. It demonstrates that PLM data sharing is not only about having the right format and application; the underlying knowledge also needs to be exposed.
Monica Schnitger, as a business analyst, presented her thoughts on PLM justification. Where in Munich I presented the Making the case for PLM session, Monica focused on a set of basic questions that you need to ask (as a company) and on how you can justify a PLM investment. It is not for the big kids anymore, and you can find her presentation here (with another PLM definition).
I liked the approach of keeping things simple, as sometimes people make PLM too complex (also because it serves their own businesses). Monica argued that a company should define its own reasons for why and how to do PLM. Here I have a slightly different approach. Often mid-market companies do not want PLM; they have pains they want to get rid of or problems they want to solve. Starting from the pain, and with guidance from a consultant, companies will understand which PLM practices they could use and how these fit in a bigger picture, instead of using plasters to fix the pain.
Beth Lange, Chief Scientific Officer at Mary Kay, presented how her organization, operating from the US (Texas), manages a portfolio of skin care products sold around the world by an independent local sales force. In order to do this successfully and meet all the local regulatory requirements, they implemented a PLM system in which a central repository of global information is managed.
The challenge for Mary Kay is that it is originally a company focused on skin care products with an indirect sales force, where the salesperson sometimes has no IT skills, so this project was also a big cultural change. Beth explained that the support from Kalypso was indeed crucial to manage the change – something I believe is always crucial in a global PLM project, where the ideal implementation is so different from the current, mainly isolated practices.
As regulatory compliance is an important topic for skin care products, Beth explained that due to the compliance rules for China, where they have to expose their whole IP, the only way to protect their IP was to put a patent on everything, even on changes.
Would NPI mean New Patent Introduction in the CPG market?
Ron Watson, Director, PLM COE and IT Architecture at Xylem Inc., presented their global PLM approach. As the company is relatively young (2011) but is a collection of businesses from all around the world, they have the challenge of operating as a single company and sharing the synergy.
Ron introduced PDLM (Product Data Lifecycle Management) and explained that the first focus was on getting all data under control and making it the single source for all product data in a digital format, preferably with a minimum of translation needed.
Here you see Xylem has chosen an integrated platform and not best-of-breed applications. Once the product data is under control, the focus can shift to standardizing processes across the company – something other companies that have followed this approach confirm brings huge benefits.
As it was a PTC case study, Graham Birch, senior director of Product Management at PTC, did the closing part – unfortunately by demoing some pieces of the software. A pity, as I believe people are not impressed by seeing data they recognize on a screen. Only when there is a new paradigm to demonstrate, related to usability, would I be interested.
And as if they had read my mind, Daniel Armour from Joy Global demonstrated the value and attractiveness of 3D visualization tools in their organization. Joy Global is a manufacturer of some of the biggest mining equipment, and he demonstrated how 3D visualization can be used in the sales and marketing process, but also during training and analysis of work scenarios.
His demonstration showed again that 3D as a communication layer is attractive and appeals to the user (serious gaming in some cases).
As it was an SAP case, I was surprised to hear the words from Brian Soaper, explaining the power of 3D for SAP users and how they will benefit from better understanding, higher usability, etc. It was as if a 3D-CAD/PLM vendor was talking – was this a dream?
I woke up from this dream when someone from the audience asked Daniel how they keep the visualizations up to date – is there a kind of version management? Daniel mentioned that currently there is not, but that you could build a database to perform check-in/check-out of the data. Apparently all the 3D we saw is not connected to the single database SAP always promotes.
Peter Bilello, CIMdata’s president, had a closing session with the title Evaluating the tangible benefits from PLM can prove complex, which indeed is true. Peter’s presentation was partly similar to the presentation he gave earlier this year in Munich. Some people in the audience mentioned that many times it is the same story and that many of the issues Peter presented are somehow known facts. And this is what I appreciate about CIMdata: PLM does not change per conference or per new IT hype. If you want to understand PLM, you need to stick to the purpose and meaning of PLM. And these known facts apparently are not so well known: a majority of PLM projects are executed or led by people who decided to reinvent the wheel, as inventing the wheel seems cheaper than renting a wheel, and this leads again to issues later that every experienced consultant could foresee.
The evening – a champagne reception on a paddle boat touring the lake, followed by a dinner at the lakeside – concluded this first day.
The combination of presentations, scheduled network meetings and enough networking time made it a successful first day.
The next day I started with a BOM management Think Tank, where the target was to arrive at some common practices and understanding of BOM management. As the number of participants was large and the time was short, we only had a chance to scratch the surface of the cases brought in.
What was clear to me from this session is that most of the reported challenges were due to the fact that the tools were already in place, and only afterwards did the PLM team (mostly engineering people) have to struggle to turn them into a consistent process. They do not get real help from PLM vendors or implementers, as their focus is on selling more tools and services.
What is missing for these organizations is a PLM for Dummies methodology guide, which is business-centric instead of technology-centric. For sure there are people who have published PLM books, but either they are not found or not considered relevant. And as nothing comes for free, these companies try to reinvent the wheel. PLM is serious business.
The first keynote speech of the second day was from Dantar Oosterwal, Partner and President of the Milwaukee Consulting Group, who inspired us with Lean and PLM: Operational Excellence, all related to his experiences at Harley-Davidson.
It was interesting how he described the process of focusing on throughput to get market results. There are various parameters with which you can influence market share – a price strategy, increased marketing – but the biggest impact on Harley-Davidson’s sales results came from innovation: more model variants meaning more choice for potential customers. By measuring and analyzing the throughput of the organization, an optimal tuning could be found.
Dantar also shared an interesting anecdote about an engineer who had to study the impact of ethanol as a fuel for a certain engine. After a certain time the engineer came back with the answer: yes, we can. He answered the question but left no knowledge behind. When a similar question about performance was asked of a supplier, he came back with an answer plus graphs explaining what the answer was based upon. This answer created knowledge, as it could be reused for similar questions. It is a good example of how companies should focus on collecting knowledge in their PLM environment instead of answers to a question.
The second keynote speaker came from the world’s biggest brand: Christopher Boudard, PLM Director at the Coca-Cola Company. With its multiple brands and global operations, it is a challenge to work towards a single PLM platform. He explained that at this stage they are still busy loading data into the system, where a lot of time is spent on data cleansing, as the system only has value when the data is clean and accurate.
And this requires a lot of motivation from the PLM team to keep the executive management involved in and sponsoring a project that takes five years to consolidate data, and only then, through the right processes, makes sure the data remains correct.
Christopher demonstrated in a passionate manner that leadership is crucial for such a project to be successful and implemented. For me as a European it was interesting to see that the PLM Director of the world’s biggest brand is a French citizen, inspiring the management of such an American company.
Monica Schnitger conducted an interesting session about the state-of-the-state of multi-platform PLM.
If you cannot understand this title: it was a debate between the PLM vendors (Aras, Autodesk, Dassault Systèmes, PTC, SAP and Siemens) about openness, interoperability, cloud and open source.
After the first question from Monica about the openness of each vendor’s system, it was clear there are no problems to expect in the future: all systems were extremely open, according to the respondents. I somewhat lost my attention for the debate, as I had the feeling I was listening to an election debate. Monica did her best to make it an unbiased discussion; however, when some people want to make a specific point and use every question to jump to it, it becomes an irritation.
Chad Jackson, this time dressed up as the guy who always gets killed in the first 5 minutes of a Star Trek episode, shared with us the early findings of the 2012 State of PLM. Tech4PD followers – and who is not a follower? – understood he lost the bet of the second episode.
Chad, let me know if this picture needs to be removed, as it could kill your future career.
The preliminary findings Chad shared with us were that manufacturing and service are significantly interested in and consumers of PLM data, but do not consider it their own data, to which they also have to contribute. The fact that the data is available makes them use it, but these departments still do not participate actively in PLM. Somehow this confirms the observation that PLM is still considered an engineering tool, not an enterprise-wide platform.
As the initial group of participants (n = 100) is small and not randomly selected from the overall population, the question remains what the state of PLM in 2012 really is. I assume Chad will come back to that at a later time.
The last plenary sessions – David Karamian from Flextronics and Michael Grieves with A Virtual Perfect Future – had the ungrateful position of being the last two speakers of this event. I will have to review David’s presentation again, as it was not easy to digest and recall a week later what his highlights were. Michael’s presentation was easier to digest, and with the new upcoming concepts and technology I also believe the virtual perfect future is there.
Looking back on a successful event, where I met many of my PLM peers from across the ocean, I will take the upcoming weeks to review the sessions I missed. The final piece of good news for all PLM mind sharers is that CIMdata and MarketKey announced the coordination of their upcoming events next year – more content and more attendees guaranteed.
If you are reading blogs related to PLM, I am sure you have seen a blog post from Stephen Porter (Zero Wait State), for example: The PLM state: the walking dead – PLM projects that never end.
Like Stephen, I am often triggered by an inspiring book, a touching movie or a particular song, and combined with my PLM-twisted brain I relate the content to PLM (there is no official name for this abnormality yet).
When driving home last week, I was listening to Phil Collins – In the Air Tonight.
As I was just coming back from a discussion about PLM tools, BOMs and possible PLM expansion strategies at a company, with customers and resellers, my twisted brain was thinking about two PLM-related topics that were in the air tonight (at least for me).
Granularity vs. Integration: Suites vs. Best-in-class PLM
You must have noticed it, and if not, now you are aware: Jim Brown and Chad Jackson started a PLM duel discussion platform at Engineering.com to bring PLM-related topics to the table: Tech4PD. Watch them argue, and I hope that with your feedback and the feedback of the PLM community, it will help you make up your mind.
The topic they discussed in their first session was the two different approaches you can take to PLM: either start from a best-in-class PLM platform, or build your PLM support by using dedicated applications and integrating them.
This is in my opinion one of the fundamental PLM topics to discuss. And if I had to vote (as Jim and Chad ask you to do), I would first vote for Chad (integration of software) and, on second thought, vote for Jim (best-in-class PLM). So you see my problem.
If I relate the discussion to my experiences with different companies, I realize that probably both answers are correct. If you are an OEM, you would likely benefit from a best-in-class PLM platform, as PLM systems aim to cover and integrate all data through the product lifecycle in a single system, with a single data model, etc. So a good PLM platform would have the lowest cost of ownership in the long term. OEMs are by definition not the smallest companies and in general have the highest need for global coverage.
But not every company is an OEM. Many mid-market companies are specialized suppliers, serving different OEMs, and although they also develop products, it is in a different market-delivery context. There is a need to be flexible: as the products they supply to OEMs might become obsolete in the near term, they need to be reactive, and the best-in-class companies innovate and are proactive. For that reason they do not want to invest in a best-in-class PLM system, which somehow brings some rigidness, but prefer to keep optimizing those areas of their organization where improvement is needed, instead of changing the organization.
I believe this question will remain in the air until we get a clear split between these two types of PLM. There is a trend of splitting classic PLM (OEM-oriented) from new upcoming PLM solutions. Until that time, we will be confused by the two approaches. It is a typical PLM disease, and the reason you do not see the same discussion for ERP is obvious: ERP is much more a linear process that, for both OEMs and mid-market companies, aims to manufacture products or goods at a single location. The differentiation is in global manufacturing. Where do you manufacture your products? Here the OEMs might have the bigger challenge. Global manufacturing is a PLM challenge too, which is in the air.
Where is the MBOM?
This is the most-visited topic on my blog, and I am preparing a session with the MBOM as theme, combined with PLM, for the upcoming PLM Innovation US conference at the end of October in Atlanta. I am not going to disclose all the content here, but I will give you some thoughts that are in the air.
Companies historically manage their BOM in ERP, but as a result of globalization they now need to manage their manufacturing BOM at different locations, and each location has its own ERP and a local (M)BOM. What to do? The sketch below illustrates the data problem.
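A hypothetical sketch of that situation in Python – the part numbers and site names are invented for illustration, not taken from any real implementation:

```python
# One engineering BOM (EBOM), as released by engineering.
ebom = {
    "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-KIT-30"],
}

# Each manufacturing site derives its own MBOM in its local ERP:
# local sourcing, extra process materials, site-specific variants.
mbom_per_site = {
    "plant_eu": {
        "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-KIT-30", "PAINT-EU"],
    },
    "plant_us": {
        "PUMP-100": ["HOUSING-10", "IMPELLER-20", "SEAL-KIT-31-LOCAL", "PAINT-US"],
    },
}

# "Where is the MBOM?" is really: which system owns this mapping, and
# how does an engineering change propagate to every site's local ERP?
def sites_affected_by(part: str) -> list[str]:
    return [site for site, boms in mbom_per_site.items()
            if any(part in parts for parts in boms.values())]

print(sites_affected_by("IMPELLER-20"))  # ['plant_eu', 'plant_us']
```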
This is in the air:
I hope you will participate in both discussions that are in the air, either by commenting on this blog, through Tech4PD, on your own blog (Oleg? 😉) or through your participation at PLM Innovation US.
I am looking forward to discussing with you what you believe is in the air tonight.
Conclusion (as usual): It is a busy time – we are heading towards the end of the year, which for some reason is a deadline for many companies. So no long thought processes this time, just what is in the air.
It is interesting to read management books and articles and reflect on the content in the context of PLM. In my previous post How the brain blocks PLM acceptance and in Stephen Porter’s (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.
This summer holiday I was reading The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is an associate professor at the Harvard Business School, and he published this book back in 1997. Apparently not everyone has read it, and I recommend you do if you are involved in the management of a PLM company.
Sustaining technology
Christensen states there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing a lot in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till ….
Disruptive technology
The other technology Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope – no way to serve the customer in the same manner. The technology underperforms if you apply it to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than the improvement path of the sustaining technology, their paths may meet at a certain point. And although it comes from a different set of capabilities, due to the faster improvement process the disruptive technology becomes the leading one, and the companies that introduced it become the new market leaders.
Why leading companies failed
Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following its dynamics. Later he illustrates the concepts with examples from other industries, where leading firms failed and ceased to exist because disruptive technologies overtook them and they were not able to follow that path.
Although the leading companies have enough resources and skills, he illustrates that it is a kind of logical path: big companies will always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as in the beginning they target a different, unclear market; in addition it is not clear where the value of a disruptive technology comes from. So which manager wants to risk his or her career on something uncertain in an existing company?
Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major share position in it. Let this company focus on its disruptive technology, and in case it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost sure to fail. Quote:
Expecting achievement-driven employees in a large organization to devote critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one’s arms in an effort to fly
As the book was written in 1997, it was not in the context of PLM. Now let’s start with some questions.
Is ERP in the stage of sustaining technology?
Here I would say Yes. ERP vendors are extending their functional reach to cover more than the core functionality, for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies Christensen identifies four stages: customers select a product for functionality; when other vendors offer the same functionality, reliability becomes the main differentiator; after reliability the next phase is convenience, and finally price.
From my personal observations, not through research, I would assume ERP for the major vendors is in the phase between convenience and price. If we follow Christensen’s analysis for SAP and Oracle, it means they should not try to develop disruptive technologies inside their organization, nor should they try to downscale their product for the mid-market or add a different business model. Quote:
What goes up – does not go down. Moving to a high-end market is possible (and usually the target) – they will not go to small, poorly defined low-end markets
How long SAP and Oracle will remain market leaders will depend on disruptive technologies that meet the path of the ERP vendors and generate a new wave. I am not aware of any trends in that area, as I am not following the world of ERP closely.
Is PLM in the stage of sustaining technology?
Here I would say No, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions on what a PLM system should provide as core technologies. This makes it hard to measure PLM along the lifecycle of a sustaining technology, with the phases functionality, reliability, convenience and price.
Where the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability and convenience, others are focusing on convenience and price.
Some generalized thoughts passed my mind:
- DS and PTC somehow provoke their customers by launching new directions they believe the customer will benefit from. This makes it hard to call it sustaining technology.
- Siemens claims to develop their products based on what customers are asking for. According to Christensen they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
- All three focus on the high end and should not aim for smaller markets with the same technology. This justifies within DS the existence of CATIA and SolidWorks, and within Siemens the existence of NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up to others.
Disruptive technologies for PLM
Although PLM is not a sustaining technology in my opinion, there are some disruptive technologies that might come into the picture of mainstream PLM.
First of all there is the Open Source software model, introduced by Aras, which initially is not considered a serious threat by the classical PLM players – “big customers will never rely on open source”. However, the Open Source model allows product improvements to move faster than mainstream, reaching at a certain point the same level of functionality, reliability and convenience. The risk for Open Source PLM is that it is customer-driven, which according to Christensen is the major inhibitor of disruptive steps in the future.
Next there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk is operating from a comfort zone: it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and medium-sized market. Not too many barriers to enter the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM a bad connotation for its cloud-based product? Time will tell.
The management of Kenesto has obviously read the book. Although the initial concept came from PLM++ (a bad marketing name), they do not compete with mainstream PLM and aim their product at a different audience: business process automation. If their product then picks up in the engineering/product domain, it might enter the PLM domain in a disruptive manner (all according to the book – they would become market leaders).
Finally there are Search Based Applications, which are also a disruptive technology for the PLM domain. Many companies struggle with the structured-data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; the validation and formalization is often done in the brains of experienced employees. Why could search-based technology not be used to create structured data and replace or support the experienced brain?
If I open my Facebook page, I see new content related to where I am and what I have been saying or surfing for. Imagine an employee’s desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from the structured system in the background; other data might be displayed based on logical search criteria – the way our brain works. Some startups are working in this direction, and Inforbix (congratulations Oleg & team) has already been acquired by Autodesk, as Exalead was by DS.
For both companies, if they believe in the above concept, they should remain independent of the big parent company as long as possible, as according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.
Conclusion
This blog post was written during a relaxing holiday in Greece. The country is in a crisis; they need disruptive politicians. They had them 3500 years ago, and I noticed the environment is perfect for thinking, as you can see below.
Meanwhile I am looking forward to your thoughts on PLM: in which state we are, and what the disruptive technologies are.
The brain has become popular in the Netherlands in the past two years. Brain scientists have been publishing books sharing their interpretations of various topics of human behavior and the brain. The common theme of all: the brain influences your perceptions, thoughts and decisions without you even being aware of it.
Some even go so far as to claim that certain patterns in the brain can be proof that you have a certain disorder. It can be for better or for worse.
“It was not me that committed this crime; it was my brain and more…”
Anyway, this post will be full of quotes, as I am not the brain expert, while still giving the brain an important role (even in PLM).
“My brain? That’s my second favorite organ” – Woody Allen
It is good to be aware of the influence of the brain. I wrote about this several times in the past, when discussing PLM vendor/implementer selection or even the decision for PLM. Many of my posts are related to the human side of justifying and implementing PLM.
As implementing PLM is for me primarily a business change rather than a combination of IT tools to implement, it might be clear that understanding the inhibitors of PLM change is important to me.
In the PLM communities, we still have a hard job agreeing among each other on the meaning of PLM and on where it differs from ERP. See for example this post, and in particular the comments on LinkedIn (if you are a member of this group): PLM is a business process, not a (software) tool.
And why is it difficult for companies to implement PLM beside ERP (and not as an extension of ERP)? Search for PLM and ERP and you will find zillions of thoughts and answers (mine too).
The brain plays a major role in the “Why PLM? We have ERP” battle (blame the brain). A week ago I read an older publication by Charles Roxburgh (published in May 2003 for McKinsey) called Hidden flaws in strategy, subtitled: Can insights from behavioral economics explain why good executives back bad strategies? You can read, hear and download the full article here if you are a registered user.
The article was written long before the financial and global crises were on the agenda, and Mr. Roxburgh describes 8 hidden flaws that influence our strategic decision making (and PLM is a strategy). I recommend all of you to read the full article, so the quotes I make below will be framed in the bigger picture as described by Mr. Roxburgh. Note: all quotes below are from his publication.
Flaw 1: Overconfidence
We often make decisions with too much confidence and optimism, as the brain makes us feel overconfident and over-optimistic about our own capabilities.
Flaw 2: Mental accounting
Avoiding mental accounting traps should be easier if you adhere to a basic rule: that every pound (or dollar or euro) is worth exactly that, whatever the category. In this way, you will make sure that all investments are judged on consistent criteria and be wary of spending that has been reclassified. Be particularly skeptical of any investment labeled “strategic.”
Here I would point to the difference in IT spending and budget when you compare ERP and PLM. ERP spending is normal (or “strategic”), whereas PLM spending is not understood.
Flaw 3: The status quo bias
People would rather leave things as they are. One explanation for the status quo bias is aversion to loss—people are more concerned about the risk of loss than they are excited by the prospect of gain.
Another reason why adopting and implementing PLM in an organization is more difficult than, for example, just automating what we already do.
Flaw 4: Anchoring
Anchoring can be dangerous—particularly when it is a question of becoming anchored to the past
PLM has been anchored to being complex and expensive. Autodesk is trying to change that anchoring. Other PLM-like companies stop talking about PLM due to the anchoring and name what they do differently: 3DExperience, Business Process Automation, …..
Flaw 5: The sunk-cost effect
A familiar problem with investments is called the sunk-cost effect, otherwise known as “throwing good money after bad.” When large projects overrun their schedules and budgets, the original economic case no longer holds, but companies still keep investing to complete them.
I have described several cases anonymously in the past, where companies kept on investing in and customizing their ERP environment in order to achieve PLM goals. Although it never reached the level of acceptance and quality a PLM system could offer, stopping these projects was impossible.
Flaw 6: The herding instinct
This desire to conform to the behavior and opinions of others is a fundamental human trait and an accepted principle of psychology.
Warren Buffett put his finger on this flaw when he wrote, “Failing conventionally is the route to go; as a group, lemmings may have a rotten image, but no individual lemming has ever received bad press.”
A quote within a quote, but so true. Innovative thinking, such as introducing PLM in a company, requires a change. Who needs to be convinced? If you do not have consensus (which usually happens, as PLM is vague), you battle against the other lemmings.
Flaw 7: Misestimating future hedonic states
Social scientists have shown that when people undergo major changes in circumstances, their lives typically are neither as bad nor as good as they had expected—another case of how bad we are at estimating. People adjust surprisingly quickly, and their level of pleasure (hedonic state) ends up, broadly, where it was before
A typical situation every PLM implementation faces: users complaining they cannot work as efficiently anymore due to the new system, and that their work will be a mess if we continue like this. Implementers start to customize quickly, and we are trapped. Let these people ‘suffer’, with the right guidance and motivation, for some months (but this is sometimes not the business model the PLM implementer pushes, as they need services as income).
Flaw 8: False consensus
People tend to overestimate the extent to which others share their views, beliefs, and experiences—the false-consensus effect. Research shows many causes, including these:
- confirmation bias, the tendency to seek out opinions and facts that support our own beliefs and hypotheses
- selective recall, the habit of remembering only facts and experiences that reinforce our assumptions
- biased evaluation, the quick acceptance of evidence that supports our hypotheses, while contradictory evidence is subjected to rigorous evaluation and almost certain rejection; we often, for example, impute hostile motives to critics or question their competence
- groupthink, the pressure to agree with others in team-based cultures
Although positioned as number 8 by Mr. Roxburgh, I would almost put it at the top when referring to PLM and PLM selection processes. So often a PLM decision has not been made in an objective manner, and PLM selection paths are driven to arrive at the conclusion we already knew. (Or is this my confirmation bias too?)
Conclusion
As the scientists describe, and as Mr. Roxburgh describes (read the full article!), our strategic thinking is influenced by the brain, and you should be aware of that. PLM is a business strategy, and when rethinking your PLM strategy tomorrow, be prepared to avoid the flaws mentioned in this post today.
The problem with a TLA is that there is a limited number of combinations that make sense. And even once you have found the right meaning for a TLA like PLM, you discover so many different interpretations.
For PLM I wrote about this in my post PLM misconceptions: PLM = PLM?
I can imagine that an (un)certain person who wants to learn about PLM might get confused (and should be – if you take it too seriously).
In the end your company’s goal should be how to drive innovation and increase profitability and competitiveness, not how it is labeled.
As a frequent reader of my blog, you might have noticed I have written about ALM at times, and here a similar confusion might exist, as there are three ALMs that might be considered in the context I am blogging about.
Hence this post, to clarify which ALM I am dedicated to.
So first I start with the other ALMs:
ALM = Application Lifecycle Management
This is an upcoming discipline in the scope of PLM, due to the fact that embedded software increasingly becomes part of the product in the product development world. And as in PLM, where we want to manage the product data through its lifecycle, ALM should become a logical part of a modern PLM implementation. Currently most of the ALM applications in this context are isolated systems dealing only with the software lifecycle; see this Wiki page.
ALM = Asset Lifecycle Management (operational)
In 2009 I started to focus on (my type of) ALM, Asset Lifecycle Management, and I discovered the same confusion as when you talk about a BOM. What BOM really means is only clear when you understand the context. Engineers will usually think of an Engineering BOM, representing the product as specified by engineering (managed in PDM). The rest of the organization will usually imagine the Manufacturing BOM, representing the product the way it will be produced (managed mostly in ERP).
The same is valid for ALM. The majority of people in a production facility, plant or managed infrastructure will consider ALM the way to optimize the lifecycle of assets. This means optimizing the execution of the plant: when to service or replace an asset? What types of MRO activities to perform? It sounds a lot like ERP, and as it has a direct, measurable impact on finance, it is the area that gets most of the management’s attention.
ALM = Asset Lifecycle Management (information management)
Here we talk about the information management of assets. When you maintain your assets only in an MRO system, it is similar to a manufacturing company using only an ERP system. You have the data for operations, but you do not have the processes in place to manage the change and quality of the data. In the manufacturing world this is done in PDM and PLM systems, and I believe owners/operators of plants can learn from that.
I wrote a few posts about this topic – see Asset Lifecycle Management using a PLM system, PLM, CM and ALM – not sexy, or Using a PLM system for Asset Lifecycle Management requires a vision – and I am not going to rewrite them in this post. So get familiar with my thoughts if you are reading about ALM in my blog for the first time.
What I wanted to share is that, thanks to modern PLM systems, IT infrastructure/technologies and SBA, it becomes achievable for owners/operators to implement an Asset Lifecycle Management vision for their asset information, and I am happy to confirm that in my prospect and customer base I see companies investing in and building this ALM vision.
And why do they do this?
- Reduce maintenance time (incidental and planned) by days or weeks, because people work with the right and complete data. Depending on the type of operations, one week less maintenance can bring in millions (power generation, high-demand/high-cost chemicals and more) – see the back-of-the-envelope sketch after this list.
- Reduce failure costs dramatically. As maintenance is often a multi-disciplinary activity, errors due to miscommunication are considered normal in this industry (10% and even more). It is exactly this multi-disciplinary coordination that PLM systems can bring to this world. And the more you can do in a virtual world, the more you can assure you do the right thing during the real maintenance activities. This applies to the same industries as the previous bullet, but also to industries using high-cost materials and resources, where the impact of reducing failure costs is high.
- Improve the quality of data. Often the MRO system contains a lot of operational parameters that were entered at a certain time by a certain person with certain skills. Although I used the word “certain” three times, the result is uncertainty, as there is no separate tracing and validation of the parameters per discipline, and an uncertain person looking at the data might not discover there is an error until it goes wrong. Industries where a human error can be dramatic (nuclear, complex chemical processes) benefit the most.
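To make the first bullet concrete, here is a back-of-the-envelope sketch. All numbers are invented for illustration (the production value per day, maintenance budget and days saved are assumptions, not figures from any real project):

```python
# Hypothetical plant: value of production lost per day of downtime.
production_value_per_day = 2_000_000  # e.g. a power-generation unit, in $

# If complete and correct asset information shortens a planned
# maintenance stop by a few days...
days_saved = 5
print(production_value_per_day * days_saved)  # 10000000 -> $10M recovered

# ...and multi-disciplinary coordination cuts the failure costs that
# are considered "normal in this industry (10% and even more)":
maintenance_budget = 5_000_000  # assumed yearly maintenance spend, in $
failure_rate = 0.10
print(maintenance_budget * failure_rate)  # 500000.0 -> $0.5M at stake
```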
Conclusion: PLM-system-based ALM implementations are more and more becoming reality next to the operational ALM world. After spending more than three years focused on this area, I believe we can see and learn from the first results.
Are you interested in more details, or do you want to share your experience? Please let me know and I will be happy to extend the discussion.
Note: on purpose I used as many TLAs as possible to make this look like a specialist blog, but you can always follow the hyperlink to the wiki explanation where a TLA occurs for the first time.
Sorry for the provoking title of this PLM blog post, but otherwise you would not have read it till the end.
In the past months I have been working closely with several large companies (not having a mid-market profile). And although they are all in different industries and have different business strategies, they still had these common questions and remarks:
- How to handle more and more digital data and use it as valuable information inside the company or for their customers/consumers?
- What to do with legacy data (approved in the previous century) and legacy people (matured and graduated in the previous century) preventing them from changing?
- We are dreaming of a new future, where information is always up to date and easy to access – will this ever happen?
- They are in the automotive industry, manufacturing industry, infrastructure development and maintenance, plant engineering, construction and plant maintenance
- They all want data to be managed with (almost) zero effort
- And please, no revolution or change for the company
Although I have been focusing on the mid-market, it is these bigger enterprises that introduce new trends, and as you can see from the observations above, there is a need for a change. But it also looks like the demands contradict each other.
I believe it is just about changing the game.
If you look at the picture to the left, you see one of the contradictions that led to PLM.
Increasing product quality, reducing time to market and meanwhile reducing costs seemed to be a contradiction at that time too.
Change?
Although PLM has not (yet) been implemented in every company that could benefit from it, it looks like the bigger enterprises are looking for more:
- the P in PLM becomes vague – it is no longer only the product that has the focus; it is also the whole context around the product that might influence it that they want to take into consideration
- the L in PLM remains – they still want to connect all information related to the lifecycle of their products or plants
- the M for Management has a bad association – companies believe that moving from their current state towards a managed data environment is a burden. Too much overhead is the excuse not to manage data, and their existing environments for managing data do not excel in user-friendliness. Therefore people jump to using Excel.
Next
So if the P is no longer relevant and the M is a burden, what remains of PLM?
Early June I presented the topic of digital Asset Lifecycle Management for owners/operators at the Dassault Systèmes 3DExperience forum. It is one of the areas where I believe PLM systems can contribute a lot to business value and profitability (quality and revenue – see Using a PLM system for Asset Lifecycle Management).
Attending the keynote speech, it was clear that Dassault Systèmes no longer talks about PLM as the vision. Their future dream is a (3D) lifelike experience of the virtual world. Based on that virtual model, you implement the best solution according to various parameters: revenue, sustainability, safety and more. By managing the virtual world first, you have the option to avoid costly real prototypes or damaging mistakes.
I believe it is an ambitious dream, but it fits the observations above. There is more beyond PLM.
In addition, I learned from talking with my peers (the corridor meetings) that Siemens and PTC are also moving towards a more industry- or process-oriented approach, trying to avoid the association with the generic PLM label.
Just as Autodesk and the mid-market start to endorse PLM, the big three are moving away from this acronym.
This reminds me of what happened in the eighties when 3D CAD was introduced. At the time the mid-market was able to move to mainstream 3D (the price/performance ratio changed dramatically), the major enterprises started to focus on PDM and PLM. So it is logical that the mid-market is 10 – 15 years behind new developments – they cannot afford experiments with new trends.
So let’s see what the new trends are:![]()
- The management of structured and unstructured data in a single platform. We see the rise of Search Based Applications and business intelligence based on search and semantic algorithms. Using these capabilities integrated with a structured (PLM?) environment is the next big thing.
- Apps instead of generic applications that support many roles. The generic applications introduce such complexity in the interface that they become hard to use for a casual user. Most enterprise systems, but also advanced CAD or simulation tools with thousands of options, suffer from this complexity. Wouldn't it be nice if you only had to work with a few dedicated apps, as we do in our private lives?
- Dashboards (BI) that can be created on the fly, representing actual data and trends based on structured and unstructured data. It reminded me of a PLM/ERP discussion I had with a company, where the general manager kept stating which types of dashboards he wanted to see. He did not talk about PLM, ERP or other systems – he wanted the on-line visibility.
- Cloud services are coming. Not necessarily to centralize all data on the cloud to reduce IT cost, but look at SIRE and other cloud services that support a user with data and remote processing power at the moment required.
- Visual navigation through a light 3D model, providing information when required. This trend is not so recent, but so far it has not been integrated with other disciplines – the Google Maps approach for 3D.
So how likely are these trends to change enterprise systems like PLM, ERP or CRM? In the table below I indicated where they could apply:
As you can see, the PLM row has all the reasons to introduce new technologies and change the paradigm. For that reason, combined with the observations I mentioned in the beginning, I am sure a new TLA (Three Letter Acronym) is upcoming.
The good news is that PLM is dynamic and on the move. The bad news for potential PLM users is that the confusion remains – there are too many different PLM definitions and approaches at the moment – so what will be the next thing after PLM?
Conclusion: The acronym PLM is not dead and is becoming mainstream. At the high end there is for sure a trend towards a wider and different perspective on what was initially called PLM. After EDM, TDM, PDM and PLM, we are waiting for the next TLA.
The trigger for this post was a discussion I had around the Autodesk 360 cloud-based PLM solution. To position this solution and to simplify the message for my conversation partner Joe the plumber, I told him: “You can compare the solution with Excel on-line. As many small mid-market companies are running around with metadata (no CAD files) in Excel, the simplified game changer of this cloud-based PLM offering is that the metadata is now in the cloud, much easier to access, and only a single version exists.”
(Sorry, Autodesk, if I simplified it too much, but sometimes your conversation partner does not have an IT background, as they are plumbers.)
Interestingly enough, Joe said: “But what is the difference with Google Docs or SharePoint, where I can centralize my Excel files too – and Google Docs is like a cloud solution, right?”
He was right, and I had to go more in-depth to explain the difference. This part of the conversation was similar to discussions I had in some meetings with owners/operators in the civil and energy sector, discussing the benefits of PLM practices for their industry.
I wrote about this in previous posts:
Using a PLM system for asset lifecycle management requires a vision
PLM practices for the engineering / construction industry
The trouble with dumb documents
Here it was even more a key point of the discussion that most of the legacy data is stored in dumb documents. And the main reason dumb documents are used is that the data needs to be available during the long lifecycle of the plant, application-independent if possible. So in the previous century this was paper, later scanned documents (TIFF – PDF) and currently mainly PDF. Most of the data now is digital, but where is the intelligence?
The challenge these companies have is that, although information is now stored in digital files, the next step is how to deal with that information in an intelligent manner. A document or an Excel file is a collection of information – you might call it knowledge – but to get access to that knowledge, you first need to find it.
Have you ever tried to find a specific document in Google Docs or SharePoint? You will conclude that the file name becomes very important, and perhaps some keywords.
Is search the solution?
To overcome this problem, full-text search and search based applications were developed that allow us to index and search inside the documents. A piece of cake for Google, and a niche for others to index not only standard documents but also more technical data (drawings, scans from P&IDs, etc.).
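For readers who wonder what "indexing inside documents" means in practice, here is a minimal sketch of the inverted-index idea behind full-text search. It is a deliberate simplification: real search based applications add ranking, stemming and connectors for technical formats, and all document names below are illustrative:

```python
# Minimal sketch of an inverted index: map every word to the documents
# that contain it, then intersect the posting sets of the query words.

from collections import defaultdict

def build_index(documents: dict[str, str]) -> dict[str, set[str]]:
    """Map each word to the set of document names containing it."""
    index = defaultdict(set)
    for name, text in documents.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return the documents containing all words of the query."""
    words = query.lower().split()
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

# Example usage with hypothetical plant documents
docs = {
    "pid-101.txt": "pump P-101 connects to vessel V-12",
    "datasheet.txt": "pump P-101 design pressure 16 bar",
}
index = build_index(docs)
print(search(index, "pump p-101"))  # both documents match
```

Simple as it is, this sketch already shows the limitation discussed below: the index can tell you where the words occur, but not which of the matching documents is the right one.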
Does this solve the problem?
Partly, because suddenly the user finds a lot more data. Search on Google for the words "right data" and you get 3,760,000,000 hits (or more). But what is the right data? The user can only decide what the right data is by understanding the context.
- Is it the latest version?
- Does it reflect the change we made at that functional position?
- What has changed?
Here comes the need for more intelligent data, and this is typically where a PLM system provides the answer.
A PLM system is able to manage different types of information, not only documents. In the context of a plant or a building, the PLM system would also contain:
- a functional definition/structure (linked to its requirements)
- a logical definition/structure (how is it supposed to be?)
- a physical definition/structure (what is physically there?)
- a location definition/structure (where in the plant/building?)
All of this is version-managed and related to the supporting documents and other types of information, as sketched below. This brings context to the documents and therefore exposes knowledge.
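To illustrate – and only to illustrate; this is not any vendor's data model, and all class and field names are hypothetical – here is a sketch of how these structures and versioned documents could relate:

```python
# A minimal sketch of relating the structures above to versioned
# documents. Linking one document from several structures is what
# gives it context.

from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    revision: str          # e.g. "A.1" - every change creates a new revision

@dataclass
class StructureNode:
    kind: str              # "functional" | "logical" | "physical" | "location"
    identifier: str        # e.g. a tag number or area code
    revision: str
    documents: list[Document] = field(default_factory=list)
    children: list["StructureNode"] = field(default_factory=list)

# The same datasheet gains context by being linked from several structures:
datasheet = Document("P-101 datasheet", "A.1")

functional = StructureNode("functional", "FEED-PUMP-FUNCTION", "A", [datasheet])
logical    = StructureNode("logical",    "PUMP-LOOP-12",        "B", [datasheet])
physical   = StructureNode("physical",   "P-101",               "C", [datasheet])
location   = StructureNode("location",   "AREA-3 / SKID-2",     "A", [datasheet])

# Navigating any structure now answers: which documents apply here,
# and in which revision - the context a dumb file store lacks.
for node in (functional, logical, physical, location):
    print(node.kind, node.identifier, "->", [d.name for d in node.documents])
```

Because the same document is linked from the functional, logical, physical and location structures, each structure answers a different question about it – exactly the context that turns a dumb file into knowledge.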
As there is no automatic switch from dumb documents towards intelligent data, it will be a gradual process to move towards this vision. I see a major role for search based applications to support data discovery: find a lot of information, but then have the capability to capture the result (or generate a digest of the result) and store it connected to your PLM system, where it is managed in the future and provides the context.
Conclusion: We understand that paper documents are out of time. But simply moving these documents to digital files stored in a central location, be it SharePoint or a cloud-based storage location, is a step we will regret ten years from now, as intelligence lies not only inside the digital files but also in their context.


