Over the last month I have not been able to publish much about my experiences, as I have been in the middle of several PLM selection processes for various industries. Now, in a quiet moment looking back, I understand how difficult it is for a company to choose a PLM solution for the future.
I hope this post will generate some clarity and may lead to further discussion with other experts in the audience. I wrote about the do's and don'ts of PLM selection in 2010, and most of that is still valid; however, there is more. Some of the topics explained below:
Do you really need PLM?
This is where it starts. PLM is not Haarlemmerolie, an old Dutch medicine sold since the 17th century as a cure for everything. The first step is to know what you want to achieve and how you aim to achieve it. Just because a competitor has a PLM system installed does not mean they use it properly, or that your company should do it too. If you do not know why your company needs PLM, stop reading and start investigating.
If you are still reading this, you are part of the happy few, as justifying the need for PLM is not easy. Numerous companies have purchased a PLM system just because they thought they needed PLM. Or someone was convinced that buying this software would bring PLM.
In most of these cases there was confusion with PDM. Simply stated: PDM is more of a departmental tool (engineering – multidisciplinary), whereas PLM is a mix of software and infrastructure that connects all departments in a company and supports the product through its entire lifecycle.
Implementing “real” PLM is a business change, as people have to start sharing data instead of pushing documents from department to department. And this business transformation is a journey. It is not a fun journey, nicely characterized in Ed Lopategui’s blog post, the PLM Trail.
Although I believe it is not always that dramatic, Ed set the expectations right. Be well prepared before you start.
Why do companies still want PLM, while it is so difficult to implement?
The main reason is to remain competitive. If margins are under pressure, you can try to be more efficient and get better and faster tools. But by working the old way, you can only become a little better.
Moving from a sequential, information-pushing approach towards an online, global information-sharing manner is a change in business processes. It requires interaction between all stakeholders. Doing things differently requires courage, understanding and trust that you made the right choice. When it goes wrong, there are enough people around you to point fingers at why it went wrong – hindsight is so easy.
Doing nothing and becoming less and less competitive is easier (the boiling frog again), as in that case the outside world gets the blame and there is nobody to point fingers at (although if you understand the issue, you should make the organization aware that its future is at stake).
Why is PLM so expensive?
Assuming you are still reading, and you and your management are aligned on the need for PLM, a first investigation into possible solutions will reveal that PLM is not cheap.
When you calculate the overall investment required for PLM, management often gets discouraged by the estimated costs. Yes, the benefits are much higher, but to realize them you need a clear understanding of your own business and a realistic idea of what the future should look like. The benefits are not in efficiency. The main benefits come from capabilities that allow you to respond better and faster than you could by just optimizing your departments. I recently read a clarifying post addressing this issue: Why PLM should be on every Executive's agenda!
From my experience with PLM projects, it is surprising to learn that companies do not object to spending 5 to 20 times more money on an ERP implementation. It is related to the topic of management by results versus management by means.
PLM is not expensive compared to other enterprise systems. It can become expensive (like ERP implementations) if you lose control. Software vendors have a business in selling software modules, like car dealers have a business in selling you all the options beyond the basics.
The same goes for implementation partners: they have a business in selling services to your company, and they need to find the balance between making money and delivering explainable value. Squeezing your implementation partner will cause a poor delivery. But giving them an open check means that, at a certain moment, someone will stand up and shut down the money drain, as the results are no longer justifiable. I often meet companies at this stage; the spirit has gone. It is all about the balance between costs and benefits.
This happens in all enterprise software projects, and the only cure is investing in your own people. Give your employees the time and priority to work on a PLM project. People with knowledge of the business are essential, and you need IT resources to implement. Do not make the mistake of leaving the business uncommitted to the PLM implementation. Management and middle management often do not take the time to understand PLM, as they are too busy or not educated or interested.
Make business owners accountable for the PLM implementation – you will see stress (it is not their daily job – they are busy), but in the longer term you will see understanding and a readiness of the organization to achieve the expected results.
We are the largest – why select the largest?
When your assignment is to select a new enterprise system, life could be easy for you. Select a product or service from the largest business and your career is saved. Nobody gets blamed for selecting the largest vendor, although if you work for a small mid-sized company, you might think twice.
Many vendors and implementers start their message with:
“…. Market leader in ABC, thought leader in XYZ, recognized by 123”
The only thing you should learn from this message is that this company probably has delivered a trustworthy solution in the past. Looking at the past you get an impression of its readiness and robustness for the future. Many promising companies have been absorbed by the larger ones and disappeared. As Clayton Christensen wrote in The Innovators Dilemma:
“What goes up does not go down”.
Meaning these large companies focus on their largest clients and focus less on the base of the business pyramid (where the majority is), making them vulnerable to disruptive innovation.
Related to this issue there is an interesting post (and its comments), written by Oleg Shilovitsky recently: How many PLM vendors disappear in disruption predicted by Gartner.
Still, when selecting a PLM vendor, it is essential to know whether they have the scale to support you in the future and the vision to guide you there.
The future of PLM lies in managing data in a connected manner, not necessarily coming from a single database and not necessarily using only structured data. If your PLM vendor or implementer is pushing you towards document and file management, they are years behind and not the best choice for your future.
PLM is a big elephant
PLM is considered a big elephant, and I agree – if you address everything PLM can do in one shot. PLM has multiple directions to start from – I wrote about it in: PLM at risk – it does not have a single job
PLM has a huge advantage compared to a transactional system like ERP and probably CRM. You can implement a PLM infrastructure and its functionality step by step in the organization, starting with areas that are essential and produce clear benefits for the organization. That is the main reason PLM implementations can take 2 – 3 years. You give the organization time to learn, to adapt and to extend.
We lose our flexibility?
Nobody in an organization likes to be pushed into a corporate way of working, which by definition is not as enjoyable and flexible as the way you currently work. This is still an area where PLM implementations can improve: provide the user with an environment that does not feel like a rigid system. You see this problem with old, traditional, large PLM implementations, for example at automotive OEMs. For them it is almost impossible to switch to a new PLM implementation, as everything has been built and connected in such a proprietary way that moving to more standard systems and technologies is nearly impossible. Later PLM implementations should learn from these lessons.
PLM vendor A says PLM vendor B will be out of business
One of the things I personally dislike is FUD (Fear, Uncertainty and Doubt). It has become common practice in politics, and I have seen PLM vendors and implementers use the same tactics. The problem with FUD is that it works. Even if the message is not verifiable, the company looking for a PLM system might think there must be some truth in the statement.
My recommendation to a company that encounters FUD during a PLM selection process: be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain why they are the perfect solution; instead they indirectly tell you they are the least bad.
Is the future in the cloud?
I think there are two different worlds. There is the world of smaller businesses that do not want to invest in an IT infrastructure and will try anything that looks promising – often tools-oriented. This is one of my generalizations of how US businesses work – sorry for that. They will start working with cloud-based systems and not be scared off by performance, scalability or security, as long as it all stays easy and does not disturb the business too much.
Larger organizations, especially those domiciled in Europe, are not embracing cloud solutions at this moment. They think more in terms of private or on-premise environments and less in cloud solutions, as security of information is still an issue. The NSA revelations prove that there is no moral limit for collecting information in the name of security – combined with the fear of IP theft from Asia, I think European companies have a natural resistance to storing data outside their control.
For sure you will see cloud advocates, primarily coming from the US, claiming this is the future (and they are right), but there is still work to do and confidence to be built.
PLM selection often focuses on checking hundreds of requirements coming from different departments. They want a dream system. I hope this post will convince you that there are many other considerations relevant to a PLM selection that you should take into account. And yes, you still need requirements (and a vision).
Your thoughts?
Last year, I read Clayton Christensen's book “The Innovator's Dilemma – When New Technologies Cause Great Firms to Fail”. I was intrigued by how his theory also applies to PLM and wrote about it in a blog post last year.
Recently, I attended an HBR webinar, “Innovating over the Horizon: How to Survive Disruption and Thrive”, presented by Clayton Christensen and Max Wessel, both professors at Harvard Business School. It raised serious implications for PLM, and I foresaw numerous consequences demanding attention.
I’d like to highlight some observations for you:
- Disruptive innovation will hit any domain – including the PLM domain
- You are less impacted if your products/services are targeting a job to be done
- ERP has a well defined job – so not much discussion there
- PLM does not have a clear job – so it is vulnerable to disruption
- Will PLM disappear?
The above diagram explains it all. Products often come into the market with a performance below customer expectations. The product improves over time, and at a certain moment it reaches that expectation level. Through sustaining innovation, the company keeps improving its product(s) to attract more customers, and starts delivering more than a single customer is asking for.
This is certainly the case in PLM. All the PLM vendors are now able to deliver a lot of functionality around global collaboration, covering the whole product lifecycle. Companies that implement PLM use just a fraction of these capabilities and still have additional demands. Still, the well-known PLM vendors nearly always win when a company is searching for a new PLM solution.
Disruption comes from other technologies and products. In the beginning, they are not even considered by companies in that product space as a possible solution. As these products improve over time, at a certain moment they reach a level of functionality and performance where a potential customer can use them to address their demands.
At this stage, the disrupters will nearly always win the battle. The reason is that they are closer to what the customer wants than the incumbents. Their product performance and price point are most likely more attractive than the incumbents' portfolio.
Translating this to PLM it would mean: “Do not look for PLM systems as they already provide too much functionality, way above the line of customer desire”
As a PLM consultant, I need to provide some second thoughts to keep my job. There is much more behind Prof. Christensen's theory, and I recommend reading his books before agreeing with what I write! And although there is a horizontal time axis along which the disruptive technology comes in, it does not indicate whether that will be this year or next year.
If you are aware that disruption can kill your business, how likely is it that it will happen in your business and when?
Professor Christensen makes two key points:
- Disruption will always happen, but this does not mean it is going to be fast and totally overtake the old products. It might be a slower and less complete process than expected. Here I was thinking about disruptive cloud technology, which came in fast at the consumer level, but will it reach the business level too and overrule the classical PLM platforms in the same manner? I am not sure about that (yet)
- If your company’s value lies in delivering products, instead of delivering the means to get the job done for your customer, you are extremely vulnerable to disruption.
As companies are looking to get their job done in the most efficient manner, they will switch at any time to new solutions that provide a better way to get the job done, often with a better performance and at a lower price point.
ERP has a well defined job
I realized that this is one of the big differences between PLM and ERP. Why is there such a discussion around the need for PLM, while I do not catch the same messages from the ERP domain? Maybe because I am a PLM consultant?
ERP has a clear mission: “To get the job done – deliver a product as efficiently and as fast as possible to the customer”. ERP is an execution system. Although ERP vendors also deliver more than their individual customers ask for, the job is more clearly defined.
PLM does not have a clear job
For PLM, it becomes fuzzy. What is the job that PLM does? Here we get a lot of different answers. Have a look at these definitions from some vendors:
CIMdata calls PLM “the most effective investment you can make to achieve product leadership.” AMR Research says “Companies committed to time to value in product innovation certainly cannot succeed without a sound PLM foundation.”
Product Lifecycle Management, or PLM, is a driver of successful product development, and a strategic contributor to business value across the enterprise. PLM helps product manufacturers manage complex, cross-functional processes, coordinating the efforts of distributed teams to consistently and efficiently create the best possible products
For companies of any size, Autodesk PLM 360 helps to streamline your business processes for more efficient product development, improved profitability, and higher product quality.
I also reviewed the websites of the other PLM vendors, and I can confirm: none of them talks in a clear way about which job needs to be done. All PLM messaging is about technology and products.
Companies want to get the job done
And here I come back to the webinar’s conclusion. If you want to secure your future as a company, you need to focus on the job to be done. And even better, focus on the experience to do the job and the best integration of these experiences in a total framework. See the slide below:
My interpretation is that PLM has not even reached level 1. Still many companies are struggling to understand the fundamental need(s) for PLM.
It is interesting to see that Dassault Systèmes, in their messaging and approach, is already targeting level 2 – the experiences. Whether potential customers will embrace the experience approach without passing level 1 is something to observe.
Will PLM disappear?
In my December 2008 blog post PLM in 2050 and recently in The Innovator’s dilemma and PLM, I wrote that I believe PLM, as it is currently defined, will disappear – perhaps made redundant by a collection of disruptive technologies. The main reason is that PLM does not do a single, clear job.
One of these disruptive candidates, in my opinion, is Kenesto. They deliver “social business enterprise software to empower teams”, as stated on their website. Kenesto is not considered a competitor of classic PLM, as it starts on a different trajectory. For sure there will be more disruptive candidates aiming at different pieces of the PLM scope.
What do you think:
- Does PLM have too many jobs?
- Will PLM survive disruption?
Last week I attended the Product Innovation Conference in Berlin, an event that revitalized the discussions and information exchange around PLM.
I have been blogging about this event since it started in London in 2011, followed by Munich and Atlanta the year after, and now Berlin. The event has grown in size, both in the number of speakers and of participants. There were many parallel sessions per interest stream, and for that reason I cannot give a full overview of the event as I did in my previous blog posts.
This time I will describe only my personal highlights, being aware there was much more to learn. A nice service to the more than 350 attendees is that they will soon be able to see all sessions online, as they were all recorded.
Some of my personal highlights
The first keynote speaker was Steve Wozniak, for me one of the people who changed my professional life. The Apple IIe was my first affordable personal computer, with which I explored a new world of automation: the peeks and pokes, the analog/digital converter, programming, and application software like VisiCalc. I somehow feel the same excitement with 3D printing. How is this going to affect our future life?
The Apple IIe was an innovation, and Steve Wozniak led us through the successes and failures he experienced within Apple. Steve's presentation was a clear motivation for all of us to think differently and to keep our goals in mind, instead of following common sense or conforming to the organization. There will be failure but also success if you are clear about your goals. Engineers should follow their creativity and be original, instead of copying books. Creativity and innovation are like humor (some have it and some never will). It was a good, inspiring start to the two days, and these themes came back several times.
During the rest of the day, I learned about The Human Factor and Managing Cultural Change from Dagmar Heinrich – which, when ignored, can lead to a damaged bike or car.
Stan Przybylinski provided interesting statistics and insights about investments in discrete manufacturing related software around the world (US, Japan, Germany, India, China) demonstrating there is still an enormous gap between the traditional economies in the west and the emerging countries.
An excellent presentation was given by Caterpillar – Beth Hinchee representing the PLM / business side, John Berg representing the IT/Infrastructure side, combined with Accenture Rüdiger Stern – Innovation and Product Development Lead.
Their presentation was a blueprint for how large PLM implementations should be executed, and a confirmation of what I am preaching.
As a PLM implementation is always about changing the way a company works, you need to make sure there is strong involvement from both business and IT. Without a third party that brings in best practices, coaching, and moderation between the two disciplines, it often fails due to different viewpoints and a different focus. The role of the consultancy partner is to be the glue, the motivator, and the source of outside experience from other implementations. While a company normally has experience with one or two PLM implementations, a consultancy firm should be able to bring in much more experience from all its customer engagements.
In the afternoon, Michael Grieves, author of Virtually Perfect: Driving Innovative and Lean Products through Product Lifecycle Management, talked about the value of innovation starting from virtual products, and how they contribute to faster-maturing, better-validated products, benefiting from a lower overall investment in innovation. He also stated it is more important to focus on practices than on standardized processes inside PLM.
This matched perfectly with my presentation, Innovation loves PLM, explaining the huge value that PLM brings to innovation in relation to the company's culture and its approach towards open innovation.
The two closing keynote sessions of the afternoon were interesting. Peter Bilello from CIMdata talked about The Future of PLM: Enabling Radical Collaboration. For me it was the first time I saw such a change from CIMdata, now looking forward to the upcoming generation instead of presenting more of the common, consolidated PLM wisdom. My blog buddy Oleg wrote about it in more detail in his recent blog post: Product Development as we have known it is dying.
The last session of the day was from Marc Chapman: Designing the World Land Speed Record. It was inspiring for all of us, demonstrating the beauty and challenges of engineering when trying to break the world land speed record. See more at bloodhoundssc.com. Not so much PLM related, but full of challenges and a need for innovative approaches.
And after a networking session with drinks and a short night …
The next day started with an inspiring speech – please pay extra attention to this topic. Massoud Hassani, born in Afghanistan, is striving for awareness of the global landmine problem through his innovative mine-decommissioning device, the Mine Kafon. Traditional mine discovery and detonation programs are expensive. Affected countries and the UN are not spending significant money to solve the problem, as an exploding mine is no longer world news (unless a famous person is involved).
Still, people get injured or killed by these mines – forgotten victims. Have a look at Massoud's project on kickstarter.com and get inspired where you can contribute. Massoud's initial design was based on his childhood experiences and the knowledge gained at the design academy; he is now looking for engineering support to optimize his extremely low-cost but innovative solution.
Some other sessions from the second day: the lessons learned from a previous failed PLM project at Andritz: When Things Go Wrong: How to Put Them Right. They decided not to follow the common approach that many companies take – one size (one type of PLM) fits all – and, learning from their failed PLM project, are now rolling out several PDM systems.
This presentation somehow has a connection to what Marc Halpern from Gartner mentioned. One of my favorite opening statements he made about PLM upgrades was:
“Upgrading your PLM system is like rewiring the house with the electricity on”.
As Gartner's focus is more on the IT side of the business, he explained that current PLM implementations cannot be maintained in the long term, as they become too expensive and complex to maintain. He mentioned the risk that, when selecting one provider for PLM, you would probably suffer from vendor lock-in. This made me think: what about implementing SAP PLM? The SAP message is clear: one single platform for PLM and execution!
The counter to this vendor lock-in is to work towards open standards. Here I attended the EUROSTEP session: Achieving business benefits by using PLM standards such as STEP and PLCS.
Currently I am involved in several projects where standardization of data for the long term and efficient data exchange between various systems are important. It is somewhat a battle against the odds. Standardization makes small steps forward, but it requires companies to have a long-term vision, and most of the time they choose the short-term proprietary data formats of their software vendors. As time and less complexity mean money, the problem will surface later, for the next generation of managers and software.
Of course this always has to be considered in the context of the dynamics of your industry – the longevity of data plays an important role.
The second-to-last keynote speaker of the day was Prof. Martin Eigner, a long-term visionary and icon when it comes to PLM. Prof. Eigner again provoked the audience by stating that almost no company has actually implemented PLM.
Most companies are stuck with a form of PDM combined with complex customizations. They do not keep it simple – PLM is for product development and definition, and ERP is only for execution. Companies tend to invest in their expensive ERP systems, which have less impact on the future business than PLM and innovation have.
Companies should invest much more in the design process, as this is where almost 70% of the costs are defined and where innovative products are born. To innovate better, we should add model-based engineering, which includes the steps of systems engineering, to the design process. Prof. Eigner proposed a new term for PLM: sysLM. His speech was consistent and logical to all of us. But why do companies not adopt this vision?
I will come back to that in my conclusion.
The last keynote speech was from Doug McCuistion, program manager of the NASA Curiosity Mars exploration mission. Doug guided us through all the challenges the mission went through. He shared with us the reasons for the mission, the complexity and challenges of the landing procedure, and the discoveries expected. It was the last session of the congress, and I feel sorry for those who had to leave earlier for their travels, as it was the most inspiring session of the congress: going for the almost impossible, such a contrast to the “boring” world of PLM.
And here comes the link between NASA´s Curiosity project and Prof Eigner´s PLM presentation.
The Curiosity project is a challenge not of this planet; it is on the edge of what is possible and has no competition (unless you count government budget cuts). For most other companies, the challenge lies on this earth, and they want to stay ahead of the competition. Here it is about being able to fund your innovation and assure future funding by introducing innovative products to the market that generate enough margin to invest in the future. PLM presentations seem “boring” because the business value is not clear to the management (they do not attend PLM conferences); management gets more enthusiastic about short-term financial figures.
One of the (younger) attendees told me that it was impressive to see so many PLM icons at this conference – but where is the new generation of PLM icons-to-be?
Fixing this disconnect is probably related to the magic we need to find to bring Innovation and PLM to the next generations.
Who starts ???
- The conference has become a “must” for companies looking for experiences related to PLM: why and how PLM contributes to your business
- Companies are looking at their second PLM implementation trail. From their previous mistakes they learned that it is not an IT-only project, that business should be leading, and that the cloud becomes an option.
- The awareness of a new upcoming generation of workers. Everyone is aware of it, still at PLM conferences we are waiting for the first thought leaders of this generation to speak.
- Excitement comes from innovations that seem to be unachievable. Some go extremely fast, some detonate mines and some go to Mars, the rest has to be achieved in a competitive and global market.
Innovation loves PLM.
PLM is a popular discussion topic in various blogs, LinkedIn discussion groups, PLM vendor websites and at the upcoming Product Innovation congress in Berlin. I look forward to the event, to meet attendees and discuss their experiences and struggles improving their businesses using PLM.
On the other hand, talking about pure PLM becomes boring. Sometimes it looks as if PLM is a monotheistic topic:
- “What is the right definition of PLM?” (I will give you the right one)
- “We are the leading PLM vendor” (and they all are)
- A PLM system should be using technology XYZ (etc, etc)
Some meetings with customers in the past three weeks and two different blog posts I read recently made me aware of this ambiguity between boring and fun.
PLM dictating Business is boring
Oleg Shilovitsky's sequence of posts (and comments), starting with A single bill of materials in 6 steps, was an example of the boring part. (Sorry Oleg – you publish so many posts; there are many that I like, and some I can use as an example.) Reading the BOM-related posts, I noticed they are a typical example of an IT or academic view on PLM, in particular on the BOM topic.
Will these posts help you after reading them? Do they apply to your business? Or do you feel more confused because a prolific PLM blogger makes you aware of all the different options and makes you think you should use a single bill of materials?
I learned from my customers, and from coaching and mediating hundreds of PLM implementations, that the single-BOM discussion is one of the most confusing and complex topics – certainly if you address it from the IT perspective.
The customer might say:
“Our BOM is already in ERP – so if it is a single BOM you know where it is – goodbye !”.
A different approach is to start looking for the optimal process for this customer, addressing the bottlenecks and pains they currently face. It will be no surprise that PLM best practices and technology are often the building blocks of the considered solution. Whether it will be a single BOM or a collection of structures evolving through time depends on the situation, not on the ultimate PLM system.
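To make the single-BOM discussion a bit more concrete, here is a minimal, hypothetical sketch (the part numbers, names, and structures are all invented for illustration, not taken from any customer case): engineering and manufacturing can legitimately need different structures over the same shared set of part definitions, which is why "one single BOM" is rarely the whole story.

```python
# Hypothetical example: one shared catalog of part definitions,
# with two BOM views that differ per lifecycle stage.

parts = {
    "P-100": "Pump assembly",
    "P-110": "Motor",
    "P-120": "Housing, machined",
    "P-121": "Housing, raw casting",
}

# Engineering view (EBOM): how the product is designed.
ebom = {"P-100": ["P-110", "P-120"]}

# Manufacturing view (MBOM): how it is actually built - the machined
# housing is produced from a raw casting, so the structure differs.
mbom = {"P-100": ["P-110", "P-120"], "P-120": ["P-121"]}

def leaves(bom, root):
    """Return the parts that are purchased or raw material in this view."""
    children = bom.get(root)
    if not children:
        return [root]
    result = []
    for child in children:
        result.extend(leaves(bom, child))
    return result

print(leaves(ebom, "P-100"))  # ['P-110', 'P-120']
print(leaves(mbom, "P-100"))  # ['P-110', 'P-121']
```

The same top-level product yields a different "bottom" per view, so the real design question is how the views relate and evolve, not which single structure wins.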
Business dictating PLM is fun
Therefore I was happy to read Stephen Porter's opinion and comments in The PLM state: Pennywise Pound Foolish Pricing and PLM, where he passes a similar message to mine from a different starting point: the pricing models of PLM vendors. My favorite part is in his conclusion:
A PLM decision is typically a long term choice so make sure the vendor and partners have the staying power to grow with your company. Also make sure you are identifying the value drivers that are necessary for your company’s success and do not allow yourself to be swayed by the trendy short term technology.
Management in companies can become confused and start to think they just need PLM because they hear from the analysts that it improves business. They need to think first about solving their business challenges and changing the way they currently work in order to improve, and only then look for the way to implement this change.
Changing the way to work is the problem, not PLM.
It is not the friendly user interface of PLM system XYZ or the advanced technical capabilities of PLM system ABC that will make a PLM implementation easier. Nothing is solved in the cloud or by using a mobile device. If nothing changes when implementing PLM, why implement and build a system to lock yourself in even more?
This is what Thomas Schmidt (VP Head of Operational Excellence and IS at ABB’s Power Products Division) said last year at PLM Innovation 2012 in Munich. He was one of the keynote speakers and surprised the audience by stating he did not need PLM!
He explained this by describing the business challenges ABB has to solve: Being a global company but acting around the world as a local company. He needed product simplification, part reduction among product lines around the world, compliance and more.
Another customer, in a totally different industry, mentioned they were looking to improve global instant collaboration, as their current information exchange is too slow and error-prone. In addition, they want to capitalize on the work done and make it accessible and reusable in the future, independent of the authoring tool. But they do not call it PLM, as in their business nobody uses PLM!
Both cases should make PLM resellers' mouths water (watertanden in Dutch), as these companies are looking for key capabilities available in most of the PLM systems. But neither of these companies asked for a single BOM or a service-oriented architecture. They wanted to solve their business issues. And for sure it will lead to implementing PLM capabilities, once business and IT people together define and decide on the right balance.
Management take responsibility
And here lies the management responsibility of these companies. It is crucial that a business issue (or a new strategy) is the driving force for a PLM implementation.
In too many situations, the management decides that a new strategy is required. One or more bright business leaders decide they need PLM (note: the strategy has now changed into buying and implementing a system). Together with IT, and after an extensive selection process, the selected PLM system (disconnected from the strategy) will be implemented. This disconnect is part of the reason:
- why PLM projects are difficult
- why it is unclear what PLM does
At this stage, PLM vendors and implementers are no longer connected to the strategy or the business. They implement technology and do what the customer project team tells them to do (or what they think is best for their business model).
Successful implementations are those where the business and management are actively involved during the whole process and the change. And this requires a significant contribution from their side, often delegated to business and change consultants.
PLM implementations usually lead to a crisis at some moment in time, when the business is not leading and the focus is on IT and user acceptance. In the optimal situation, business is driving IT. In most cases, however, due to lack of time and priorities on the business side, this activity is delegated to IT and the implementation team. And then it is a matter of luck whether they will be successful:
- How experienced is the team?
- Will they really implement a new business strategy, or just automate the way the customer worked before, now in a digital manner?
- Do we blame the software when the people do not change?
Back to fun
I would not be so passionate about PLM if it were boring. However, looking back, the fun and enthusiasm do not come from PLM itself. The fun comes from a proactive business approach, where motivating the people and preparing the change come first, before implementing PLM practices.
I believe the future success of PLM technologies lies in learning to speak the language of real business value, and only then using (PLM) technologies to deliver it.
PLM becomes a logical result, not the start.
And don´t underestimate: change is required.
What do you think – is it a dream ?
First of all, happy new year to all of you. As there is no “End of the World” risk anymore in the near future, we can start looking forward and set our goals for the next 5 years – or is it a 7-year plan, Oleg?
Christmas, the moment the light returns on the Northern hemisphere, plus the food, cycling and the preparations for the next Product Innovation conference in Berlin were the drivers for this blog post.
The title might give you the impression that it is an IQ quiz: “Which word does not fit in this sequence?” Well, it's not – they are all related. Let's put them in chronological order.
Frogs existed first, and were exploring the world before us humans. Paleontologists assume they had no notion of what was global. In their world it was probably a few ponds in size. For certain, they did not have anything to do with innovation. At that time, survival depended on the slow process of evolution.
Millions of years later, the first Homos appeared on the earth's surface: Homo sapiens, Homo erectus, Homo ludens and perhaps more. They all had something in common: instead of waiting for the ongoing evolution, they started to innovate in parallel. First by walking upright, then by using a more advanced language to communicate and learning to make tools to achieve more. Their world was still within a reasonable walking distance, and probably they started to eat frogs.
This evolution continued for thousands of years. Human beings started to spread around the world and in waves they brought innovation. They built stone temples, learned to sail, discovered gunpowder, electricity, the universe, the internet and more. It is interesting to see that every time a major innovation was born, these innovators enriched their region in wealth and culture, using their innovation as a competitive advantage to dominate their neighbors.
In many cases 1000 years later, this innovation became a commodity and other civilizations stood up with their innovation and dominated their regional environment which became bigger and bigger in size. Where possible they made use of the cheap resources (modern word for what was initially called slaves) to enrich their civilization. For certain, the most civilized were eating frogs!
Market expansion – innovation pace
During the last century, the pace of innovation went faster and faster. New ways of communication and transportation became available and affordable, which made it impossible for innovations to stay within a specific civilization. Innovation became available for everyone around the world and the domination shifted towards companies and markets.
Companies with a strategy to innovate discovered that new ways were needed to respond faster than before to market opportunities. This was the driving force behind PDM, as a first attempt to get a better grip on and understanding of their fast-evolving, more complex products, which required more and more global collaboration between design teams.
PDM is now accepted as critical by all manufacturing companies around the world, to guarantee quality and efficiency. Customer focus became the next demand from the market and interestingly enough, the demand for frogs decreased.
However this wave of innovation was followed by a wave with even greater impact on the global society. New technologies, the availability of internet and social media, suddenly changed society. Combined with the financial crisis in the US and Europe, it became clear that the way we worked in the past is no longer the way to survive in the future.
Faster and global
PLM was introduced early this century as a new strategy to become more customer-centric: being able to respond faster and better to market demands by bringing innovation to the market before the competition. PLM requires a different approach to how companies work internally and interact with the (global) outside world. Implementing the PLM vision requires change, and as it cannot be an evolutionary process spread over several generations, it has to be a business change. However, in general, human beings do not like rapid change. Here the frogs come back into the picture, now as the boiling frog metaphor.
It is based on a 19th-century anecdote describing a frog slowly being boiled alive. The premise is that if a frog is placed in boiling water, it will jump out, but if it is placed in cold water that is slowly heated, it will not perceive the danger and will be cooked to death. The story is often used as a metaphor for the inability of people to react to significant changes that occur gradually. This metaphor is very applicable to the classical way companies bring their products to the market, where innovation is more a lucky coincidence than the result of a strategy.
Here it all comes together again.
Innovation is the only way for companies to avoid becoming a commodity – unable to differentiate themselves for potential customers. Now the title of this post should be clear: “Do not be a boiling frog; use PLM to support your innovation and become available for the global market.”
As the new year has started and it is still time to extend your good intentions, add Innovation, PLM and Change to your survival list.
I look forward to your comments and hope to discuss with you the relation between PLM and Innovation during the upcoming Product Innovation event in Berlin, where I present a session with the title: “PLM loves Innovation ?”
(when you know me, you know the answer, but there are always surprises)
It is interesting to read management books and articles and reflect on their content in the context of PLM. In my previous post How the brain blocks PLM acceptance and in Stephen Porter's (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.
This summer holiday I was reading The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is a professor at the Harvard Business School, and he published this book back in 1997. Apparently not everyone has read it; if you are involved in the management of a PLM company, I recommend you do.
Christensen states there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing a lot in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till…
The other technology Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope and so cannot serve the customer in the same way. The technology underperforms if you apply it to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than the improvement path of the sustaining technology, their paths may meet at a certain point. And although it comes from a different set of capabilities, due to the faster improvement process the disruptive technology becomes the leading one, and the companies that introduced it become the new market leaders.
Why leading companies failed
Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following its dynamics. Later he illustrates the concepts with examples from other industries where the leading firms failed and ceased to exist, because disruptive technologies overtook them and they were not able to follow that path.
Although the leading companies have enough resources and skills, he illustrates that it is a kind of logical path – big companies will almost always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as they target a different, unclear market in the beginning; in addition, it is not clear where the value of a disruptive technology comes from. So which manager wants to risk his or her career to focus on something uncertain in an existing company?
Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major share position there. Let this company focus on its disruptive technology, and if it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost sure to fail, quote:
Expecting achievement-driven employees in a large organization to devote critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one's arms in an effort to fly
As the book was written in 1997, it was not in the context of PLM. Now let´s start with some questions.
Is ERP in the stage of sustaining technology?
Here I would say yes. ERP vendors are extending their functional reach to cover more than the core functionality, for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies Christensen identifies four stages: customers select a product for functionality; when other vendors offer the same functionality, reliability becomes the main differentiator; after reliability the next phase is convenience, and finally price.
From my personal observations, not through research, I would assume ERP for the major vendors is in the phase between convenience and price. If we follow Christensen's analysis for SAP and Oracle, it means they should not try to develop disruptive technologies inside their organization, nor should they try to downscale their product for the mid-market or add a different business model. Quote:
What goes up does not go down. Moving to a high-end market is possible (and usually the target) – they will not go to small, poorly defined low-end markets
How long SAP and Oracle will remain market leaders depends on disruptive technologies that will meet the path of the ERP vendors and generate a new wave. I am not aware of any trends in that area, as I am not following the world of ERP closely.
Is PLM in the stage of sustaining technology?
Here I would say no, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions of what a PLM system should provide as core technologies. This makes it hard to measure PLM along the lifecycle of a sustaining technology with the phases functionality, reliability, convenience and price.
Where the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability and convenience, others focus on convenience and price.
Some generalized thoughts passed my mind:
- DS and PTC somehow provoke their customers by launching new directions they believe the customer will benefit from. This makes it hard to call their technology sustaining.
- Siemens claims to develop its products based on what customers are asking for. According to Christensen, they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
- All three focus on the high end and should not aim for smaller markets with the same technology. This justifies within DS the existence of CATIA and SolidWorks, and within Siemens the existence of NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up for others.
Disruptive technologies for PLM
Although PLM is not a sustaining technology in my opinion, there are some disruptive technologies that might come into the picture of mainstream PLM.
First of all, there is the open source software model, introduced by Aras, which initially is not considered a serious threat by the classical PLM players – “big customers will never rely on open source”. However, the open source model allows product improvements to move faster than the mainstream, reaching at a certain point the same level of functionality, reliability and convenience. The risk for open source PLM is that it is customer-driven, which according to Christensen is the major inhibitor of disruptive steps in the future.
Next there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk is operating from a comfort zone: it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and medium-sized market. Not too many barriers to entering the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM the wrong label for its cloud-based product? Time will tell.
The management of Kenesto has obviously read the book. Although the initial concept came from PLM+ (a bad marketing name), they do not compete with mainstream PLM and aim their product at a different audience: business process automation. If their product then picks up in the engineering/product domain, it might enter the PLM domain in a disruptive manner (all according to the book – they would become market leaders).
Finally, there are Search Based Applications, also a disruptive technology for the PLM domain. Many companies struggle with the structured data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; the validation and formalization is often done in the brains of experienced employees. Why can search-based technology not be used to create structured data and replace or support the experienced brain?
If I open my Facebook page, I see new content related to where I am and what I have been saying or searching for. Imagine an employee's desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from the structured system in the background; other data might be displayed based on logical search criteria – the way our brain works. Some startups are working in this direction, and Inforbix (congratulations Oleg & team) has already been acquired by Autodesk, as Exalead was by DS.
If both companies believe in the above concept, they should remain independent from their big parent company as long as possible, as according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.
This blog post was written during a relaxing holiday in Greece. The country here is in a crisis, they need disruptive politicians. They did it 3500 years ago and I noticed the environment is perfect for thinking as you can see below.
Meanwhile I am looking forward to your thoughts on PLM: in which state are we, and what are the disruptive technologies?
The problem with a TLA is that there is a limited number of combinations that make sense. And even once you have found the right meaning for a TLA like PLM, you discover there are many different interpretations.
For PLM I wrote about this in my post PLM misconceptions: PLM = PLM ?
I can imagine an (un)certain person who wants to learn about PLM might get confused (and should be, if you take it too seriously).
In the end, your company's goal should be how to drive innovation, increase profitability and competitiveness – not how it is labeled.
As a frequent reader of my blog, you might have noticed I have sometimes written about ALM, and here a similar confusion might exist, as there are three ALMs that might be considered in the context I am blogging about.
Hence this post, to clarify which ALM I am dedicated to.
So first I start with the other ALMs:
ALM = Application Lifecycle Management
This is an upcoming discipline in the scope of PLM, as embedded software increasingly becomes part of the product in the product development world. And as in PLM, where we want to manage the product data through its lifecycle, ALM should become a logical part of a modern PLM implementation. Currently most of the ALM applications in this context are isolated systems dealing only with the software lifecycle; see this Wiki page.
ALM = Asset Lifecycle Management (operational)
In 2009 I started to focus on (my type of) ALM, called Asset Lifecycle Management, and I discovered the same confusion as when you talk about a BOM. What a BOM really means is only clear when you understand the context. Engineers will usually think of an Engineering BOM, representing the product as specified by engineering (managed in PDM). The rest of the organization will usually imagine the Manufacturing BOM, representing the product the way it will be produced (managed mostly in ERP).
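To make that context dependency concrete, here is a minimal sketch of the same product seen as an Engineering BOM and as a Manufacturing BOM. All part numbers and structures below are invented for illustration; a real PDM or ERP system would of course add revisions, quantities and effectivity.

```python
# Sketch: one product, two BOM views. All part numbers are invented.

ebom = {  # Engineering BOM: the product as specified (PDM world)
    "PUMP-100": ["HOUSING-01", "IMPELLER-03", "SEAL-KIT-07"],
}

mbom = {  # Manufacturing BOM: the product as it will be produced (ERP world)
    "PUMP-100": ["SUBASSY-A", "SEAL-KIT-07", "PACKAGING-02"],
    "SUBASSY-A": ["HOUSING-01", "IMPELLER-03", "GREASE-05"],
}

def flatten(bom, item):
    """Collect all leaf parts reachable from an item in a BOM view."""
    parts = []
    for child in bom.get(item, []):
        if child in bom:                  # the child is itself an assembly
            parts.extend(flatten(bom, child))
        else:                             # the child is a leaf part
            parts.append(child)
    return parts

print(sorted(flatten(ebom, "PUMP-100")))
print(sorted(flatten(mbom, "PUMP-100")))
```

The point of the sketch is only that asking for “the BOM” is ambiguous: the engineering view and the manufacturing view of the very same product differ in structure and content, so the answer depends on who is asking.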
The same is valid for ALM. The majority of people in a production facility, plant or managed infrastructure will consider ALM as the way to optimize the lifecycle of assets. This means optimizing the execution of the plant: when to service or replace an asset, what types of MRO activities to perform. It sounds a lot like ERP, and as it has a direct, measurable impact on finance, it is the area that gets most of the management's attention.
ALM = Asset Lifecycle Management (information management)
Here we talk about the information management of assets. When you maintain your assets only in an MRO system, it is similar to a manufacturing company using only an ERP system. You have the data for operations, but you do not have the processes in place to manage the change and quality of the data. In the manufacturing world this is done in PDM and PLM systems, and I believe owners/operators of plants can learn from that.
I wrote a few posts about this topic – see Asset Lifecycle Management using a PLM system, PLM CM and ALM – not sexy, or Using a PLM system for Asset Lifecycle Management requires a vision – and I am not going to rewrite them in this post. So get familiar with my thoughts if this is the first time you read about ALM in my blog.
What I wanted to share is that thanks to modern PLM systems, IT infrastructure/technologies and SBA, it has become achievable for owners/operators to implement an Asset Lifecycle Management vision for their asset information, and I am happy to confirm that in my prospect and customer base, I see companies investing in and building this ALM vision.
And why do they do this?
- Reduce maintenance time (incidental and planned) by days or weeks, because people have been working with the right and complete data. Depending on the type of operations, one week less maintenance can bring in millions (power generation, high-demand/high-cost chemicals and more).
- Reduce failure costs dramatically. As maintenance is often a multi-disciplinary activity, errors due to miscommunication are considered normal in this industry (10% and even more). It is exactly this multi-disciplinary coordination that PLM systems can bring to this world. And the more you can do in a virtual world, the more you can assure you do the right thing during the real maintenance activities. This applies to the same industries as in the previous bullet, but also to industries where costly materials and resources are used; there the impact of reducing failure costs is high.
- Improve the quality of data. Often the MRO system contains a lot of operational parameters that were entered at a certain time, by a certain person, with certain skills. Although I used the word certain three times, the result is uncertainty, as there is no separate tracing and validation of the parameters per discipline, and an uncertain person looking at the data might not discover there is an error until it goes wrong. Industries where a human error can be dramatic benefit the most from this (nuclear, complex chemical processes).
Conclusion: PLM-system-based ALM implementations are more and more becoming reality next to the operational ALM world. After spending more than three years focused on this area, I believe we can see and learn from the first results.
Are you interested in more details, or do you want to share your experience? Please let me know and I will be happy to extend the discussion.
Note: On purpose I used as many TLAs as possible, to make this look like a specialist blog; you can always follow the hyperlink to the wiki explanation where a TLA occurs for the first time.
Sorry for the provoking title in a PLM blog, but otherwise you would not read my post till the end.
In the past months I have been working closely with several large companies (not having a mid-market profile). And although they were all in different industries and have different business strategies, they still had these common questions and remarks:
- How to handle more and more digital data and use it as valuable information inside the company or for their customers / consumers ?
- What to do with legacy data (approved in the previous century) and legacy people (matured and graduated in the previous century) that prevent them from changing?
- We are dreaming of a new future, where information is always up-to-date and easy to access – will this ever happen ?
- They are in the automotive industry, manufacturing industry, infrastructure development and maintenance, plant engineering, construction and plant maintenance
- They all want data to be managed with (almost) zero effort
- And please, no revolution or change for the company
Although I have been focusing on the mid-market, it is the bigger enterprises that introduce new trends, and as you can see from the observations above, there is a need for change. But it also looks like the demands contradict each other.
I believe it is just about changing the game.
If you look at the picture to the left, you see one of the contradictions that lead to PLM.
Increasing product quality, reducing time to market and meanwhile reducing costs seemed to be a contradiction at that time too.
Although PLM has not been implemented (yet) in every company that could benefit from it, it looks like the bigger enterprises are looking for more.
the L of PLM remains – they still want to connect all the information related to the lifecycle of their products or plants.
the M of Management has a bad association – companies believe that moving from their current state towards a managed data environment is a burden. Too much overhead is the excuse not to manage data, and their existing environments for managing data do not excel in user-friendliness. Therefore people jump to using Excel.
So if the P is no longer relevant and the M is a burden, what remains of PLM?
Early June I presented at the Dassault Systèmes 3DExperience forum on the topic of digital Asset Lifecycle Management for owners/operators – one of the areas where I believe PLM systems can contribute a lot to increased business value and profitability (quality and revenue – see Using a PLM system for Asset Lifecycle Management).
Attending the keynote speech, it was clear that Dassault Systèmes no longer talks about PLM as a vision. Their future dream is a (3D) lifelike experience of the virtual world, and based on that virtual model, implementing the best solution for various parameters: revenue, sustainability, safety and more. By managing the virtual world you have the option to avoid costly real prototypes or damaging mistakes.
I believe it is an ambitious dream but it fits in the above observations. There is more beyond PLM.
In addition I learned from talking with my peers (the corridor meetings) that also Siemens and PTC are moving towards a more industry or process oriented approach, trying to avoid the association with the generic PLM label.
Just at the time that Autodesk and the mid-market started to endorse PLM, the big three are moving away from this acronym.
This reminds me of what happened in the eighties when 3D CAD was introduced. At the time the mid-market was able to move to mainstream 3D (price / performance ratio changed dramatically) the major enterprises started to focus on PDM and PLM. So it is logical that the mid-market is 10 – 15 years behind new developments – they cannot afford experiments with new trends.
- The management of structured and unstructured data in a single platform. We see the rise of Search Based Applications and business intelligence based on search and semantic algorithms. Using these capabilities integrated with a structured (PLM?) environment is the next big thing.
- Apps instead of generic applications that support many roles. The generic applications introduce such complexity in the interface that they become hard to use for a casual user. Most enterprise systems, but also advanced CAD or simulation tools with thousands of options, suffer from this complexity. Would it not be nice if you only had to work with a few dedicated apps, as we do in our private lives?
- Dashboards (BI) that can be created on the fly, representing actual data and trends based on structured and unstructured data. It reminded me of a PLM/ERP discussion I had with a company, where the general manager kept stating what types of dashboards he wanted to see. He did not talk about PLM, ERP or other systems – he wanted the online visibility.
- Cloud services are coming. Not necessarily centralizing all data in the cloud to reduce cost, but look at SIRE and other cloud services that support a user with data and remote processing power at the moment required.
- Visual navigation through a lightweight 3D model, providing information when required. This trend is not so recent, but so far not integrated with other disciplines – the Google Maps approach for 3D.
So how likely are these trends to change enterprise systems like PLM, ERP or CRM? In the table below I indicated where each could apply:
As you can see the PLM row has all the reasons to introduce new technologies and change the paradigm. For that reason combined with the observations I mentioned in the beginning, I am sure there is a new TLA (Three Letter Acronym) upcoming.
The good news is that PLM is dynamic and on the move. The bad news for potential PLM users is that the confusion remains – too many different PLM definitions and approaches currently – so what will be the next thing after PLM ?
Conclusion: The acronym PLM is not dead and is becoming mainstream. At the high end, there is for sure a trend towards a wider and different perspective on what was initially called PLM. After EDM, TDM, PDM and PLM, we are waiting for the next TLA.
The trigger for this post was a discussion I had around the Autodesk 360 cloud-based PLM solution. To position this solution and to simplify the message for my conversation partner, Joe the plumber, I told him: “You can compare the solution with Excel online. As many small mid-market companies are running around with metadata (no CAD files) in Excel, the simplified game changer of this cloud-based PLM offering is that the metadata is now in the cloud, much easier to access, and only a single version exists.”
(Sorry, Autodesk, if I simplified it too much, but sometimes your conversation partner does not have an IT background, because he is a plumber.)
He was right, and I had to go more in-depth to explain the difference. This part of the conversation was similar to discussions I had in some meetings with owners/operators in the civil and energy sectors, discussing the benefits of PLM practices for their industry.
I wrote about this in previous posts:
The trouble with dumb documents
Here it was even more a key point of the discussion that most of the legacy data is stored in dumb documents. And the main reason dumb documents are used is that the data needs to be available during the long lifecycle of the plant, application-independent if possible. So in the previous century this was paper, later scanned documents (TIFF/PDF) and currently mainly PDF. Most of the data is now digital, but where is the intelligence?
The challenges these companies have is that despite the fact information is now stored in a digital file, the next step is how to deal with the information in an intelligent manner. A document or an Excel file is a collection of information, you might call it knowledge, but to get access to the knowledge you need to find it.
Did you ever try to find a specific document in Google Docs or SharePoint? The conclusion will be that the file name becomes very important, and perhaps some keywords.
Is search the solution?
To overcome this problem, full-text search and search-based applications were developed that allow us to index and search inside documents. A piece of cake for Google, and a niche for others to index not only standard documents but also more technical data (drawings, scans of P&IDs, etc.).
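The idea behind these search-based applications can be sketched with a tiny inverted index. This is a hypothetical illustration only (the file names and texts are made up); real engines add ranking, stemming and OCR for scanned drawings on top of the same principle:

```python
# A minimal inverted index: map every word to the set of documents that
# contain it, then intersect the sets at query time.
from collections import defaultdict

documents = {
    "P&ID-001.pdf": "pump P-101 suction line isometric drawing",
    "spec-sheet.pdf": "pump P-101 mechanical seal specification",
    "layout.pdf": "plant layout area 3 piping",
}

index = defaultdict(set)
for name, text in documents.items():
    for word in text.lower().split():
        index[word].add(name)

def search(query):
    """Return the documents containing every word of the query."""
    sets = [index[word] for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

print(search("pump p-101"))
```

Note that a query for “pump P-101” returns both pump documents. The index finds the data, but by itself it cannot tell you which document reflects the latest change, which is exactly the problem discussed next.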
Does this solve the problem?
Partly, as suddenly the user finds a lot more data. Search on Google for the words “right data” and you get 3,760,000,000 hits (or more). But what is the right data? The user can only decide what the right data is by understanding its context.
- Is it the latest version?
- Does it reflect the change we made at that functional position?
- What has changed?
And here comes the need for more intelligent data. And this is typically where a PLM system provides the answer.
A PLM system is able to manage different types of information, not only documents. In the context of a plant or a building, the PLM system would also contain:
- a functional definition / structure (linked to its requirements)
- a logical definition / structure (how is it supposed to be?)
- a physical definition / structure (what is physically there?)
- a location definition / structure (where in the plant / building?)
and all of this is version-managed and related to the supporting documents and other types of information. This brings context to the documents and therefore exposes the knowledge.
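To make the idea of context concrete, here is a small illustrative sketch of how such related, version-managed structures might look. All names and relations are hypothetical; this is not the data model of any specific PLM system:

```python
# Sketch: four perspectives on the same pump, each a node in its own
# structure, linked to each other and to a versioned document.
from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    version: str          # version management: which revision is this?

@dataclass
class Item:
    structure: str        # "functional", "logical", "physical" or "location"
    identifier: str
    documents: list = field(default_factory=list)
    related: list = field(default_factory=list)   # links across structures

# the same pump seen from the four perspectives
functional = Item("functional", "Cooling function F-12")
logical    = Item("logical",    "Pump position P-101")
physical   = Item("physical",   "Pump serial no. 8841")
location   = Item("location",   "Area 3 / Row B")

spec = Document("spec-sheet.pdf", version="C")
physical.documents.append(spec)

# relate the perspectives: function -> position -> physical asset -> location
functional.related.append(logical)
logical.related.append(physical)
physical.related.append(location)

# walking the relations answers "what is physically there, where, and
# which document revision describes it?"
print(physical.identifier, "at", location.identifier,
      "documented by", spec.name, "rev", spec.version)
```

The point of the sketch: the document is no longer found by its file name alone, but through the versioned structures it is attached to, which is exactly the context that turns a dumb document into knowledge.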
As there is no automatic switch from dumb documents to intelligent data, moving toward this vision will be a gradual process. I see a major role for search-based applications in supporting data discovery: find a lot of information, then have the capability to capture the result (or generate a digest of it) and store it connected to your PLM system, where it is managed in the future and provides the context.
Conclusion: We understand that paper documents have had their time. But merely moving these documents to digital files stored in a central location, whether in SharePoint or cloud-based storage, is a step we will regret ten years from now, as intelligent data depends not only on what is inside the digital files but also on their context.