In my previous post describing the various facets of the EBOM, I mentioned classification several times as an important topic related to the PLM data model. Classification is crucial to help people reuse information, and in addition there are business processes that are only relevant for a particular class of information, so classification is not only about search and reuse support.
In 2008, I wrote a post about classification, which you can read here. Meanwhile, the world has moved on, and I believe more modern classification methods exist.
Why classification?
First of all, classification is used to structure information and to support its retrieval at a later moment, either for reuse or for reference later in the product lifecycle. Related to reuse, companies can save significant money when parts are reused. It is not only the design time or sourcing time that is reduced. Additional benefits are a lower risk of errors (fewer discoveries), reduced process and approval time (less human overhead), reduced stock (if applicable), more volume discount (if applicable) and reduced end-of-life handling.
Classification can also be used to control access to certain information (mainly document classification), or to make sure certain processes are followed, e.g. export control, hazardous materials, budget approvals, etc. Although I will speak mainly about part classification in this post, classification can be used for any type of information in the PLM data model.
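As a thought experiment (all class names and process steps below are invented, not taken from any particular PLM system or standard), class-driven processes could be sketched like this:

```python
# Hypothetical sketch: classification drives which business processes apply
# to an item, not just how the item is found again later.
REQUIRED_PROCESSES = {
    "export-controlled": ["export_license_check"],
    "hazardous-material": ["msds_review", "transport_approval"],
    "capital-equipment": ["budget_approval"],
}

def processes_for(classes):
    """Collect every process step triggered by the classes assigned to an item."""
    steps = []
    for cls in classes:
        steps.extend(REQUIRED_PROCESSES.get(cls, []))
    return steps

print(processes_for(["hazardous-material", "export-controlled"]))
# ['msds_review', 'transport_approval', 'export_license_check']
```

The point of the sketch: the process logic lives next to the class definition, so classifying an item correctly is what makes the right processes fire.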
Depending on the industry you are working in, there are various classification standards for parts. When I worked in the German-speaking countries (the DACH-Länder), the most discussed classification at that time was DIN 4000 (Sachmerkmal-Liste), a must-have standard for many of the small and medium-sized manufacturing companies. The DIN 4000 standard has a predefined part hierarchy and describes the necessary properties per class. I had not met a similar standard in other countries at that time.
Another very generic classification I have seen is the UNSPSC standard, again a hierarchical classification supporting everything in the universe, but with no definition of attributes.
Other classification standards, like ISO 13399, RosettaNet, ISO 15926 and IFC, exist to support collaboration and/or the supply chain when you want to exchange data with other disciplines or partners. The advantage of a standard definition (with attributes) is that you can exchange data with less human processing (saving labor costs and time – the benefit of a digital enterprise).
I will not go deeper into the various standards here, as I am not the expert for all of them. Every industry has its own classification standards: a hierarchical standard and, if more advanced, a hierarchy also supported by attributes related to each class. But let's go into the data model part.
Classification and data model
The first lesson I learned when implementing PLM was that you should not build your classification hard-coded into the PLM data model. When working with SmarTeam, it was very easy to define part classes and attributes to inherit. Some customers had more than 300 classes represented in their data model just for parts. You can imagine that it looks nice in a demo. However, when it comes to reality, a hard-coded classification becomes a pain in the model. (left image, one of the bad examples from the past)
1 – First of all, classification should be dynamic and easy to extend.
2 – The second problem with a hard-coded classification is that once a part is defined for the first time, the information object has a fixed class. Later changes need a lot of work (relinking of information / approval processes for the new information).
3 – Finally, the third point against a hard-coded classification is that parts are likely to be classified according to different classifications at the same time. The image below shows such a multiple classification.
So the best approach is to have a generic part definition in your data model and perhaps a few subtypes. Companies still tend to differentiate between hardware (mechanical / electrical) parts and software parts.
Next, a part should be assigned to at least one class, and the assignment to this class brings more attributes to the part. Most PLM systems that support classification have the ability to navigate through a class hierarchy and find similar parts.
When parts are relevant for ERP, they might belong to a manufacturing parts class, which adds the particular attributes required for a smooth PLM – ERP link. Manufacturing part types can be used as templates, to be completed in ERP.
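To make the idea concrete, here is a minimal sketch in Python of such a data model: one generic part type, classes assigned dynamically, each assignment bringing its own attribute group, and multiple classifications allowed on the same part. All class and attribute names are invented for illustration; real PLM systems model this with far more rigor.

```python
# Hypothetical sketch of the data model argued for above: a generic Part,
# with classification assigned dynamically rather than hard-coded as subtypes.
class PartClass:
    def __init__(self, name, attribute_names):
        self.name = name
        self.attribute_names = attribute_names

class Part:
    def __init__(self, number):
        self.number = number
        self.classifications = {}   # class name -> {attribute: value}

    def classify(self, part_class, **values):
        # Adding (or later changing) a classification never alters the part
        # object itself - no relinking, no new information object.
        attrs = {a: values.get(a) for a in part_class.attribute_names}
        self.classifications[part_class.name] = attrs

bearing = PartClass("bearing", ["inner_diameter_mm", "outer_diameter_mm"])
mfg = PartClass("manufacturing-part", ["supplier", "lead_time_days"])

p = Part("P-000123")
p.classify(bearing, inner_diameter_mm=20, outer_diameter_mm=42)
p.classify(mfg, supplier="ACME", lead_time_days=30)  # second classification
print(sorted(p.classifications))  # ['bearing', 'manufacturing-part']
```

Note how the manufacturing-part classification simply adds the ERP-relevant attribute group on top of the technical one, instead of forcing the part into a different hard-coded subtype.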
"Think part of the challenge moving forward is we've always handled these as parts under different methodologies, which requires specific data structures for each, etc. The next-gen take on all this needs to be more malleable perhaps. So there are just parts – be they service, or make/buy, or some combination, say a long-lead functional standard part – and they would acquire the properties, synchronizations, and behaviors accordingly. People have trouble picking the right bucket, and sometimes the buckets change. Let the infrastructure do the work. That would help the burden of multiple transitions, where CAD BOM to EBOM to MBOM to SBOM eventually ends up in a chain of confusion."
I fully agree with his statement and consider this as the future trend of modern PLM: Shared data that will be enriched by different usage through the lifecycle.
Why don’t we classify all data in PLM?
There are two challenges for classification in general.
- The first one is that the value of classification only becomes visible in the long term. I have seen several young companies that were only focusing on engineering: no metadata in the file properties, no part-centric data management structure, and several years later they face a lack of visibility of what has been done in the past. Only if one of the engineers remembers a similar situation is there a chance of reuse.
- The second challenge is that, through a merger or acquisition, a company suddenly has to manage two classifications. If the data model was clean (no hard-coded subclasses), there is hope of merging the information. Otherwise, it might become a painful activity to discover similarities.
SO THINK AHEAD, EVEN IF YOU DO NOT SEE THE NEED NOW!
Modern search based applications
There are ways to improve classification and reuse by using search-based applications, which can index archives and try to find similarities in properties / attributes. Again, if the engineers never filled in the properties in the CAD model, there is little to nothing to recover, as I experienced in a customer situation. My US PLM peer, Dick Bourke, wrote several articles about search-based applications and classification for engineering.com, which are interesting to read if you want to learn more: Useful Search Applications for Finding Engineering Data
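The core idea behind such a similarity search can be shown in a few lines. The scoring below is a deliberately naive sketch (real search-based applications use much richer indexing, tokenization and ranking), and all part data is made up, but it illustrates why filled-in attributes are the raw material: with empty properties there is nothing to match.

```python
# Naive illustration of attribute-based similarity for reuse candidates.
def similarity(requested, candidate):
    """Fraction of requested attributes the candidate matches exactly."""
    if not requested:
        return 0.0
    hits = sum(1 for k, v in requested.items() if candidate.get(k) == v)
    return hits / len(requested)

# A tiny "index" of existing parts and their attributes (invented data).
index = {
    "P-001": {"type": "bolt", "material": "steel", "size": "M8"},
    "P-002": {"type": "bolt", "material": "steel", "size": "M6"},
    "P-003": {"type": "washer", "material": "steel"},
}

wanted = {"type": "bolt", "material": "steel", "size": "M8"}
best = max(index, key=lambda pn: similarity(wanted, index[pn]))
print(best)  # P-001
```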
So much more to discuss on this topic; however, I have reached my 1000 words again.
Classification brings benefits for the reuse and discovery of information, although the benefits are long-term. Think long-term too when you define classifications. Keep the data model simple and add attribute groups to parts based on functional classifications. This enables a data-driven PLM implementation where the power is in the attributes, no longer in the part number. In the future, search-based applications will offer a quick start to classify and structure data.
Everyone wants to be a game changer, yet in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.
Why? Read the following analogy.
With my Dutch roots and passion for soccer, I saw the first example of game changing happen in 1974, in soccer: the game where 22 players kick a ball from side to side, and the Germans win in the last minute.
My passion and trauma started that year, when the Dutch national team changed soccer tactics by introducing totaalvoetbal.
Defenders could play as forwards and the other way around. Combined with the offside trap, the Dutch team reached the final of the soccer world championship both in 1974 and 1978, of course losing the final in both cases to the home-playing teams (Germany in '74, Argentina in '78, with some help from the referee, we believe).
This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept so fast.
At the same time, a game changer for business was emerging: the PC.
In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not common yet, and the first mouse devices and the Intel 8008 processor were just coming to the market.
This was disruptive innovation at the time, as we would realize 20 years later. The PC was a game changer for business.
Johan Cruyff remained a game changer, and when he started to coach and influence the Barcelona team, it was his playing concept, tiki-taka, that brought the Spanish national team and the Barcelona team to the highest, unbeatable level in the world for the past 8 years.
Instead of having strong and tall players to force your way to the goal, it was all about possession and control of the ball. As long as you have the ball, the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to pass when trying to recapture it.
This was a game changer, hard to copy overnight, until the past two years. Now other national and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.
Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.
Also, PLM was supposed to be a game changer back in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game?
PLM at that time connected departments and disciplines with each other in a digital manner, no matter where they were around the globe. And since the information was stored in centralized places, databases and file-sharing vaults, it created the illusion that everyone was working with the same sets of data.
The major successes of PLM in this approach come from efficiency through the digitization of data exchange between departments and the digitization of processes. Already a significant step forward, bringing enough benefits to justify a PLM implementation.
Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, the game change is all about different behavior to reach the goal; it is not about better tools (or shoes).
The PLM picture shows the ideal 2006 view, where each department forwards information to the next department. But where was PLM supporting after sales/services in 2006? And the connection between after sales/services and concept is in most companies not formalized or even existing. Yet exactly that connection should give the feedback from the market, from the field, to deliver better products.
The real game changer starts when people learn and understand how to share data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so other people can find and use it.
People are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making it sharable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.
But our job is no longer safe as we see in the declining economies in Europe and the US. And the reason for that:
Data is changing the game
In recent years, the discussion about BI (Business Intelligence) and Big Data has emerged. There is more and more digital information available, and it has become impossible for companies to own all the data, or even to think about storing the data themselves and sharing it among their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can be shared (theoretically) no matter where you are and no matter which device you are using, there is a huge potential to change the game.
It is a game changer because it is not about just installing new tools and new software. There are two major mind shifts to make.
- It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100 % digital, your customer or supplier may still require a printed and wet-signed document or drawing as a legal confirmation of the transaction. Documents are comfortable containers to share, but they kill fast and accurate processing of the data inside them.
- It is about sharing and combining data. It does not make sense to dump data again into huge databases. The value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, other attributes are created by purchasing, and some come directly from the supplier. Do not fall into the ERP trap of believing that everything needs to be in one system and controlled by one organization.
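A small sketch of what "shared and combined" could mean in practice: each discipline contributes its own attribute group to the same part, and a combined view keeps track of who owns what, instead of one system owning everything. All names and values below are invented for illustration.

```python
# Hypothetical sketch: one part, attributes contributed by several disciplines.
contributions = {
    "engineering": {"material": "AlMg3", "mass_kg": 0.82},
    "purchasing":  {"preferred_supplier": "ACME", "unit_cost_eur": 4.15},
    "supplier":    {"lead_time_days": 21},
}

def combined_view(sources):
    """Assemble one view of the part, keeping the provenance of each attribute."""
    view = {}
    for discipline, attrs in sources.items():
        for name, value in attrs.items():
            view[name] = {"value": value, "owner": discipline}
    return view

view = combined_view(contributions)
print(view["unit_cost_eur"])  # {'value': 4.15, 'owner': 'purchasing'}
```

The design point is that no single discipline's system has to hold (or be allowed to edit) the whole record; the shared view is assembled on demand.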
Because of the availability of data, the world has become global and more transparent for companies. And what you see is that traditional companies in Europe and the US struggle with that. Their current practices are not tuned towards a digital world, but towards the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.
The upcoming economies have two major benefits:
- Not so much legacy; therefore, building a digital enterprise is easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
- The average cost of labor is lower than in Europe and the US; therefore, even if they do not get it right the first time, there is enough margin to spend more resources to meet the objectives.
The diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it is no longer a 100 % classical PLM scope. It is also about social interaction, supplier execution and logistics. These areas are not classical PLM domains, and therefore I have mentioned in the past that the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different?
The big consultancy firms are all addressing this topic – not necessarily at the PLM level:
2012 Cap Gemini – The Digital advantage: …..
2013 Accenture – Dealing with digital technology’s disruptive impact on the workforce
For CEOs, it is important to understand that the new, upcoming generations are already thinking in data (Generation Y and beyond). By nature, they are used to sharing data instead of owning data in many respects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce
This cannot be learned from an ivory tower. The easiest way is not to be worried by this trend and to continue working as before, slowly losing business and margin year by year.
As in many businesses, people are fired for making big mistakes; doing nothing, unfortunately, is most of the time not considered a big mistake, although it is the biggest one.
During the upcoming PI Conference in Berlin, I will talk about this topic in more detail and look forward to meeting and discussing this trend with those of you who can participate.
The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the number of players and balls involved can be unlimited.
The leagues I played in were always limited in scope: by age, locality, region, etc. Therefore, it was easy to win within a certain scope, and there are millions of soccer champions besides me. For business, however, there are almost no borders.
Global competition will require real champions to make it work !!!
When you are in a peaceful holiday accommodation close to the sea, it is all about swimming, reading, sleeping and food. This time I read two books: Profit Beyond Measure by H. Thomas Johnson (2000) and Fast Future by David Burstein (2013).
In an earlier post, PLM Statistics, I already referred to Johnson's book. Now I have had the time to read the whole book. Johnson is an advocate of MBM (Manage By Means), as compared to the more widely practiced MBR (Manage By Results) approach.
In Fast Future, Burstein explains why his generation of Millennials (Generation Y) is not lazy and egocentric (etc.), but different and ready for the future. Different from the Boomers and Generation X.
These two books on two different topics have nothing in common, you might think. But all you need is a PLM-twisted brain, and they will be connected.
Let's start with Profit Beyond Measure.
In his introduction, Johnson explains how manufacturing companies were gradually pushed into an MBR (Manage By Results) approach. The Second World War was the moment companies started to use accounting information to plan business activities. The growing presence of accountants in business arose from increasing financial regulations. Corporate executives were educated by professors of accounting and finance in how to use their accounting information to plan and control business activities.
The result (quoting Johnson):
“..teaching a new generation of managers to put aside understanding the concrete particulars of how business organizes work. They taught them instead to focus exclusively on abstract quantitative generalizations about financial results”
And as he writes a little later:
“The unique feature of the multidivisional organization was the introduction of a level of managers that had not existed before. Managers at this level ran what appeared to be self-standing, fully articulated multifunctional companies known as divisions. The manager of a division, however, reported to a top management group that represented in effect, the market for capital and the market for managers”
The PLM-twisted brain understands that Johnson is describing one of the major inhibitors for PLM. PLM requires departments and individuals TO SHARE information and to work CONCURRENTLY on it. Meanwhile, department and division leaders are trained, pushed and measured to optimize their silo businesses to deliver the right financial results. Executives above this management layer monitor the consolidated numbers and have hardly any understanding of the real business challenges PLM can solve. Here, innovative ways of working are not discussed; numbers (costs / ROI) are discussed.
To proceed with Johnson: he believes in MBM (Manage By Means). Manage By Means can be compared with the way an organic life system behaves. Johnson describes it as:
“Every entity is focusing on doing work, not on manipulating quantitative abstractions about work. In a company this would mean every person’s activity will embody that most fundamental condition of natural life systems – namely that all knowing is doing and that all doing is knowing”
Although Johnson focuses on manufacturing companies (Toyota and Scania being two major examples of MBM), the PLM-twisted mind reads this as a concept that matches the PLM vision.
Everything and everyone is connected to the process, with the understanding of how to interpret the data and what to do. This is how I imagine PLM implementations: provide the right information to every person, no matter where this person is in the lifecycle of the product. Too much automation prevents the system from being flexible and adapting to changes, and in addition it no longer challenges the user to think.
Enough about Profit Beyond Measure; let me end with a quote about Manage By Means:
“…. which will bring a change in thinking for the next generation of managers more revolutionary than that which every previous generation has ever experienced”
Now the Fast Future
In Fast Future, David Burstein talks about his generation, the Millennials, and how they are different. The Millennials are people who are now between 20 and 35. They grew up with one foot in the old analogue world and came to full wisdom in a digital, socially connected manner during several shocking crises that formed their personality and behavior (9/11, the financial crisis, globalization, huge unemployment), according to Burstein. People also refer to them as Generation Y.
In the context of this post, we need to imagine four generations:
- The pre-Boomers, who built up the economy after the Second World War and, as we learned from Johnson, introduced the mechanical thinking for business (MBR – Manage By Results)
- The Boomers (my generation), who had the luxury to study and discuss the ultimate change for the world (make love, not war), idealistic about changing the world, but now most of us work in an MBR mode
- Generation X, who introduced punk and skepticism. They are supposed to be cynical, very egocentric and materialistic. I am sure they also have positive points, but I haven't read a book about them, and you do not (yet) meet Generation X in the context of a particular change to something new
- Generation Y, the Millennials, who are considered by the Boomers to be another lazy generation, surfing the internet all the time, not committing to significant causes, but seeming to enjoy themselves. Burstein, in his book, changes this picture, as we will see below.
According to Burstein, the Millennials are forced to behave differently, as the traditional society is falling apart due to various crises and globalization. They have to invent a new purpose. And as they are so at home with all the digital media, they can connect to anyone or any group to launch ideas and initiatives and build companies. The high unemployment numbers in their generation force them to take action and become entrepreneurs, not always for profit but also for social or sustainability reasons.
They understand they will have to live with uncertainty and change all their lives. No guaranteed job after education, no certain pension later, and much more uncertainty. This creates a different attitude. You embrace change, and you no longer go for a single dream, as many of the Boomers did.
Choosing the areas that are essential to you and where you think you can make a significant impact becomes important. Burstein points to several examples of his generation and the impact they already have on society. Mark Zuckerberg, the Facebook founder, is a Millennial; many modern social apps are developed by Millennials; Obama won the elections twice thanks to the impact and connectivity of the Millennial generation; and the Facebook revolutions in the Middle East (Tunisia / Egypt / Libya) were all led by desperate Millennials who wanted to make a change.
When reading these statements, I wondered:
Would there also be Millennials in Germany?
In Germany, the impact of 9/11, the financial crisis and unemployment was not felt that much. Are they for that reason the same as Generation X? Perhaps a German reader of millennial age can provide an answer here?
What I liked about the attitude described by Burstein is that the Millennials network together for a better cause, a meaningful life. This could be by developing products or offering different types of services, all through modern digital means. The activities are all in the context of social responsibility and sustainability, not necessarily to become rich.
As noticed, they think differently and they work differently, and here Johnson's quote came to my mind:
“…. which will bring a change in thinking for the next generation of managers more revolutionary than that which every previous generation has ever experienced”
And the PLM-twisted brain started drifting
Is this the generation of Millennials that Johnson is hoping for? The high-level concept of Manage By Means is based on the goal of having every entity directly linked to the cause – a customer order, flexibility, the ability to change when needed – not working with abstract mechanical models. I think the Millennials should be able to understand and lead these businesses.
This culture change and different business approach are, in my opinion, what modern PLM is about. For me, modern PLM focuses on connecting the data, instead of building automated processes with a lot of structured data.
Currently, the modern PLM system as I described it does not exist (or I haven't seen it yet). Also, I have not worked with Millennials in a leading role in a company. Therefore, I kept on dreaming during my holiday – everything is possible if you believe it – even standing on the water:
And although, after reading these books and seeing the connection, you can have the feeling that you are able to walk on water, there are also potential pitfalls (a minute later) ahead to be considered, as you can see below:
My PLM-twisted mind, as you noticed, combines everything.
What do you think?
Did I hallucinate, or is there a modern future for business and PLM?
I am looking forward to learning your dreams.
Sorry for the provocative title in a PLM blog, but otherwise you would not have read my post till the end.
In the past months, I have been working closely with several large companies (not having a mid-market profile). And although they are all in different industries and have different business strategies, they still had these common questions and remarks:
- How to handle more and more digital data and use it as valuable information inside the company or for their customers / consumers?
- What to do with legacy data (approved in the previous century) and legacy people (matured and graduated in the previous century) preventing them from changing?
- We are dreaming of a new future, where information is always up to date and easy to access – will this ever happen?
- They are in the automotive industry, manufacturing industry, infrastructure development and maintenance, plant engineering, construction and plant maintenance
- They all want data to be managed with (almost) zero effort
- And please, no revolution or change for the company
Although I have been focusing on the mid-market, it is these bigger enterprises that introduce new trends, and as you can see from the observations above, there is a need for a change. But it also looks like the demands contradict each other.
I believe it is just about changing the game.
If you look at the picture to the left, you see one of the contradictions that lead to PLM.
Increasing product quality, reducing time to market and meanwhile reducing costs seemed to be a contradiction at that time too.
Although PLM has not been implemented (yet) in every company that could benefit from it, it looks like the bigger enterprises are looking for more.
The L from PLM remains – they still want to connect all information related to the lifecycle of their products or plants.
The M from Management has a bad association – companies believe that moving from their current state towards a managed data environment is a burden. Too much overhead is the excuse not to manage data. And their existing environments for managing data do not excel in user-friendliness. Therefore, people jump to using Excel.
So if the P is no longer relevant and the M is a burden, what remains of PLM?
Early June, I presented the topic of digital Asset Lifecycle Management for owners / operators at the Dassault Systèmes 3DExperience forum. This is one of the areas where I believe PLM systems can contribute a lot to increasing business value and profitability (quality and revenue – see Using a PLM system for Asset Lifecycle Management).
Attending the keynote speech, it was clear that Dassault Systèmes no longer talks about PLM as its vision. Their future dream is a (3D) lifelike experience of the virtual world, and, based on that virtual model, implementing the best solution based on various parameters: revenue, sustainability, safety and more. By managing the virtual world, you have the option to avoid costly real prototypes or damaging mistakes.
I believe it is an ambitious dream, but it fits the above observations. There is more beyond PLM.
In addition, I learned from talking with my peers (the corridor meetings) that Siemens and PTC are also moving towards a more industry- or process-oriented approach, trying to avoid the association with the generic PLM label.
Just at the time that Autodesk and the mid-market started to endorse PLM, the big three are moving away from this acronym.
This reminds me of what happened in the eighties when 3D CAD was introduced. At the time the mid-market was able to move to mainstream 3D (the price / performance ratio changed dramatically), the major enterprises started to focus on PDM and PLM. So it is logical that the mid-market is 10 – 15 years behind new developments – they cannot afford experiments with new trends.
- The management of structured and unstructured data in a single platform. We see the rise of search-based applications and business intelligence based on search and semantic algorithms. Using these capabilities integrated with a structured (PLM?) environment is the next big thing.
- Apps instead of generic applications that support many roles. The generic applications introduce such complexity in the interface that they become hard to use for a casual user. Most enterprise systems, but also advanced CAD or simulation tools with thousands of options, suffer from this complexity. Wouldn't it be nice if you only had to work with a few dedicated apps, as we do in our private lives?
- Dashboards (BI) that can be created on the fly, representing actual data and trends based on structured and unstructured data.
It reminded me of a PLM / ERP discussion I had with a company, where the general manager kept stating the types of dashboards he wanted to see. He did not talk about PLM, ERP or other systems – he wanted the online visibility.
- Cloud services are coming. Not necessarily centralizing all data in the cloud to reduce IT cost, but look at SIRE and other cloud services that support a user with data and remote processing power at the moment required.
- Visual navigation through a light 3D model, providing information when required. This trend is not so recent, but so far it has not been integrated with other disciplines: the Google Maps approach for 3D.
So how likely are these trends to change enterprise systems like PLM, ERP or CRM? In the table below, I indicate where each could apply:
As you can see, the PLM row has all the reasons to introduce new technologies and change the paradigm. For that reason, combined with the observations I mentioned at the beginning, I am sure there is a new TLA (Three Letter Acronym) upcoming.
The good news is that PLM is dynamic and on the move. The bad news for potential PLM users is that the confusion remains – too many different PLM definitions and approaches currently – so what will be the next thing after PLM?
Conclusion: The acronym PLM is not dead and is becoming mainstream. On the high end, there is for sure a trend towards a wider and different perspective on what was initially called PLM. After EDM, TDM, PDM and PLM, we are waiting for the next TLA.
In my last post, PLM kills Innovation or not, I tried to provoke PLM vendors to respond to my claim that PLM focuses too much on structuring data (and therefore removes freedom), which supposedly blocks innovation, as everyone believes innovation requires freedom and flexibility. This statement is often heard from startups, claiming that implementing any type of management would kill their competitive advantage. Still, in the PLM marketing world everyone mentions PLM and Innovation as Siamese twins, but no one explains explicitly why they are connected.
So, not too many reactions from vendors, but some interesting comments from others on this post. Andrew Mack mentions that we should not confuse Innovation and Invention, as for native English speakers there is a clear distinction. I agree with him; however, as most of my blog readers are not native English speakers, I will explain the difference in this post.
For me it is clear PLM supports Innovation in three different manners, which I will explain here in a logical order – see the conclusion for the order of profit it will bring:
Invention, the creation of a new idea that might be the golden egg for the future of a company. It is often the result of one or more individuals – not something a systematic approach or system will bring automatically. If you look at how big companies handle invention, you see that often they do not manage it. They look around the world for – or sometimes get approached by – startups that have a concept that fits their portfolio, and they buy the company and the concept.
This is of course a very disconnected way of invention, but on the other hand, the drive of many startups is to work day and night to develop a concept and ultimately sell the company for a good price. Compare it to the big soccer clubs that have only money (currently mainly Russian or Arabic) but no youth development program of their own to raise new talent. So it is a common way for companies to acquire invention (and promote innovation).
But I believe there is also a way companies can stimulate invention: by implementing the modern way of PLM (PLM 2.0 – see my posts on that) and not using PLM as an extended PDM, as I described in PLM What is the target. When a company has implemented PLM in a PLM 2.0 approach, there is full visibility and connection of all product data, customer demands (through sales) and experiences (through service) for an R&D department to innovate from.
Why does this not happen more often?
Because inside most companies, people do not have an approach or drive for sharing data through the whole product lifecycle. Every department optimizes itself, not taking into account the overall company needs, as they are not measured on that. In order to support invention, PLM can provide an R&D department and individuals with all related market and customer information needed to create relevant inventions. So PLM helps here in understanding the areas of invention – probably the most unexplored area of PLM.
Support selection of the right invention
The second area where PLM contributes to innovation is assisting companies in selecting the right opportunities – the ones that can become the next big thing for these companies. If you have many opportunities, which one would you select and invest in? As it is usually unaffordable to invest in every opportunity, and knowing that at this stage you cannot be sure a particular opportunity will lead to a profitable new product, you need a process and a tool to select the right ones.
Here portfolio management comes in: a functionality that allows companies to have an overview of all running initiatives and, through reporting on key performance indicators (KPIs), to select the opportunities in which to invest.
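The KPI-based selection described above can be sketched as a simple weighted scoring of opportunities. This is a minimal illustration only; the KPI names, weights and opportunity data are hypothetical examples, and real portfolio management tools use far richer models.

```python
def score(opportunity, weights):
    """Weighted sum of KPI values (assumed normalized to a 0..1 scale)."""
    return sum(opportunity["kpis"][k] * w for k, w in weights.items())

def select_top(opportunities, weights, n=2):
    """Rank all running initiatives by score and return the n best candidates."""
    ranked = sorted(opportunities, key=lambda o: score(o, weights), reverse=True)
    return [o["name"] for o in ranked[:n]]

# Hypothetical KPIs and weights - in practice these come from company strategy
weights = {"market_size": 0.4, "strategic_fit": 0.35, "time_to_market": 0.25}

opportunities = [
    {"name": "Opportunity A",
     "kpis": {"market_size": 0.9, "strategic_fit": 0.4, "time_to_market": 0.7}},
    {"name": "Opportunity B",
     "kpis": {"market_size": 0.6, "strategic_fit": 0.9, "time_to_market": 0.5}},
    {"name": "Opportunity C",
     "kpis": {"market_size": 0.3, "strategic_fit": 0.5, "time_to_market": 0.9}},
]

print(select_top(opportunities, weights))
# → ['Opportunity B', 'Opportunity A']
```

Note how the weighting changes the outcome: Opportunity A has the biggest market, but B wins on strategic fit. The value of the PLM portfolio layer is that these KPIs are reported from actual, connected data rather than estimated per department.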
Support New Product Introduction
Once you have selected an opportunity and, as part of the portfolio management process, feel secure about it, there is the third step: how to bring this opportunity to the market as fast as possible, with the right quality and the right manufacturing definition? Being first on the market gives you market share and premium pricing.
Also, as changes in the early manufacturing stage and later during the go-to-market phase are extremely costly, it is important to bring a new product to the market as fast as possible with the right quality, avoiding changes once the new product is in the market. This is the area where PLM contributes the most: allowing R&D organizations to work on their virtual product definition and perform simulations, design and customer verifications, and to anticipate and resolve compliance and sourcing issues in the early stages of product development. All this assures a reduction in the number of iterations before a new product is ready to hit the market.
A famous PLM one-liner is: PLM – doing it right the first time. It refers to the fact that the product introduction process is done only once and with the right quality; it does not mean iterations to improve or change the product scope are not needed.
Improvement cycles are necessary to bring a product to the market. But as they are done in the virtual world, the R&D department has the option to evaluate several alternatives (virtually), work on and improve them until the best option is selected for the market, saving the cost of late design changes or errors to be solved. And even when the product is defined, PLM can help by defining the right generic manufacturing process and making it available to the local manufacturing organizations (where is the MBOM?).
PLM does not kill innovation, and although PLM vendor marketing is not very explicit about it, there are three areas where PLM supports Innovation. In a (subjective) order of priority I would say:
· New Product Introduction – bringing the highest revenue advantages for a selected invention
· Invention discovery – by providing R&D a 360-degree view of their customers and market landscape, enabling inventions to happen in your company
· Portfolio Management – to assist in selecting the right opportunities to focus on
Your thoughts ?
Last week I started my final preparation for the PLM Innovation Congress 2012 on February 22nd and 23rd in Munich, where I will speak about Making the Case for PLM. I am looking forward to two intensive days of knowledge sharing and discussion.
The question came to my mind that when you make the case for PLM, you must also be clear about what you mean by PLM. And here I started to struggle a little. I have my perception of PLM, but I am also aware that everyone has a different perception of the meaning of PLM.
I wrote about this last year, triggered by a question in the CMPIC group (configuration management) on LinkedIn. The question was: Aren't CM and PLM the same thing? Some members firmly believed that PLM was the IT platform to implement CM.
A few days ago, Inge Craninckx posted a question in the PDM PLM CAD network group about the definition of PLM, based on a statement from the PLMIG. In short:
“PDM is the IT platform for PLM.” Or, expressed from the opposite viewpoint: “PLM is the business context in which PDM is implemented.”
The response from Rick Franzosa caught my attention and I extracted the following text:
The reality is that most PLM systems are doing PDM, managing product data via BOM management, vaulting and workflow. In that regard, PDM [read BOM management, vaulting and workflow], IS the IT platform for the, in some ways, unfulfilled promise of PLM.
I fully agree with Rick's statement, and coming back to my introduction about making the case for PLM, we need to differentiate how we implement PLM. We also have to keep in mind that no vendor – and certainly no PLM vendor – will undersell their product. They are all promising 🙂
Two different types of PLM implementation
PLM originally started in 1999 by extending the reach of product data outside the engineering department. However, besides just adding extra functionality to extend coverage of the lifecycle, PLM also created the opportunity to do things differently. And here I believe you can follow two different definitions and directions for PLM.
Let’s start with the non-disruptive approach, which I call the extended PDM approach.
When I worked with SmarTeam 6 years ago on the Express approach, the target was to provide an OOTB (Out of the Box) generic scenario for mid-market companies. The main messages were around quick implementation and extending CAD data management with BOM and workflow. Several vendors at that time promoted their quick-start packages for the mid-market, all avoiding one word: change.
I was a great believer in this approach, but the first benchmark project that I governed demonstrated that if you want to do it right, you need to change the way people work, and this takes time (it took 2+ years). For the details, see A PLM success story with ROI from 2009.
Cloud-based solutions have now become the packaging for this OOTB approach, enriched with ease of deployment – no IT investment needed (and everyone avoids the word change again).
If you do not want to change too much in your company, the easiest way to make PDM available to the enterprise is to extend this environment with an enterprise PLM layer for BOM management, manufacturing definition, program management, compliance and more.
Ten years ago, big global enterprises started to implement this approach, using local PDM systems mainly for engineering data management and a PLM system for the enterprise. See the picture below:
This approach has now been adopted by the Autodesk PLM solution, and ARAS is marketing itself in the same direction. You have a CAD data management environment and, without changing much in that area, you connect the other disciplines and lifecycle stages of the product lifecycle by implementing an additional enterprise layer.
The advantage of this approach is that you get a shared and connected data repository of your product data, which you can extend with common best practices – BOM management (all the variants: EBOM/MBOM/SBOM, …) – but also connect to market opportunities and the customer (portfolio management, systems engineering).
The big three, Dassault Systemes, Siemens PLM and PTC, provide the above functionality as a complete set of functionalities – either as a single platform or as a portfolio of products (check the difference between marketing and reality).
Oracle and SAP also fight for the enterprise layer from the ERP side, providing their enterprise PLM functionality as an extension of their ERP functionality – again in two different ways: as a single platform or as a portfolio of products. As their nature is efficient execution, I would position these vendors as the ones that drive for efficiency in a company, assuming all activities can somehow be scheduled and predicted.
My statement is that extended PDM leads to more efficiency and more quality (as you standardize your processes), and for many companies this approach is a relatively easy way to get into PLM (extended PDM). If your company exists because it brings new products quickly to the market, I would start the implementation from the PDM/PLM side.
The other PLM – innovative PLM
Most PLM vendors associate the word PLM in their marketing language with Innovation. In the previous paragraph I avoided the word Innovation on purpose. How do PLM vendors believe they contribute to Innovation?
This is something you do not hear so much about. Yes, in marketing terms it works, but in reality? Only a few companies have implemented PLM in a different way, most of the time because they do not carry years of history, numbering systems, or standard procedures to consider or change. They can implement PLM in a different way, as they are open to change.
If you want to be innovative, you need to implement PLM in a more disruptive manner, as you need to change the way your organization is triggered – see the diagram below:
The whole organization works around the market, the customer. Understanding the customer and the market needs at every moment in the organization is key to making a change. For me, an indicator of innovative PLM is the way concept development is connected with the after-sales market and the customers. Is there a structured, powerful connection in your company between these people? If not, you are doing extended PDM, not innovative PLM.
Innovative PLM requires a change in business, as I described in my series around PLM 2.0. Personally, I am a big believer that this type of PLM is the lifesaver for companies, but I also realize it is the hardest to implement, as you need people who have the vision and power to change the company. And as I described in my PLM 2.0 series, the longer a company exists, the harder it is to make a fundamental change.
There are two main directions possible for PLM: the first and oldest approach, an extension of PDM, and the second, a new customer-centric approach driving innovation. It is your choice to make the case for one or the other, based on your business strategy.
Looking forward to an interesting discussion – see you in Munich, where I will make the case.
Last week I started a small series of posts related to the topic PLM 2.0. I was hoping for more comments and discussion about the term PLM 2.0, although I must say I was glad Oleg picked it up in his posts: PLM 2.0 born to die? and Will JT-open enable future of PLM 2.0?
Oleg, as a full-time blogger, of course had the time to draw the conclusions, which will take me another two weeks – hoping the discussion evolves in the meantime. Where Oleg's focus is on technology and openness (which are important points), I will also explain that PLM 2.0 is a change in doing business; but that will be in next week's post.
This week I will focus on the current challenges and pitfalls in PLM. And we all know that when somebody talks about challenges, there might be problems.
- Last week: What is PLM 2.0?
- This week: Challenges in current PLM
- Next: Change in business
- Final post: Why PLM 2.0 – conclusions
The Challenges in current PLM
First, I want to state that there are several definitions of PLM in the world, coming from different types of organizations – I list here two vendor-independent definitions:
In industry, product lifecycle management (PLM) is the process of managing the entire lifecycle of a product from its conception, through design and manufacture, to service and disposal. PLM integrates people, data, processes and business systems and provides a product information backbone for companies and their extended enterprise.
Product Lifecycle Management (PLM) is the business activity of managing a company’s products all the way across the lifecycle in the most effective way. The objective of PLM is to improve company revenues and income by maximizing the value of the product portfolio
And there are more definitions. Just recently, I noticed on the PlanetPTC blog a post from Aibhe Coughlan in which she promoted a definition of PLM published on the Concurrent Engineering blog. Here I immediately got a little irritated reading the first words: “PLM is software designed to enhance process efficiencies ……… and more …”
I do not believe PLM is software. Yes, there is software used to automate or implement PLM practices, but this definition neglects the culture and process sides of PLM. And as Oleg was faster – read his more extended comment here.
(I am not paid by Oleg to promote his blog, but we seem to have similar interests)
Back to the classical definitions
The Wiki definition gives the impression that you need an infrastructure to manage (store) all product data in order to serve as an information backbone for the extended enterprise. It becomes more of an IT project, often sponsored by the IT department, with the main goal of providing information services to the company in a standardized manner.
This type of PLM implementation tends to be of the same kind as an ERP or other major IT-system implementation. In these top-down implementations, the classical best practices for project management should be followed. This means:
- A clear vision
- Management sponsorship
- A steering committee
- A skilled project leader and team
- Committed resources
- Power user involvement
- …… and more …
These PLM projects are promoted by PLM vendors and consultants as the best way to implement PLM. And there are a lot of positive things to say about this approach. For many big companies, implementing cPDM or PLM was a major step forward. Most of the ROI stories are based on this type of implementation and have been the showcases at PLM events. It is true that data quality increases, and therefore efficiency and product quality too. Without PLM they would not have reached the competitiveness they have now.
But sometimes these projects go to extremes when satisfying users or IT guidelines.
To avoid the implementation of a ‘new IT-system’, companies often follow the strategy: if we already have an ERP system, let's customize or extend it, so we can store the additional data and perform workflow processes in this system.
In a recent webinar, I heard a speaker say that their company, together with IT, had defined the following automation strategy:
- First they check whether the needed PLM functionality exists in their ERP system or is part of their ERP provider's portfolio. If the functionality is there (meaning the ERP vendor has the capability to store the metadata and a factsheet mentioning the right name), there is no looking outside.
- If the functionality is not there, there will be a discussion with the ERP vendor or implementer to build it on top of their ERP system.
I have seen implementations where the company developed complete custom user interfaces in order to get user acceptance (the users would not accept the standard graphical interface). At that time, no one raised a flag about the future maintenance and evolution of these custom environments. The mood was: we kept it simple – one single system.
I believe this closes the door on real PLM, as storing data in a system does not mean you will use it in an efficient and optimized manner. How will you anticipate changes in business if you are just doing more with the same system?
And mid-market companies ?
The top-down approach described above is what many mid-market companies fear, as they remember how painful their first ERP implementation was. And with PLM it is even less clear. PLM aims to involve the engineering department, which so far has not worked in a very procedural manner. Informal, ad-hoc communication combined with personal skills within this department was often the key to success.
And now an unfriendly system is brought in, with little usability, pushing these creative people to enter data without seeing any benefits. The organization downstream benefits, but this will only be noticed later. And for the engineering department it will take more effort to change a work methodology focused on innovation. However, in the mid-market the target of a PLM project is generally a Return on Investment (ROI) within a very short timeframe (1-2 years). Investing in usability should be even more important for these companies, as there is less top-down pressure to accept the new PLM system.
And flexibility ?
In the past years we have seen that business is changing – there is a shift in global collaboration and manufacturing – and from recent history we can learn that those big enterprise projects of the past became a threat. Instead of being able to implement new concepts or new technology, the implementation became more and more a vendor monolith, as other capabilities and applications no longer fit. This goes against the concept of openness and being flexible for the future. I believe that if PLM becomes as rigid as ERP, it blocks companies from innovating – the challenge for big companies is to find the balance between stability and flexibility (this was the title of Sony Ericsson's presentation at the PLM forum in Sweden this year).
And again, mid-market companies do not have the budget or resources to invest in similar projects. They have less drive to optimize themselves in the same manner as big companies do, as flexibility is often their trademark (and their capability to innovate). So PLM for the mid-market will not work in the classical way.
This is one of the reasons why a mid-market PLM standard has not been found (yet?). On the other hand, many mid-market companies are dealing with PLM practices, although often closer to PDM and CAD data management. And mid-market companies do not change their organization easily – there is more of a departmental approach, thereby avoiding a change in business.
To summarize the biggest challenges in current PLM described in this post:
- PLM is considered complex to implement
- PLM is a huge IT-project
- PLM requires change and structuring – but what about flexibility
- Where is the PLM value and ROI – user acceptance
- PLM for the mid-market – does it exist ?
Conclusion: I have been writing about the PLM challenges in the past, see the links below if you are interested in more details on a specific topic.
In 2008, I thought that Out-of-the-Box PLM systems and standard functionalities could bring a solution for the mid-market, perhaps through future solutions based on the cloud. However, I learned that if you want to do real PLM in a modern manner, you need to change the way you do business – and this I will explain in my upcoming post.
- PLM and IT – love/hate relation ?
- Implementing PLM is a change not a tool
- Which PLM to choose
- PLM for the mid-market mission impossible ?
- 5 reasons not to implement plm – post 5 with links to post 1 to 4
Recently I have been reading various interesting articles. It started with Why Amazon Can't Make a Kindle in the USA by Steve Denning, and from there I followed several interesting links.
Most of the articles were business-driven, not focused on technology. However, what caught my attention was the similarity of the issues raised in these articles – as if they were about PLM.
In the end it is a plea for change, to be more competitive in the future. With the current economic standstill, I believe there is a need and an opportunity for this change in PLM too. I am not pointing to the regime changes all around the world, but somehow they are all connected to this new wave of globalization and openness to information.
And as my domain is PLM, I took PLM 2.0 as the vehicle to describe the change currently happening in the PLM world. Although PLM 2.0 is a term invented by Dassault Systèmes, I will use it as the placeholder to describe the changes in PLM.
- This week: What is PLM 2.0?
- Next: Challenges in current PLM
- Next: Change in business
- Final post: Why PLM 2.0 – conclusions
I hope you will stay with me through these four steps, and I look forward to your immediate feedback.
What is PLM 2.0 ?
In 2006, Dassault Systèmes announced PLM 2.0 as the new generation of PLM, implemented on their V6 platform. If you go to the 3DS website, you will see the following definition of PLM 2.0.
Look for the header PLM 2.0: PLM Online for All
In the DS definition you will find several keywords that will help us understand the PLM 2.0 capabilities:
a typical Dassault Systèmes viewpoint, as they come from the world of 3D CAD and virtualization, and the company's vision is built around ‘lifelike’ – and life is mostly in 3D.
3D as the interface to all product-related information is a paradigm shift for companies that were used to displaying only metadata on boring tabular screens, where you navigate on numbers and text. The other major CAD-related PLM vendors can of course follow this paradigm too, as 3D visualization of information is known to them. However, coming from an ERP-based PLM system, 3D is something far out of reach for these vendors (at this moment).
This is what I believe is a crucial keyword for all future PLM implementations; it builds upon the business intelligence concepts that came into fashion 8 years ago. Online means direct access to the actual data. No information conversion, no need for import or export, but sharing and filtering. What you are allowed to see is actual data with an actual status. Imagine what kind of impact working online would have on your organization: evaluation of trends and key performance indicators directly available – with the interpretation of course still to be done by experts.
Intellectual Property – a topic that should be on every company's agenda. The reason a company exists now, and will exist in the future, is based on how it manages its unique knowledge. This knowledge can be based on how certain processes are done, which components are chosen, which quality steps are critical, and more. Working in a global collaboration environment challenges the company to keep its IP hidden from others, certainly when you talk about online data. Losing its IP makes a company vulnerable for the future – read the referenced blog post from Steve Denning about DELL.
This is currently the platform for change, as technologies now enable people and companies to implement applications in a different manner. Not only on premises: it could be online, Software as a Service, or cloud-based solutions, and through standardized programming interfaces companies could implement end-to-end business processes without a huge, monolithic impact. Web 2.0 also provides the platform for communities.
The concept of communities opens new perspectives for collaboration. In general, people in a community have a common interest or task, and they share thoughts and deliverables back to the community across all company borders. This is the power of the community and of the collective intelligence built inside it. Without company borders, it should give people a better perspective on their market and their business, thanks to the global participation.
The vision is there – now ….
All the above keywords are capabilities for the future, and in the world of PLM you see that every PLM vendor/implementer is struggling with them. How to implement them consistently across their offering is the major challenge for the coming years, assuming PLM 2.0 is considered the next step.
If you look at the PLM vendors besides Dassault Systèmes, you see that Siemens and PTC are closest to following the PLM 2.0 approach, without mentioning the term PLM 2.0. Other vendors even refuse to talk about PLM, but they already share similar components – for example Autodesk.
It is interesting to see that the ERP-based PLM vendors do not follow this trend in their communication; they are still working on consolidating and completing their ‘classical’ PLM components.
But the classical PLM vendors struggle with the change in paradigm too.
- What to do with current, huge and structured implementations ?
- Is PLM 2.0 having the same demands or can it be different ?
Here you see opportunities for newcomers in this market, as you can implement online collaboration, intellectual property creation/handling and communities in different manners, with different types of implementation demands.
So far my introduction to PLM 2.0. Browsing the web, I did not find many other viewpoints on this specific terminology, so I am curious about your thoughts and complementary comments on this topic.
In my next post I will zoom in on the challenges of PLM and relate them to the PLM 2.0 vision.
My take on PLM (classical) and PLM 2.0
Referenced in this context – not directly mentioned:
- IBM visionary presentation from 2006 – Michael Neukirchen
- The future of PLM – Martin Ohly (global PLM blog)
- PLM 2.0 technology or facelift – Oleg Shilovitsky
- Social Media and PLM explained for Dummies – Jos Voskuil
- Going Social With Product Development – Jim Brown