You are currently browsing the category archive for the ‘PLM’ category.
In my series of blog posts related to the (PLM) data model, I talked about Product, BOMs and Parts. This time I want to focus on the EBOM and (CAD) Documents relation. This topic became relevant with the introduction of 3D CAD.
Before companies were using 3D CAD systems, there was no discussion about EBOM or MBOM (to my knowledge). Engineering produced drawings for manufacturing, and not every company used the mono-system (a specifying drawing for each individual part). Drawings were mainly made to assist production; making a drawing for every individual part was considered a waste of engineering time. Parametric drawings were used to specify similar parts. But now we are in the world of 3D!
With the introduction of mainstream 3D CAD systems in the nineties (SolidWorks, Solid Edge, Inventor), there came a need for PDM systems to manage the individual files of a CAD assembly. The PDM system was necessary to manage all the file versions. Companies designing simple products sometimes kept working file-based, introducing the complexity of how to name a file and how to deal with revisions. Ten years ago I was investigating data management for the lower tiers of the automotive supply chain. At that time, 60 % of the suppliers using CATIA were still working file-based. Data management was considered an extra complexity, even though file version control was a big pain.
This has changed for several reasons:
- More and more OEMs were pushing for more quality control of the design data (read PDM)
- Products became more modular, which means assemblies can be used as subassemblies in other products, pushing the need for where used control
- Products are becoming more complex and managing only mechanical CAD files is not enough anymore – Electronics & Software – mechatronics – became part of the product
Most PDM systems at that time (I worked with SmarTeam) saved the 3D CAD structure as a quantity-based document structure, closely resembling the EBOM.
This is one of the most common mistakes made in PLM implementations.
The CAD structure does not represent the EBOM !!!
Implementers started to build all kinds of customizations to automatically create a Part structure, the EBOM, from the CAD structure. Usually these customizations ended up as a mission impossible, in particular when customers started to ask for bidirectional synchronization. They expected that when a Part was removed from the EBOM, it would be deleted in the CAD assembly too.
And then there was the issue that companies believed the CAD Part ID should be equal to the Part ID. This might be possible for a particular type of designed parts, but it no longer works for flexible parts, such as a tube or a spring. When such a Part is modeled in a different position, it creates a different CAD Document, breaking the one-to-one relation.
Finally, another common mistake I have seen in many PDM implementations is the addition of glue, paint and other manufacturing-type parts to the CAD model, to be able to generate a BOM directly from the CAD.
From the data model perspective it is more important to understand that Parts and CAD Documents are different types of objects, in particular if you want to build a PLM implementation where data is shared across all disciplines. For a PDM implementation I care less about the data model, as the implementation often targets only engineering needs, not enterprise continuity of data.
A CAD Document (Assembly / Part / Drawing / …) behaves like a Document. It can be checked in and checked out any time a change is made inside the file. A check-in operation creates a new version of the CAD Document (in case you want to trace the history of changes).
Meanwhile, the Part specified by the CAD Document does not change version when the CAD Document changes. Parts usually do not have versions; a Part remains in the same revision while the specifying CAD Document matures.
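The different behavior of CAD Documents and Parts can be sketched in a few lines of Python. This is a hypothetical illustration; the class names, IDs and the simple integer/letter schemes are invented for this example and do not come from any particular PDM/PLM system.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class CADDocument:
    """A CAD file under check-in / check-out control."""
    doc_id: str
    version: int = 1  # every check-in creates a new version

    def check_in(self) -> None:
        # Tracing the history of changes: a new version per check-in
        self.version += 1


@dataclass
class Part:
    """The engineering item specified by one or more CAD Documents."""
    part_id: str
    revision: str = "A"  # unchanged while the specifying document matures
    specified_by: List[CADDocument] = field(default_factory=list)


doc = CADDocument("DOC-0001")
part = Part("PT-0001", specified_by=[doc])

# Three design iterations: the document gains versions,
# while the part stays in the same revision.
for _ in range(3):
    doc.check_in()

print(doc.version, part.revision)  # 4 A
```

The point of the sketch is the asymmetry: changing the file creates new document versions, while the Part keeps its revision until a deliberate revision step is taken.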
Moving from PDM to PLM
For a PLM implementation it is important to think “Part-driven”, which means starting from an initial EBOM representing the engineering specification of the Product, and maturing that EBOM with more and more design specification data. Design specification data can be mechanical assemblies and parts, but also electrical parts. The EBOM for a PCB might come from the Electrical Design Application, as in the mechanical model you will not create every component in 3D.
And once the electrical components are part of the EBOM, the part definition of embedded software can be added to the BOM as well, for example when software needs to be uploaded to flash memory chips. By adding electrical and software components to the EBOM, the company gets a full overview of the design maturity of ALL disciplines involved.
The diagram below shows what an EBOM and its related Documents could look like:
This data model contains a lot of details:
- As discussed in my previous post – for the outside world (the customer) there is a product defined without revision
- Related to the Product there is an EBOM (Part assembly), simplified as a housing (a mechanical assembly), a connector (a mechanical part) and a PCB (a mechanical representation). All these parts behave like Mechanical Parts; they have a revision and status.
- The PCB has a second representation based on an electrical schematic, which has (for simplification) only two electrical parts, a resistor and a memory chip. As you can see, these components are standard purchasable parts; they do not have a revision as they are not designed.
- The Electrical Part Flash Memory has a relation to a Software Part, which is defined by its Object Code (a zip-file?), which in turn is specified by a software specification (not in the diagram). The software object code has a version, as software is usually version-managed; it does not follow the classical rules of mechanical design.
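The bullet points above can be captured in a small data sketch. All IDs, type names and the zip-file name are invented for illustration; a real PLM system would of course store these as separate objects and relations rather than one nested dictionary.

```python
# A sketch of the EBOM described above: a Product without revision,
# mechanical Parts with revision and status, purchased electrical parts
# without revision, and version-managed software object code.
product = {
    "id": "PROD-001",  # no revision for the outside world
    "ebom": {
        "id": "AS-100", "rev": "A", "status": "In Work", "children": [
            {"id": "PT-101", "rev": "A", "status": "In Work", "name": "housing"},
            {"id": "PT-102", "rev": "A", "status": "In Work", "name": "connector"},
            {"id": "PT-103", "rev": "A", "status": "In Work", "name": "PCB",
             # second representation of the PCB: the electrical schematic
             "electrical": [
                 {"id": "EL-201", "name": "resistor"},  # purchased part: no revision
                 {"id": "EL-202", "name": "flash memory",
                  "software": {"id": "SW-301",
                               "object_code": "firmware.zip",  # invented name
                               "version": "1.4"}},             # versions, not revisions
             ]},
        ]
    },
}
```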
Again I have reached my 1000 words, a sign to stop explaining this topic. For sure there are a lot of details to explain about this part of the data model too.
- A CAD structure is not an EBOM (it can be used to generate a part of the EBOM)
- CAD Documents and EBOM Parts have different behavior. CAD Documents have versions; Parts do not have versions (most of the time)
- The EBOM is the place where all disciplines synchronize their data, providing during the development phase a single view of the design status.
Let me know if this was too abstract, and feel free to ask questions. The goal of this series of blog posts is to provide a methodology baseline for a real PLM data model.
I am looking forward to your questions or remarks to spark up the discussion.
As described in my latest LinkedIn post, if you want to implement PLM successfully, there are two important points to address from the implementation point of view:
- An explicit data model, not based on system or tool capabilities, but on the type of business the company is performing. There is a difference between an engineering to order company, a build to order company and a configure to order company.
- In PLM (and business) it is all about enabling an efficient data flow through the organization. There is no ownership of data. It is about responsibilities for particular content per lifecycle stage, combined with sharing.
Historically, PLM implementations started with capturing the CAD data and related EBOM, as this is what the CAD-related PLM vendors were pushing for, and this was often the biggest pain for the engineering department. The disadvantage of this approach is that it strengthens silo thinking. The PLM system becomes an engineering tool instead of an enterprise system.
I believe that if you really want to implement PLM successfully in a company, you should start from a common product/part information backbone. This requires the right business objects and, therefore, the right data modeling. The methodology described below is valid for build to order and configure to order companies, and less applicable for engineering to order.
In a build to order company there are the following primary information objects:
- A Product (representing the customer view of what is sold to the outside world)
- An EBOM (representing a composition of Parts specifying the Product at a particular time)
- An MBOM (representing the manufacturing composition of the Product at a given time)
And, of course, for all these information objects there are related Documents of various types. In a more advanced setup, the specification document can be the source for individually extracted requirements (not in this post).
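As a sketch, the three primary information objects and their related Documents could be modeled as follows. This is only an illustration of the object types and their relations; the field names are my own, not those of any vendor's data model.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Document:
    doc_id: str
    doc_type: str  # e.g. "specification", "CAD model", "drawing"


@dataclass
class BOMLine:
    part_id: str
    quantity: int
    documents: List[Document] = field(default_factory=list)


@dataclass
class Product:
    """The customer view of what is sold to the outside world."""
    product_id: str
    ebom: List[BOMLine] = field(default_factory=list)  # engineering composition
    mbom: List[BOMLine] = field(default_factory=list)  # manufacturing composition


# A minimal build to order example: one engineering part,
# specified by a CAD model and a derived drawing.
spec = Document("DOC-001", "CAD model")
drawing = Document("DOC-002", "drawing")
pump = Product("PROD-001",
               ebom=[BOMLine("PT-001", 1, documents=[spec, drawing])])
```

Note that the EBOM and MBOM are deliberately separate relations on the Product: they are different compositions of the same product, maintained at different moments in the lifecycle.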
Let´s follow an End to End scenario from a typical Build to Order company process.
A potential customer sends an RFP for a product they need. The customer RFP contains information about how the product should behave (Specification / Requirements) and how it should be delivered (packaging). A basic data model for this RFP would be:
Note the following details:
- All information objects have a meaningless number. The number is only there to support unique identification and later integration with other systems. The meaning should come from the other attribute data on the object and its relations. (A blog post on its own)
- Instead of the meaningless number, the Product can carry the number provided by the customer. However, if this number is not unique within the company, it might be better treated as just another attribute of the product.
- In general, Products do not have revisions. In time, there might be other BOMs related to the product. Not covered in this post: products might have versions and variants, and products might be part of a product family. In this case, I used a classification to define a classification code for the product, allowing the company to discover similar products delivered to different customers. This promotes reuse of solutions and reuse of lessons learned.
- The customer object represents the customer entity and by implementing it as a separate object, you will be able to see all information related to this customer quickly. This could be Products (ordered / in RFQ / etc.) but also other relevant information (Documents, Parts, …)
- The initial conceptual BOM for the customer consists of two sub-BOMs. As the customer wants the products to be delivered in a 6-pack, a standard 6-pack EBOM is used. Note: its status is Released, and a new conceptual EBOM is defined as a placeholder for the BOM definition of the Product to design/deliver.
- And for all the Parts in the conceptual EBOM there can be relations towards one or more documents. Usually, there is one specifying document (the CAD model) and multiple derived documents (Drawings, Illustrations, …)
- Parts can have a revision in case the company wants to trace the evolution of a Part. Usually, when Form-Fit-Function remains the same, we speak about a revision; otherwise, the change results in a new part number. As more and more of the managed information no longer hangs on the part number, companies might want to use a new part number at any change, storing in an attribute what its predecessor was.
- Documents have versions and revisions. While people work on a document, every check-in / check-out moment can create a new version of the file(s), providing traceability between versions. Most of the time, at the end there will be a first released version, which is related to the Part it specifies.
- Do not try to have the same ID and Revision for Parts and Documents. In the good old days of 2D drawings this worked; in the world of 3D CAD it is not sustainable. It leads to complexity for the user. Preferably, the Part and the specifying Document should have different IDs and different revision mechanisms.
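A minimal sketch of the "meaningless number" idea from the first bullet: identity comes from a generated unique ID, meaning lives in attributes and relations, and the customer's number is stored as just another attribute. The helper function and all attribute names here are hypothetical.

```python
import uuid


def new_object(obj_type: str, **attributes) -> dict:
    """Create an information object with a meaningless, unique identifier."""
    return {
        "id": uuid.uuid4().hex.upper()[:10],  # unique, but carries no meaning
        "type": obj_type,
        **attributes,
    }


product = new_object("Product",
                     customer_number="CUST-4711",   # not assumed unique company-wide
                     classification="PUMP-SMALL")   # supports discovering similar products

# Part and Document deliberately get different IDs and different
# maturity mechanisms (revision vs. version/revision).
part = new_object("Part", revision="A")
document = new_object("Document", version=3, revision="A")
```

The design choice this illustrates: because the ID carries no meaning, it never has to change when the business context changes, and integrations with other systems can rely on it safely.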
And the iterations go on:
Now let´s look at the final stage of the RFQ process. The customer has requested to deliver the same product also in single (luxury) packaging, as this product will be used for service. Although it is exactly the same physical product to produce, the product ID should be different. To keep communication unambiguous, the customer should also use a different product ID when ordering the product for service rather than for manufacturing. The data model for this situation will look as follows (assuming the definitions are done):
Note the following details:
- The Part in the middle (with the red shadow), PT000123, represents the same part for both the product ordered for manufacturing and the product ordered for service, making use of a single definition for both situations
- The Part in the middle has now a large set of related documentation. Not only CAD data but also test information (how to test the product), compliance information and more.
- The Part in the middle on its own also has a deeper EBOM structure which we will explore in an upcoming post.
I have reached my 1000 words and do not want to write a book, so I will conclude this post. For experienced PLM implementers this is probably known information. For people entering the domain of PLM, either as a new student or coming from a more CAD/PDM background, it is an interesting topic to follow. In the next post, I will continue towards the MBOM and ERP.
Let me know if this post is useful for you – and of course – enhancements or clarifications are always welcomed. Note: some of the functionality might not be possible in every PLM system, depending on its origin and core data model.
Two weeks ago I got a message from WordPress, reminding me that I started blogging about PLM on May 22nd, 2008. In some of my spare time during weekends, I began to read my old posts again and started to fix links that had disappeared.
Initially, when I started blogging, I wanted to educate mid-market companies about PLM: a sentence full of ambiguities. How do you define the mid-market, and how do you define PLM? Both questions are already a good start for a boring discussion. As I do not want to go into that discussion, here are my “definitions”.
Warning: This is a long post, full of generalizations and a conclusion.
PLM and Mid-market
Mid-market companies can be characterized as having a low level of staffing for IT and strategic thinking. Mid-market companies are do-ers, and most of the time they are good in their domain, based on their IP and their flexibility to deliver it to their customer base. I have not met mid-market companies with a business vision of 5 years and beyond. Mid-market companies buy systems. They bought an ERP system 25-30 years ago (the biggest trauma at that time). They renewed their ERP system for the Y2K problem/fear, and they switched from the drawing board to a 2D CAD system. Later they bought a 3D CAD system, introducing the need for a PDM system to manage all the data.
PLM is for me a vision, a business approach supported by an IT-infrastructure that allows companies to share and discover and connect product related information through the whole lifecycle. PLM enables companies to react earlier and better in the go-to-market process. Better by involving customer inputs and experience from the start in the concept and design phases. Earlier thanks to sharing and involving other disciplines/suppliers before crucial decisions are made, reducing the amount of iterations and the higher costs of late changes.
Seven years ago I believed that a packaged solution, combined with a pre-configured environment and standard processes, would be the answer for mid-market companies. PLM vendors currently have the same thought with cloud-based solutions: take it, use it as it is, and enjoy.
My opinion has changed over the past seven years. Mid-market companies consider PLM a more complex extension of PDM and still consider ERP (and what comes with that system) the primary system in the enterprise. PLM in mid-market companies is often seen as an engineering tool.
LESSON 1 for me:
The benefits of PLM are not well-understood by the mid-market
To read more:
Globalization and Education
In the past seven years, globalization became an important factor for all types of companies. Companies started offshoring labor-intensive work to low-labor-cost countries, introducing the need for sharing product data outside their local and controlled premises. In addition, acquisitions, both by larger enterprises and by some of the dominant mid-market companies, introduced a new round of rethinking. Acquisitions raised discussions such as: what are the real best practices for our organization? How can we remain flexible, while adapting and converging our business processes to be future-ready?
Here I saw two major trends in the mid-market:
Lack of (PLM) Education
To understand and implement the value of PLM, you need to have skills and understanding of more than just a vendor-specific PLM system. You need to understand the basics of change processes (Engineering Change Request, Engineering Change Order, Manufacturing Change Order and more). And you need to understand the characteristics of a CAD document structure, a (multidisciplinary) EBOM, the MBOM (generic and/or plant specific) and the related Bill of Processes. This education does not exist in many countries and people are (mis-)guided by their PLM/ERP vendor, explaining why their system is the only system that can do the job.
Interestingly enough, the most-read posts on my blog are about the MBOM and the ETO, BTO and CTO processes. This illustrates there is a need for a proper, vendor-independent and globally accepted terminology for PLM.
Some educational posts:
Bill of Materials for Dummies – ETO ranked #1
ECR/ECO for Dummies ranked #2
BOM for Dummies – CTO ranked #4
BOM for Dummies: BOM and CAD ranked #7
The dominance of ERP
As ERP systems were introduced long before PLM (and PDM), they are often considered by the management of a mid-market company as the core. All other tools should preferably be seen as extensions of ERP, and if possible, let´s implement the ERP vendor´s functionality to support PLM – the Swiss army knife approach – one tool for everything. This approach is understandable, as at the board level there are no PLM discussions. Companies want to keep their “let´s do it” spirit and not reshuffle or reorganize their company according to modern insights of sharing. Strangely enough, in many businesses you see the initiative to standardize on a single ERP system first, instead of standardizing on a single PLM approach first. PLM can bring the global benefits of product portfolio management and IP sharing, where ERP is much more about local execution.
PLM is not understood at the board level, still considered as a tool
Some posts related to PLM and ERP:
Where is the MBOM ? ranked #3
The human factor
A lot of the reasons why PLM struggles to become successful have to do with its broad scope. PLM has an unclear definition and, most importantly, PLM forces people to share data and work outside their comfort zones. Nobody likes to share by default. Sharing makes day-to-day life more complicated; sharing might create visibility on what you actually contribute or fix. In many of my posts, I described these issues from various viewpoints: the human brain, the Innovator's Dilemma, the way the older generation (my generation) was raised and used to work. Combined with the fact that many initial PLM/PDM implementations have created so many legacies, the need to change has become a risk. In PLM discussions and selections I have seen many times that, in the end, a company decides to keep the old status quo (with new tools) instead of really having the guts to move toward the future. Often this was a result of investors not understanding (and not willing to see) the long-term benefits of PLM.
PLM requires a long-term vision and understanding, which most of the time does not fit current executive understanding (lack of education/time to educate) and priority (shareholders)
Many recent posts are about the human factor:
The digital transformation
The final and most significant upcoming change is the fact that we are entering a completely new era: from linear and predictable towards fast and iterative, meaning that the classical ways we push products to the market will become obsolete. The traditional approach was based on lessons learned from mechanical products after the second world war. Now, through globalization and the importance of embedded software in our products, companies need to deliver and adapt products faster than the classical delivery process allows, as their customers have higher expectations and a much larger range to choose from. The result of this global competitiveness is that companies will change from delivering products towards a more and more customer-related business model (continuous upgrades/services). This requires companies to revisit their business and organization, which will be extremely difficult. Business and human change require new IT concepts – platforms? / cloud services? / big data?
Older enterprises, mid-market and large enterprises will be extremely challenged to make this change in the upcoming 10 years. It will be a matter of survival and I believe the Innovator´s Dilemma applies here the most.
The digital transformation is apparent as a trend for young companies and strategic consultants. This message is not yet understood at the board level of many businesses.
Some recent posts related to this fast upcoming trend:
ROI (Return On Investment)
I also wrote about ROI – a difficult topic to address, as in most discussions related to ROI, companies are talking about the costs of the implementation, not about the tremendously larger impact a new business approach or model can have, once enabled through PLM. Most PLM ROI discussions are related to efficiency and quality gains, which are significant and relevant. However, these benefits are relatively small and not comparable with the ability to change your business (model) to become more customer-centric and stay in business.
Some of the ROI posts:
A (too) long post this time; however, perhaps a good post to mark 7 years of blogging and to use as a reference for the topics I briefly touched upon here. PLM has many aspects. You can do further reading through the links.
From the statistics it is clear that the education part scores the best – see the rankings. For future posts, let me know by leaving a comment what you are looking for in this blog: PLM Mid-Market, Education, PLM and ERP, Business Change, ROI, Digitalization, or …??
Also I have to remain customer centric – thanks for reading and providing your feedback
I was sitting outside in the garden during Ascension Day, which is (still) a national holiday in the Netherlands (thank God). It was again nice and warm, and it made me think about the parallels between global warming and PLM.
Climate change has always existed if we look at the history of our planet. We started to talk about global warming when scientists indicated that this time the climate change is caused by human intervention. As a result of vast amounts of carbon-dioxide emissions, a greenhouse effect started to become visible. When the first signals of global warming came up, environmentalists started preaching that we have to act NOW before it is too late. Meanwhile, on the other side, people began arguing that it was just a coincidence, an opinion.
There is no scientific proof, so why worry?
In the past ten years, the signs and proofs of global warming have become evident, and at climate conferences the people who want to act and, on the other side, the blockers try to create progress in the battle against global warming. In Europe in particular, governments and companies are starting to become aware that they can contribute to a more sustainable society.
Not enough, according to the environmentalists and scientists. As our brains still operate mostly in a prehistoric mode (day-to-day survival, food, home, social status), slow changes and sustainability for the next generations are not part of most people's concerns. And those people who make us aware of this lack of priority for sustainability are considered annoying, as they disrupt our lives.
Companies that have invested (heavily) in sustainable business models often have a challenging path to survival against traditional businesses, as the majority of consumers want cheap. Some examples:
- Energy: most power plants are heated by burning coal, as this is the cheapest option. Shale gas extraction became attractive because we need cheap fuel. Alternatives like solar, wind and others cannot compete on price as long as we do not pay for the damage to nature.
- Food: produced in factory farms, where animal wellness or health is not part of the plan. The goal is to deliver xx kilos of meat for the lowest price. Alternatives, like more natural ways of raising animals or even revolutionary ways (the lab-grown hamburger), currently cannot compete on price unless we are willing to pay for them.
- The fashion industry, where deep down in its supply chains human beings are treated like slaves. When you buy a cheap garment, you know somebody has been suffering.
Governments sometimes subsidize or push sustainable technologies, as they realize that something has to happen (most of the time for public opinion – their voters), but there is no consistent strategy, as liberals believe every form of support is against open competition. And as long as we let our prehistoric brain run our choices, the earth gets warmer, with the consequences becoming more and more visible.
We know we have to act, but we do not act seriously
Now let´s switch to PLM. The association started when I saw Chad Jackson's retweet from Lifecycle Insights related to top PLM challenges.
Clearly the message illustrates that costs, time and technology have priority – not what PLM can really establish (even in the context of global warming).
PLM started at the end of the previous century, initially invented by some of the major CAD vendors: Dassault Systèmes, PTC and Siemens. Five years later it was taken more seriously, as enterprise software vendors like SAP and Oracle also started to work on their PLM offerings. And some years ago, even the company most skeptical about PLM, Autodesk, began to sell a PLM offering.
So like global warming we can conclude: PLM is recognized, and now we can act.
The early adopters of PLM are also in a challenging situation. Their first PLM implementations were very much focused on an IT infrastructure, allowing data to flow through a global organization without disrupting the day-to-day business model too much. These implementations are now a burden to many of them: costly and almost impossible to change. Look at the PLM stories from some of the major automotive companies, like Daimler, JLR, PSA, Renault, Volvo Cars and more.
They are all somehow held hostage by their old implementations (as business continues); however, due to changing ownership, business models and technology, they cannot benefit from modern PLM concepts, as it would be a disruption.
Meanwhile, PLM has evolved from an IT infrastructure into a business-driven approach to support global, more flexible and customer-driven business processes. Younger companies now starting in Asia do not suffer from this legacy and establish themselves faster, based on the know-how of the early adopters.
And this is not only happening in the automotive industry. In recent years, I have seen examples in the Oil & Gas industry, the High-Tech industry (which in theory is relatively young) and the Manufacturing industry.
Coming back to the 2015 PLM challenges tweeted by Chad Jackson: it looks like they are all related to time and costs. Obviously it is not clear what value PLM can bring to a company beyond efficiency gains (ERP/Lean thinking). Modern PLM allows companies to change their business model, as I wrote recently: from linear to fast and circular. The PLM mission is no longer to support companies with product information from cradle to grave, but from cradle to cradle. Sustainability and becoming connected to customers are new demands: operational services instead of selling products, linked with the need for IoT to understand what is happening.
In 2015 PLM, the discussion with executives is about purchasing technology instead of the need to change the business for long-term survival. Most investors do not like long-term visions, as their prehistoric brains are tuned to be satisfied in the short term.
Therefore, as long as the discussion about PLM is about IT and infrastructure, and not about business change, there will be this stall, identical to what happens with addressing global warming. Stakeholders expect short-term results, trying to keep up the current model. Strategists and business experts are all talking about the new upcoming digital era, similar to global warming.
We know we have to act, but we do not act seriously
When I posted a short version of this post on LinkedIn on Ascension Day, I got some excellent feedback which I want to share here:
Dieter de Vroomen (independent advisor, interim manager & neighbor) wrote me an email. Dieter does not have a PLM-twisted brain. Therefore I like his opinion:
PLM and Global Warming are both assumptions, mental constructs that we can make plausible with technology and data. Both mindsets save us from disasters through the use of technology. And that’s what both sell. But is that what they produce, what we want? Apple and associates think vice versa: first making what we want and explaining the underlying technology later. I miss that with global warming, but certainly with PLM. That’s why it sells so badly to CxO’s.
I think the point Dieter makes is interesting, as he is a non-PLM guy, showing the way CxOs might be thinking. As long as we (PLMers) do not offer a packaged solution, an end-to-end experience, it is hard to convince the C-level. This is one of the significant differences between ERP (its purpose is clearly tangible) and PLM (see my post PLM at risk! It does not have a clear target).
A more motivating comment came from Ben Muis, consultant and entrepreneur in the fashion industry. We met at the PI Apparel 2013 conference, and I like his passion for bringing innovation to the fashion industry. Read his full comments on my LinkedIn post, as he has combined sustainability and PLM in his career. Two quotes from Ben:
As you may know I did quite a bit of work on how the fashion industry could and should be more sustainable in its approach. This was at a time where only a handful of people at best were willing to even think about this. Knowing that in reality the decisions around cost and commercialism were driving the agenda, I drew the conclusion that by improving processes within the industry I could actually cause a sustainability improvement that was driven by commercial desire.
Explaining how you can become involved in the bigger picture and for Ben it is the possibility to keep on working on his passion in a real-time world. And finally:
So there you have it… my reasons for initially thinking your title was very close to the reason I shifted my focus from pure sustainability advice to PLM implementations to begin with. I could drive a real result much quicker. This, as I am sure you will agree, in itself supports the reason for taking PLM seriously
The topics PLM and Global Warming have a lot in common. The awareness exists. However, when it comes to action, we are blocked by our prehistoric brain, thinking about short-term benefits. This will not change in the next 1000 years. Therefore, we need organizations and individuals that, against all odds, take the steep path and have a vision of change, breaking the old models and silos. It will cost money, it will require sacrifice, and the reward will only be noticed by the next generations. What a shame.
A final quote before going back to standard PLM matter in upcoming posts:
“Everything is theoretically impossible, until it is done.”
Robert A. Heinlein
Did I choose the wrong job? These are busy times, and for the past 15 years I have focused on PLM; every year I had the feeling there was progress in the understanding and acceptance of PLM. Although the definition of PLM is a moving target, there are probably thousands of PLM experts around the world. From my blog posts of the past two years, you might share my opinion that PLM is changing from an engineering, document-centric system towards a beyond-PLM approach, where a data-driven, federated platform leads to (yet unknown) benefits.
So where to draw the border of PLM?
Is there a possibility that somewhere a disruptive approach will redefine PLM again? PLM is considered complex (I don't think so). The complexity lies first of all in the rigidity of PLM systems, which fail to excite people. Next, there is the desire of implementers to provide services to satisfy users and, as a result, make it more complicated. Finally, and most important, there is the lack of understanding that implementing PLM requires a business change.
Change (don't mention the word), which does not happen overnight.
Oleg Shilovitsky wrote about PLM and organizational change. He leaves it open for further discussion whether the difficulty is related to the PLM technology or to people's resistance to change in business. Read his conclusion:
Change is hard. We should re-think the way we implement PLM and exclude process alignment from PLM implementation. Stop changing people and stop forcing people to take complicated decisions during PLM sales process. Future PLM products will become a foundation for agile change management that will be done by companies.
Edward Lopategui is even more provocative in his blog post: The PLM Old Fart Paradox. Have a read of his post, including the comments. Edward somehow shares the same belief, stating that PLM has an identity crisis:
PLM has an identity crisis. Talking PLM at a random networking event tends to engender one of two reactions. The first is from anyone who recognizes the acronym, spent 5 years consulting for company X, and begins a vigorous head-nod that instills fear their neck may unhinge in agreement. The other reaction is quite the opposite; you can almost sense a capillary dilation of the so-called blush response. Fluctuation of the pupil… Involuntary dilation of the iris… it’s the Voight-Kampff test for interest expiring at the mere utterance of the acronym. You don’t get this kind of reaction when you talk Cloud or Internet of Things, which while overused, tend to at least solicit questions and interest among the uninitiated. There’s public relations work to be done.
Both Oleg and Edward believe that new technology is needed to overcome the old PLM implementation issues: a need for change, a need to break down the silos.
Meanwhile in Europe
Meanwhile in Europe, an international research foundation for PLM (http://www.plm-irf.org/) has been initiated and is making itself heard towards the United States. What is the mission of this research foundation? To define the future of PLM. Read the opening statement:
The PLM International Research Foundation (PLM-IRF) initiative aims to establish a central mechanism to support global research into the most advanced future capabilities of PLM.
This is the first initiative ever to ask the question:
“What research does the world need, to achieve the future PLM capabilities that the world wants?”
This simple question highlights the fact that the PLM industry needs a coherent view of the future. Without a clear sense of direction, PLM development is likely to fall far short of what it could be.
I consider this a mission impossible. In May this year I will have been blogging about PLM for seven years, and looking back at my early posts, the world was different. Interestingly, some of the predictions I made in the past (PLM in 2050 – predictions from 2008) are still valid; however, for every right prediction there might be a wrong one too.
And now this International Research Foundation is planning to define what PLM should offer in the future?
What happens if companies do not agree and implement their own business approach? It reminded me of a keynote speech given by Thomas Schmidt (Vice President, Head of Operational Excellence and IS at ABB's Power Products Division) at PLM Innovation 2012 (my review here). Thomas challenged the audience by explaining what ABB needed. Quoting Thomas Schmidt:
“And if you call this PLM, it is OK for me. However, current PLM systems do not satisfy these needs.”
So you can imagine the feeling I got: PLM has an identity crisis.
Or do I have an identity crisis?
I believe we are in a transition state where companies have to redefine their business. I described this change in my earlier post: From Linear to fast and circular. Implementing this approach first of all requires a redefinition of how organizations work. Hierarchical and siloed organizations need to transform towards flat, self-adapting structures in order to become more customer-centric and reactive to ever faster-changing market needs.
For that reason, I was surprised by a presentation shared by Chris Armbruster the same week I read Oleg's and Edward's posts. In many ways, Chris and I come from opposite sides of PLM.
My background is European, with a classical start in engineering and a focus on the mid-market. Chris, according to his SlideShare info, is US-based, a Supply Chain Executive with a focus on the Fortune 500.
Have a look at Chris's presentation: Rethinking Business for Exponential Times. It is amazing that two people who are not connected at all can come to the same conclusions.
This should be an indication there is a single version of the truth!
You might say PLM has an identity crisis. We do not need a better definition of PLM to solve this. We need to change our business model and then define what we need: PLM, ERP, SLM, MES, SCM, … There are enough unused TLAs for the future. And I am still happy with my job.
… and you? Looking for a new job, or changing too?
Three weeks ago there was the Product Innovation conference in Düsseldorf. In my earlier post (here) I described what I experienced during this event. Now, after all the information has somehow been digested, here is a more high-level post, describing the visible change in business and how it relates to PLM. I try to describe this change not in academic wording but in images. Therefore, I captured the upcoming change in the title: from linear to circular and fast.
Let me explain this image step by step
In the middle of the previous century, we were thinking linearly, in education and in business. Everything had a predictable path, and manufacturing companies were pushing their products to the market: first locally, later more globally. Still, the delivery process was pretty linear:
This linear approach is reflected in how organizations are structured and how they are aligned to the different steps of the product development and manufacturing process. Below is a slide I used at the end of the nineties to describe the situation and the pain: a lack of visibility of what happens overall.
It is discouraging to see that this situation still exists in many companies.
At the end of the nineties, early 2000s, PLM was introduced, conceptually managing the whole lifecycle. In reality, it was mainly a tighter connection between design and manufacturing preparation, pushing data into ERP. The main purpose was managing the collaboration between different design disciplines and dispersed teams.
Jim Brown (Tech-Clarity) wrote at that time a white paper, which is still valid for many businesses, describing the complementary roles of PLM and ERP. See the picture below:
Jim introduced the circle and the arrow. PLM, a circle with iterations, interacting with ERP, the arrow for execution. Here it already became visually clear that an arrow does not have the same behavior as a circle. The 100 % linearity in business was gone.
Let's have a closer look at the PLM circle
This is how PLM is deployed in most organizations:
Information is pushed into the ERP system as disconnected information, no longer managed and connected to its design intent.
Next, the ERP system is most of the time not well equipped for managing after-sales and service content. Another disconnect comes up.
Yes, spare parts could be ordered through ERP, but issues appearing at the customer base are not stored in ERP, often stored in a separate system again (if stored beyond email).
The result is that when working in the concept phase, there is no information available for R&D to get a good understanding of how the market or customers work with their product. So how good will the next product be? Check in your company how well your R&D is connected with the field.
And then the change started …
This could have remained reality for a long time if there were not a huge business change coming. The world is becoming digital and connected. As a result, local inefficiencies or regional underperformance will be replaced by better-performing companies: the Darwin principle. And most likely the better-performing companies will come from the emerging markets, as there they do not suffer from historical processes and "knowledge of the past". They can step into the digital world much faster.
In parallel with these fast-growing emerging markets, we discovered that we have to reconsider the way we use our natural resources to guarantee a future for the next generations. Instead of wasting resources to deliver our products, there is a need to reuse materials and resources, introducing a new circle: the circular economy.
The circular economy can have an impact on how companies bring products to the market. Instead of buying products (CAPEX) more and more organizations (and modern people) start using products or services in a rental model (OPEX). No capital investment anymore, pay as you go for usage or capacity.
The digital and connected world can have a huge impact on the products or services available in the near future. You are probably familiar with the buzz around “The Internet of Things” or “Smart and Connected”.
No longer do products depend on mechanical behavior only; more and more products rely on electrical components with adaptive behavior through software. Devices that connect with their environment report information back to the manufacturer. This allows companies to understand what happens with their products in the field and how to react to that.
Remember the first PLM circle?
Now we can create continuity of data!
Combine the circular economy with the digital and connected world, and you will discover everything can go much faster. A crucial inhibitor is how companies can reorganize themselves around this faster-changing, circular approach. Companies need to understand and react to market trends in the fastest and most adequate way. The future will probably be about lower volumes of the same products, higher variability towards the market and, most likely, more and more combining of products with services (the Experience Model). This requires a flexible organization and most likely a new business model, which will differ from the sequential, hierarchical organizations that we know at this moment.
The future business model?
The flexibility in products and services will come more and more from embedded software, or will be supported by software services. Software services will increasingly be cloud-based, to avoid IT complexity and provide scalability.
Software development and integration with products and services are already a challenge for classical mechanical companies. They are struggling to transform their mechanical-oriented design process towards support for software. In the long-term, the software design process could become the primary process, which would mean a change from (sequential – streamlined) lean towards (iterative – SCRUM) agile.
Once again, we see the linear process becoming challenged by the circular iterations.
This might be the end of lean organizations, potentially having to mix with agile concepts.
Whether it was a coincidence or not, I cannot judge; however, during the PI Conference I learned about W.L. Gore & Associates, with their unique business model supporting this more dynamic future. There is no need for a massive re-org to align the business, as the business is aligning itself all the time through its employees.
Last weekend, I discovered Semco Partners in the newspaper and I am sure there are more companies organizing themselves to become reactive instead of linear – for sure in high-tech world.
Linearity is disappearing in business, it is all about reactive, multidisciplinary teams within organizations in order to support customers and their fast changing demands.
Fast reactions need new business organization models (flexible, non-hierarchical) and new IT support models (business information platforms – no longer PLM/ERP system thinking).
What do you think? The end of linear?
I have talked enough about platforms recently. Still if you want to read more about it:
Engineering.com: Product Innovation Platform: Plug'n'play next generation PLM
Gartner: Product Innovation Platforms
VirtualDutchman: Platform, Backbone, Service Bus or BI
This is the fifth year that Marketkey organized their vendor-independent conference in Europe around Product Innovation, where PLM is the major cornerstone. Approximately 100 companies attended this conference, coming from various industries. As there were most of the time two to four parallel tracks (program here), it will still take time for me to digest all the content. However, here is a first impression and a comparison with what has changed since the PI Conference in 2014 – you can read my review of that conference here.
First of all, the keynote speeches for this conference were excellent and were a good foundation for attendees to discuss and open their minds. Secondly, I felt that this conference was actually dealing with the imminent shift from classic, centralized businesses towards a data-centric approach, connecting information coming from anyone / anything connected. Naturally the Internet of Everything (IoE) and the Internet of Things (IoT) were part of the discussion, combined with changing business models: moving from delivering products towards offering services (CAPEX versus OPEX).
Some of the highlights here:
The first keynote speaker was Carlo Ratti, Director of the MIT Senseable City Lab. He illustrated through various experiments and examples how, being connected through devices, we can change and improve our world: tagging waste, mobile phone activity in a city and the Copenhagen Wheel. His main conclusion (not a surprise): for innovation there is a need to change collaboration. Instead of staying within company / discipline boundaries, solving problems through collaboration between different disciplines will lead to different thinking. How is your company dealing with innovation?
The second session I attended was John Housego from W.L. Gore & Associates, who explained the company's model for continuous growth and innovation. The company's future is not based on management but on the leadership of people working in teams in a flat organization. Every employee is an associate, directly involved and challenged to define the company's future. Have a read about the company's background here on Wikipedia.
Although the company is 50 years old, I realized that their cultural model is a perfect match with the future of many businesses. More and more companies need to be lean and flexible and support direct contact between the field, customers, market and experts inside the company. Implementing a modern PLM platform should be “a piece of cake” if the technology exists, as W.L. Gore’s associates will not block the change if they understand the value. No silos to break down.
My presentation, “The Challenge of PLM Upgrades as We See the Rules of Business Change”, was built around two themes (perpetual software? / seamless upgrades?) and from there looked towards what to expect in business in the future. When we look back, we see that every 10 years there is a major technology change, which makes the past incompatible to upgrade. Now we are dreaming that cloud-based solutions are the future guarantee of seamless upgrades (let's wait 10 years). In my opinion, companies should not consider a PLM upgrade at this moment.
The changes in business models, people behavior and skills plus technology change, will enable companies to move towards a data-centric approach. Companies need to break with the past (a linear, mechanical-design-based, product development approach) and redesign a platform for the future (a business-innovation platform based on the data). In my upcoming blog post(s) I will give more background on this statement.
Trond Zimmerman from Volvo Group Trucks explained the challenges and the solution concept they are currently implementing, answering the challenge of working in a joint venture with Dongfeng Commercial Vehicles. As in a joint venture you want to optimize the sharing of common parts, yet you cannot expect a single PLM solution for the total joint venture. For that reason, Volvo Group Trucks is implementing Share-A-Space from Eurostep to have a controlled collaboration layer between the two joint venture partners.
This is, in my opinion, one of the examples of future PLM practices, where data will not be stored in a single monolithic system, but connected through information layers and services. The case is similar to what was presented last year at Product Innovation 2014, where Eurostep and Siemens Industrial Turbomachinery implemented a similar layer on top of their PDM environment to enable controlled sharing with their suppliers.
David Rowan from wired.co.uk closed the day with his keynote: Understanding the New Rules of Product Innovation. He somehow touched the same topic as John Housego from W.L. Gore: it is all about democratization. Instead of hierarchy, we are moving to network-based activities, and this approach has a huge impact on businesses. David's message: prepare for constant change. Where in the past we lived in a “linear” century, with change according to Moore's law, we are now entering an exponential century where change goes faster and faster. Besides examples of the Internet of Things, David also gave some examples of the Internet of Stupid Things. He showed a quote from Steve Ballmer stating that nobody would pay $500 for a phone (the iPhone). The risk of calling some of these inventions stupid is that such a claim might become an embarrassing quote in the future. I think the challenge is always to stay open-minded without judging, as in the end the market will decide.
PLM and ERP
I spent the evening networking with a lot of people, most of them excited about the future capabilities that had been presented. In parallel, the discussion was also about the conservative behavior of many companies. Topics that have been under discussion for ten years already – how to deal with and connect PLM and ERP, where is the MBOM, what are the roles of PLM and ERP in an organization – are still rewarding topics for a discussion, showing where most companies now stand with their business understanding.
In parallel to a product innovation conference apparently there is still a need to agree on basic PLM concepts from the previous century.
The second day opened with an excellent keynote speech from Dirk Schlesinger from Cisco. He talked about the Internet of Everything and provided examples of the main components of IoE: Connectivity, Sensors, Platform, Analytics, and Mobility. In particular, the example of Connectivity demonstrated the future benefits modern PLM platforms can bring. Dirk talked about a project with Dundee Mining where everything in the mine was tagged with RFID devices (people, equipment, vehicles, and resources) and the whole mine was equipped with Wi-Fi.
Based on this approach the execution and planning of what happened was done in their HQ through a virtual environment, giving planners immediate visibility of what happens and allowing them to decide on real data. This is exactly the message I have posted in my recent blog posts.
The most fascinating part was the reported results. This project has been ongoing for three years now, and in the first year they achieved a production increase of 30 %. This year they are aiming for a 400 % production increase and a 250 % efficiency increase. These are the numbers to imagine when you implement a digital strategy. It is no longer about making our classical processes more efficient; it is about everyone being connected and everyone collaborating.
Marc Halpern from Gartner gave a good presentation connecting the hype of the Internet of Things with the world of PLM again, talking about Product Innovation Platforms. Marc also touched on the (needed) upcoming change in engineering processes. More and more we will develop complex products, which need systems thinking: Systems of Systems to handle this complexity. As Marc stated: “Product, process, culture is based on electro-mechanical products where the future trend is all about software.” We should reconsider our Bill of Materials (mechanical) and probably think more about a Bill of Features (software). Much of Marc's presentation contained the same elements as I discussed in my PDT2014 blog post from October last year.
I was happy to see Jenni Ala-Mantila presenting the usage of a PLM system at Skanska Oy. Skanska is one of the largest construction companies operating globally. See one of their beautiful corporate videos here. I have always been an advocate of using PLM practices and PLM infrastructure to enhance, in particular, data continuity in a business where people work in silos with separate tools. There are so many benefits to gain from having end-to-end visibility of the project and its related data. Jenni's presentation confirmed this.
By implementing a PLM backbone with a focus on project management, supplier collaboration and risk management, she confirmed that PLM has contributed significantly to their Five Zero vision: Zero loss-making projects, Zero environmental incidents, Zero accidents, Zero ethical breaches and Zero defects. Skanska is really a visionary company, although it was frustrating to learn that there was still a need to build a SharePoint connection with their PLM environment. The future of data-centric has not reached everyone in the organization yet.
The last two sessions of the conference, a panel discussion “Why is Process Innovation Challenging & What can be done about it” plus the final keynote “Sourcing Growth where Growth Takes Place” had some commonality which I expressed in some twitter quotes:
Where last year I had the impression that the PLM world was somehow in a static mode (not so much news in 2014), it became clear at this 2015 conference that the change towards new business paradigms is really happening, and at a faster pace than expected: from mechanical development processes to software processes, from linear towards continuous change. More to come this year.
This time I would like to receive some feedback from my readers, as I believe the topic I am discussing here might be similar to a PLM / ERP discussion – a discussion between religions. I have preached the past two years a more data-centric approach for PLM, instead of file management, and related to this data-centric approach, the concept of a PLM platform (Business Platform – CIMdata / Innovation Platform – Gartner) becomes clear.
What's the issue?
As I wrote in my earlier post (random PLM future thoughts), I realized that talking about platforms is not that straightforward when meeting companies with their own history and terminology. Some claim they are already using a business platform; others have no clue what makes a platform different from their current PLM implementation. Therefore I will summarize the different approaches I have seen in my network and give a non-academic opinion as a base for discussion. Looking forward to your opinion.
The platform approach
My definition of a PLM platform:
- A central repository of data based on a core data model. Information is stored as data in a unique way
- On top of this repository, applications can run, using a subset of the overall data elements, providing dedicated functionality and a user interface to a particular user / role
- Access to the platform is provided through web technology. Storage could be in the cloud.
- External applications and data can be connected through an open (standardized?) API, embedded or federated
- The PLM platform can be a collection of services and functionality coming from various vendors / suppliers – the app store concept
- The platform approach is THE DREAM for business, being flexible to combine and edit data in any desired context in dedicated apps / environments
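To make the bullet points above concrete, here is a minimal sketch in Python (all names hypothetical, not any vendor's API) of the core idea: one shared repository with a unique data model, and role-specific apps that each consume only a subset of it.

```python
# Toy sketch of the platform approach: a central repository holds every
# item once, and dedicated apps query only the attributes their role needs.

class Repository:
    """Central store: information is stored as data, in a unique way."""
    def __init__(self):
        self._items = {}  # item id -> dict of attributes

    def put(self, item_id, **attributes):
        self._items.setdefault(item_id, {}).update(attributes)

    def query(self, *attribute_names):
        """An app only sees the subset of attributes it asks for."""
        return {
            item_id: {k: v for k, v in attrs.items() if k in attribute_names}
            for item_id, attrs in self._items.items()
        }

class BomApp:
    """A role-specific app running on top of the shared repository."""
    def __init__(self, repo):
        self.repo = repo

    def view(self):
        return self.repo.query("part_number", "quantity")

repo = Repository()
repo.put("P-001", part_number="P-001", quantity=4, cad_file="bracket.sldprt")
print(BomApp(repo).view())  # the BOM app never sees the CAD attribute
```

Note that the only contract between apps and repository is the core data model itself, which is exactly where the openness question raised below comes in.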
In the PLM world, Dassault Systèmes with their 3DExperience approach is following this trend, although here you might argue about the ease of adding external apps to this platform – is it open? Aras and Autodesk might also claim they have a PLM platform, where you might question the same, and whether the depth of the data model and the solutions provided on top of it are mature enough. Finally, SAP can also be considered a platform, but I would not name it a PLM platform at this moment in time. An important question for me would be: how can we achieve openness of a PLM platform?
The PLM backbone approach
My definition of a PLM backbone:
- The core PLM functionality is provided by a single, proprietary PLM system
- Additional functionality that is not part of the core development (acquisitions) is connected to the backbone through proprietary interfaces
- External authoring tools are linked to the backbone through integrations or interfaces which could be developed by third parties
- External systems can interface with the PLM backbone through open interfaces
- The PLM backbone is THE DREAM for engineering, as historically this was the domain where PLM started to be implemented
I would consider Siemens and PTC (see picture) the best examples of the PLM backbone approach with their PLM portfolios. Teamcenter and Windchill are both rich PLM systems, further connected to several systems covering the product lifecycle. I am not expert enough to state that the same conclusion is valid for Oracle's Agile, where I believe the backbone is bigger than the PLM system. What do you think? Will these PLM vendors also move to a platform approach? And what will be the platform?
The Service Bus approach
My understanding of the Service Bus (I am not an IT-expert):
- The Service Bus has a standardized interface to request data or to post data that needs to be stored in other systems
- The Service Bus approach reduces the number of (custom) interfaces between systems by requiring standardized inputs and outputs per system
- To provide a user with information that is not entirely available in a single system, the service bus needs to acquire the data from other systems, which might not give the high performance expected by business people
- The Service Bus is the IT DREAM, as it simplifies the complexity for IT of managing point-to-point solutions between systems and makes an upgrade strategy easier to support.
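As a rough illustration of the bullets above (a toy sketch, not a real ESB product; message types and system names are invented), the bus decouples requesters from systems: systems subscribe to standardized message types, and one request fans out to all subscribers while the data itself stays where it lives.

```python
# Toy sketch of the service bus approach: instead of point-to-point
# interfaces, systems register handlers for standardized message types.

class ServiceBus:
    def __init__(self):
        self._handlers = {}  # message type -> list of system handlers

    def subscribe(self, message_type, handler):
        self._handlers.setdefault(message_type, []).append(handler)

    def request(self, message_type, payload):
        """Collect an answer from every system serving this message type."""
        return [handler(payload) for handler in
                self._handlers.get(message_type, [])]

bus = ServiceBus()
# Two systems expose the same standardized "part.lookup" interface:
bus.subscribe("part.lookup", lambda payload: {"system": "PDM", "revision": "B"})
bus.subscribe("part.lookup", lambda payload: {"system": "ERP", "stock": 12})

# One request reaches both systems; each keeps its own data.
print(bus.request("part.lookup", {"part": "P-001"}))
```

The performance concern mentioned above is visible even in this sketch: every request triggers a live call into each connected system, rather than reading from one persisted store.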
From a very high-level view, the service bus approach has some similarities to a platform. The service bus concept allows business to select the systems they like the most (provided they connect to the service bus) – Image property of IBM.com
The main difference would be the persistence of information, where is the real data stored? I came across the service bus approach more often in the past, where the target was most of the time to integrate the PDM functionality (PLM as an enterprise solution was never in scope here).
For the Service Bus approach, I am curious to learn its relevance for future PLM implementations as the challenge would be to provide any user in the company with the relevant information in context. Is the service bus going to be replaced by the platform? Who would be the major players here?
The Business Intelligence approach
This method I discovered in project-centric companies (Oil & Gas companies, EPCs, construction companies), but strangely enough also at some manufacturing companies, where I would assume integration of systems would bring large benefits.
- Each type of information is managed in only one single system, avoiding interfaces or duplication of data
- Only where needed, data will be pushed from one system to other systems
- Business Intelligence applications extract information from the relevant systems and present it in context to the user, giving him/her a better understanding
- Business users will have to work in multiple systems to complete their tasks
- The BI approach is the ULTIMATE IT DREAM, as it simplifies IT's work dramatically and shuts down business demands.
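The BI pattern above can be sketched as a read-only aggregation layer (a minimal sketch; the system names are illustrative, not real products): each system remains the single owner of its data, and the BI layer only pulls and presents it in context.

```python
# Toy sketch of the Business Intelligence approach: no data is moved or
# duplicated; a read-only report pulls from each owning system on demand.

def bi_report(part_id, systems):
    """Aggregate a per-part view across systems, purely for presentation."""
    report = {"part": part_id}
    for name, lookup in systems.items():
        report[name] = lookup(part_id)
    return report

# Each "system" is represented here by a read-only lookup function:
systems = {
    "PDM":  lambda pid: {"revision": "B"},
    "ERP":  lambda pid: {"on_hand": 12},
    "DocM": lambda pid: {"documents": 3},
}
print(bi_report("P-001", systems))
```

The contrast with the platform approach is that the report is a one-way extract: users still have to go back into each source system to actually change anything.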
I have seen an example where IT dictated that for document management product ABC is used (a well-known content management system) and for internal documents SharePoint. For CAD, product PQR is used as much as possible (heavily adapted), or AutoCAD 2D (to support the minimum). For ERP, the standard system is XYZ (a famous ERP system – you do not lose your job by selecting them), and of course everyone uses Excel as the common interface of information between people.
It was impossible in this company to have a business view on the solution landscape. As you can imagine, this company’s margins are not (yet) under pressure as their industry is very conservative.
What do you think?
Is the future for PLM in platforms? If Yes, what about openness? Who are the candidates to offer such a platform? Or will lack of industry standards and openness block wider adoption? If No, will there be a massive PLM system in the future, connected to other enterprise systems (ERP/CRM)? Or will PLM be implemented as a collection of smaller systems communicating through an enterprise service bus?
I am looking forward to discussing this topic here and soon during the upcoming Product Innovation conference in Düsseldorf.
Currently, I am preparing my sessions for the upcoming Product Innovation conference in Düsseldorf. See: www.picongress.com. My first session will be about PLM upgrades and how to deal with them in the future. It is a challenging topic, as some PLM vendors claim that with their product there will be no upgrade problems, and cloud-based solutions also promise seamless upgrades in the future.
Don't cheer too early when you see these kinds of messages. I had the chance to look back at what happened with PLM the past twenty years and tried to look forward to what might happen in the upcoming ten years.
This led to some interesting thoughts that I will share in detail during the conference, and I will come back to this topic in this blog after the conference. Here are some unstructured thoughts that passed my mind recently when preparing this session.
Not every upgrade is the same!
First, there was an interesting blog post from Ed Lopategui from E(E) with the title There is No Upgrade, where he addresses the difference between consumer software and enterprise software. Where consumer software will be used by millions and tested through long alpha and beta cycles, PLM software often comes to the market in what you could consider a beta stage, with limited testing.
Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customers' demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework.
In addition, when a solution is not 100 % complete, there will be adaptations from the customer, making later upgrades not 100 percent guaranteed or compatible. More details on PLM upgrades after the conference; let's now look into the near future.
The Future of PLM resides in Brussels!
Some weeks ago I was positively amused by some messages coming from Roger Tempest (PLM Interest Group) related to the future of PLM. Roger claims the PLM industry is effectively rudderless. For that reason, Roger announced the launch meeting for the PLM International Research Foundation,
“simple because such a platform does not yet exist.”
I checked if perhaps an ERP International Research Foundation existed, but I only found references to SAP. So what makes the PLM International Research Foundation unique?
According to Roger, the reason behind this initiative is the lack of clear targets for PLM. I quote:
The lack of detailed thought means that many future possibilities for PLM are just not being considered; and the lack of collective thought means that even the current initiatives to improve PLM remain fragmented and ineffective
As I mentioned in the previous paragraph, PLM vendors are in a kind of rat race to keep up with market demands and rapidly changing business, meanwhile building on their core technology. Not an easy game, as they cannot start from scratch, but for sure, and here I agree, they do not optimize their portfolio.
Who can and will take part in such a research forum?
The same holds for companies implementing PLM systems. They are looking for solutions in the market that improve their businesses. This might be a PLM system, but perhaps other components bring even higher value. Is ALM or SLM part of PLM, for example? This is a challenge: who defines what PLM is, and where are the boundaries?
This leaves the activity to the academics; for sure, they will have the most advanced and futuristic vision of what is possible conceptually. From my observations, the main challenge currently with PLM is that even the vendors are ten years ahead in their capabilities compared to what most companies are asking for. When it comes to the academic approach, I still have to think of Monty Python’s sketch related to soccer. See below.
Sorry for the generalization, but I believe we should not focus on what PLM is and how it should be defined. What we now call PLM is entirely different from what we called PLM 10 years ago; see my last year´s post PLM is changing. I think the future should focus on how we are going to deal with business platforms, which contain PLM facets.
The PLM future
Interestingly enough, we are on the brink of a new business paradigm due to globalization and digitization, as you might have read in my recent posts. There are analysts, consultancy firms and research foundations all describing this challenging future.
Have a look at Verdi Ogewell’s article at Engineering.com: Product Innovation Platform: Plug’n’play next generation PLM. The article is a summary of the platform discussion during the PDT 2014 conference, which I consider one of the best conferences if you want to go into the details. See also my post: The weekend after PDT 2014.
The future is about innovation and/or business platforms where data is available through a federated approach, not necessarily based on a single, monolithic PLM platform.
Focusing on the standardization and openness of such a platform is, for me, our central mission.
Remember: Openness is a right, not a privilege.
Let PLM vendors and other application providers develop their optimized services for individual business scenarios that will remove the borders of system thinking. Academic support will be needed to solve interoperability and openness required for initiatives like Industry 4.0 and IDC´s third platform.
I am looking forward to interesting discussions at the upcoming PI conference, but also with peers in my network.
The future is challenging; will it still be named PLM?
A PLM-twisted mind never rests, not even during these Xmas seasonal holidays, when everything else comes to rest. The dark Christmas days, here in the Netherlands, are the days to share with your family and with others who need your support. For a short time, we focus on kindness, charity and what matters for humanity.
Back to our purpose you might say. This year Pope Francis brought this message very aptly to his cardinals – read it here if you have not heard about it yet.
Next, my PLM-twisted mind started ringing all kinds of Xmas bells. The pope is talking about PLM! Instead of focusing on your business silo, your personal kingdom, we have to focus on the original purpose of the company, not of the individual person. Forget politics, back to the mission!
Then I realized there is a paradox within PLM. PLM is a must-have or must-do in a capitalistic world, as through PLM companies can become more competitive than others, win market share and become the market leader.
Nothing social. It is the basis for survival in this global world. When your company is not funded by the government, you have to be competitive to survive. Your business needs to make enough money to keep on innovating and stay in business. This is why companies need PLM.
The paradox, however, is that effective PLM implementations are all based on the concept of sharing: sharing data in the early ideation phases through crowdsourcing and open innovation, and sharing internally, with partners and with potential customers. Next, the development, delivery and maintenance phases of the lifecycle all perform ideally when information is shared and flows across the value chain without being locked in silos. The current hype of IoT (Internet of Things) is about sharing data.
So to be a successful, profitable company, you need to go back inside your business to the roots of sharing (data). An interesting paradox, isn’t it?
Therefore, I wish you all a PLM Pope in your company: someone who will explain the mission, break down the holy houses, and make sure PLM gets implemented successfully.
I wish you all a happy and successful 2015
with a lot of sharing
p.s. Should I see a shrink for my PLM-twisted brain?