You are currently browsing the tag archive for the ‘PLM’ tag.
In my series of blog posts related to the PLM data model, I talked about Products, BOMs and Parts. This time I want to focus on the relation between the EBOM and CAD Documents. This topic became relevant with the introduction of 3D CAD.
Before companies were using 3D CAD systems, there was no discussion about EBOM or MBOM (to my knowledge). Engineering produced drawings for manufacturing, and not every company was using the mono-system (a specifying drawing for each individual part). Drawings were mainly made to assist production, and making a drawing for every individual part was a waste of engineering time. Parametric drawings were used to specify similar parts. But now we are in the world of 3D!
With the mainstream introduction of 3D CAD systems in the nineties (SolidWorks, Solid Edge, Inventor) came the need for PDM systems to manage the individual files of a CAD assembly. The PDM system was necessary to manage all the file versions. Companies designing simple products sometimes kept working file-based, introducing the complexity of how to name files and how to deal with revisions. Ten years ago I was investigating data management for the lower tiers of the automotive supply chain. At that time, 60% of the suppliers using CATIA were still working file-based. Data management was considered an extra complexity, yet file version control was a big pain.
This has changed for several reasons:
- More and more OEMs were pushing for more quality control of the design data (read PDM)
- Products became more modular, which means assemblies can be used as subassemblies in other products, pushing the need for where-used control
- Products are becoming more complex and managing only mechanical CAD files is not enough anymore – Electronics & Software – mechatronics – became part of the product
Most PDM systems at that time (I worked with SmarTeam) saved the 3D CAD structure as a quantity-based document structure, closely resembling the structure called the EBOM.
This is one of the most common mistakes made in PLM implementations.
The CAD structure does not represent the EBOM !!!
Implementers started to build all kinds of customizations to automatically create a Part structure, the EBOM, from the CAD structure. Usually these customizations ended up as a mission impossible, in particular when customers started to ask for bidirectional synchronization. They expected that when a Part is removed from the EBOM, it would be deleted in the CAD assembly too.
And then there was the issue that companies believed the CAD Part ID should be equal to the Part ID. This might be possible for certain types of designed parts, but it no longer works for flexible parts, such as a tube or a spring. When such a Part is modeled in a different position, it creates a different CAD Document, breaking the one-to-one relation.
Finally, another common mistake I have seen in many PDM implementations is the addition of glue, paint and other manufacturing-type parts to the CAD model, in order to generate a BOM directly from the CAD.
From the data model perspective it is important to understand that Parts and CAD Documents are different types of objects, in particular if you want to build a PLM implementation where data is shared across all disciplines. For a PDM implementation I care less about the data model, as the implementation is often not targeting enterprise continuity of data but only engineering needs.
A CAD Document (Assembly / Part / Drawing / …) behaves like a Document. It can be checked in and checked out any time a change is made inside the file. A check-in operation creates a new version of the CAD Document (in case you want to trace the history of changes).
Meanwhile, the Part specified by the CAD Document does not change version when the CAD Document is changed. Parts usually do not have versions; they remain in the same revision while the specifying CAD Document matures.
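The different behavior of CAD Documents and Parts can be sketched in a few lines of code. This is a minimal illustration, not a real PLM API; all class, attribute and identifier names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class CADDocument:
    """A CAD file (assembly / part / drawing) under version control."""
    doc_id: str
    version: int = 1

    def check_in(self) -> None:
        # Every check-in creates a new version of the document,
        # so the full history of file changes stays traceable.
        self.version += 1

@dataclass
class Part:
    """An EBOM part: it carries a revision, not a version, and stays
    in the same revision while its specifying CAD document matures."""
    part_id: str
    revision: str = "A"
    specified_by: list = field(default_factory=list)  # related CADDocuments

doc = CADDocument("CAD000456")
part = Part("PT000123", specified_by=[doc])

doc.check_in()  # intermediate save by the designer
doc.check_in()  # another design iteration
print(doc.version, part.revision)  # the document evolved, the part did not: 3 A
```

Because the Part and the Document are separate objects with separate maturity mechanisms, checking in the CAD file never touches the Part identity, which is exactly why mapping the CAD structure one-to-one onto the EBOM breaks down.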
Moving from PDM to PLM
For a PLM implementation it is important to think "Part-driven," which means starting from an initial EBOM, representing the engineering specification of the Product, and maturing the EBOM with more and more design specification data. Design specification data can be mechanical assemblies and parts, but also electrical parts. The EBOM of a PCB might come from the electrical design application, as you will not create every component in 3D in the mechanical model.
And once the electrical components are part of the EBOM, the part definition of embedded software can also be added to the BOM, for example if software needs to be uploaded into flash memory chips. By adding electrical and software components to the EBOM, the company gets a full overview of the design maturity of ALL disciplines involved.
The diagram below shows what an EBOM and its related Documents could look like:
This data model contains a lot of details:
- As discussed in my previous post – for the outside world (the customer) there is a product defined without revision
- Related to the Product there is an EBOM (Part assembly), simplified as a housing (a mechanical assembly), a connector (a mechanical part) and a PCB (a mechanical representation). All these parts behave like Mechanical Parts; they have a revision and a status.
- The PCB has a second representation based on an electrical schematic, which has (for simplification) only two electrical parts, a resistor and a memory chip. As you can see, these components are standard purchasable parts; they do not have a revision as they are not designed by the company.
- The Electrical Part Flash Memory has a relation to a Software Part, which is defined by object code (a zip file?) which in turn is specified by a software specification (not in the diagram). The software object code has a version, as software is version-managed most of the time; it does not follow the classical rules of mechanical design.
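The details above can be summarized in a small tree structure. Again a sketch with hypothetical identifiers, only to show how one EBOM can mix mechanical, electrical and software parts, where purchased standard parts carry no revision:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Set

@dataclass
class EbomPart:
    part_id: str
    discipline: str                 # "mechanical", "electrical" or "software"
    revision: Optional[str] = None  # standard purchasable parts have no revision
    children: List["EbomPart"] = field(default_factory=list)

# The simplified EBOM from the diagram: housing, connector and PCB,
# with the PCB carrying electrical parts and embedded software.
ebom = EbomPart("PT-PRODUCT", "mechanical", "A", children=[
    EbomPart("PT-HOUSING", "mechanical", "A"),
    EbomPart("PT-CONNECTOR", "mechanical", "A"),
    EbomPart("PT-PCB", "mechanical", "A", children=[
        EbomPart("PT-RESISTOR", "electrical"),        # standard part, no revision
        EbomPart("PT-FLASH", "electrical", children=[
            EbomPart("PT-OBJCODE", "software", "A"),  # embedded object code
        ]),
    ]),
])

def disciplines(part: EbomPart) -> Set[str]:
    """Collect every discipline in the structure: the EBOM is where the
    design maturity of ALL disciplines becomes visible in one place."""
    found = {part.discipline}
    for child in part.children:
        found |= disciplines(child)
    return found

print(sorted(disciplines(ebom)))  # ['electrical', 'mechanical', 'software']
```

A real PLM data model would of course add statuses, document relations and effectivity; the point here is only that one Part structure spans all disciplines.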
Again I reached my 1000 words, a sign to stop explaining this topic. For sure there are a lot of details to explain about this part of the data model too.
- A CAD structure is not an EBOM (it can be used to generate a part of the EBOM)
- CAD Documents and EBOM Parts behave differently. CAD Documents have versions; Parts do not have versions (most of the time).
- The EBOM is the place where all disciplines synchronize their data, providing during the development phase a single view of the design status.
Let me know if this was too abstract, and feel free to ask questions. Important for this series of blog posts is to provide a methodology baseline for a real PLM data model.
I am looking forward to your questions or remarks to spark up the discussion.
As described in my latest LinkedIn post, if you want to implement PLM successfully there are two important points to address from the implementation point of view:
- An explicit data model not based on system or tool capabilities, but on the type of business the company is performing. There is a difference between an engineering to order company, a build to order company and a configure to order company.
- In PLM (and business) it is all about enabling an efficient data flow through the organization. There is no ownership of data; it is about responsibility for particular content per lifecycle stage, combined with sharing.
Historically, PLM implementations started with capturing the CAD data and related EBOM, as this is what the CAD-related PLM vendors were pushing for, and this was often the biggest pain for the engineering department. The disadvantage of this approach is that it strengthens the silo-thinking process. The PLM system becomes an engineering tool instead of an enterprise system.
I believe that if you really want to implement PLM successfully in a company, you should start from a common product/part information backbone. This requires the right business objects and, therefore, the right data modeling. The methodology described below is valid for build to order and configure to order companies, and less applicable for engineering to order.
In a build to order company there are the following primary information objects:
- A Product (representing the customer view of what is sold to the outside world)
- An EBOM (representing a composition of Parts specifying the Product at a particular time)
- An MBOM (representing the manufacturing composition of the Product at a given time)
And, of course, for all these information objects there are related Documents of various types. In a more advanced setup, the specification document can be the source for individually extracted requirements (not in this post).
Let's follow an end-to-end scenario from a typical Build to Order company process.
A potential customer sends an RFP for a product they need. The customer RFP contains information about how the product should behave (Specification / Requirements) and how it should be delivered (packaging). A basic data model for this RFP would be:
Note the following details:
- All information objects have a meaningless number. The number is only there to support unique identification and later integration with other systems. The meaning should come from the other attribute data on the object and its relations. (A blog post on its own)
- Instead of the meaningless number, the Product can carry the number provided by the customer. However, if this number is not unique to the company, it might be just another attribute of the Product.
- In general, Products do not have revisions. In time, there might be other BOMs related to the Product. Not in this post: Products might have versions and variants, and Products might be part of a product family. In this case, I used a classification to define a classification code for the Product, allowing the company to discover similar products delivered for different customers. This promotes reuse of solutions and reuse of lessons learned.
- The customer object represents the customer entity and by implementing it as a separate object, you will be able to see all information related to this customer quickly. This could be Products (ordered / in RFQ / etc.) but also other relevant information (Documents, Parts, …)
- The initial conceptual BOM for the customer consists of two sub-BOMs. As the customer wants the products to be delivered in a 6-pack, a standard 6-pack EBOM is used. Note: the Status is Released and a new conceptual EBOM is defined as a placeholder for the BOM definition of the Product to design/deliver.
- And for all the Parts in the conceptual EBOM there can be relations towards one or more documents. Usually, there is one specifying document (the CAD model) and multiple derived documents (Drawings, Illustrations, …)
- Parts can have a revision in case the company wants to trace the evolution of a Part. Usually, when Form-Fit-Function remains the same, we speak about a revision; otherwise the change results in a new part number. As more and more of the managed information no longer resides on the part number itself, companies might want to use a new part number at any change, storing in an attribute what its predecessor was.
- Documents have versions and revisions. While people work on a document, every check-in / check-out moment can create a new version of the file(s), providing traceability between versions. Most of the time there will be a first released version at the end, which is related to the Part it specifies.
- Do not try to have the same ID and revision for Parts and Documents. In the good old days of 2D drawings this worked; in the world of 3D CAD it is not sustainable and leads to complexity for the user. Preferably, the Part and the specifying Document should have different IDs and different revision mechanisms.
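The revision-versus-new-number rule from the list above can be expressed as a simple decision. The helper below is hypothetical (no PLM system exposes exactly this call) and assumes single-letter revisions and a numeric part number, purely for illustration:

```python
def next_part_identifier(part_id: str, revision: str, fff_changed: bool) -> dict:
    """Apply the Form-Fit-Function rule (hypothetical helper).

    If form, fit and function remain the same, the change is a new
    revision of the same part. Otherwise a new part number is issued,
    recording the predecessor in an attribute."""
    if not fff_changed:
        # Same Form-Fit-Function: bump the revision (A -> B -> C ...)
        return {"part_id": part_id,
                "revision": chr(ord(revision) + 1),
                "predecessor": None}
    # Form-Fit-Function changed: issue a new (meaningless) part number
    new_id = "PT%06d" % (int(part_id.lstrip("PT")) + 1)
    return {"part_id": new_id, "revision": "A", "predecessor": part_id}

print(next_part_identifier("PT000123", "A", fff_changed=False))
# {'part_id': 'PT000123', 'revision': 'B', 'predecessor': None}
print(next_part_identifier("PT000123", "B", fff_changed=True))
# {'part_id': 'PT000124', 'revision': 'A', 'predecessor': 'PT000123'}
```

Note the Part keeps its meaningless number across revisions, while a Form-Fit-Function change produces a new identity with a traceable predecessor, exactly the behavior the bullets above describe.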
And the iterations go on:
Now let's look at the final stage of the RFQ process. The customer has requested to deliver the same product also in single (luxury) packaging, as this product will be used for service. Although it is exactly the same physical product to produce, the product ID should be different: if the customer wants unambiguous communication, they should use a different product ID when ordering the product for service than when ordering it for manufacturing. The data model for this situation will look as follows (assuming the definitions are done):
Note the following details:
- The Part in the middle (with the red shadow), PT000123, represents the same part for both the product ordered for manufacturing and the product ordered for service, making use of a single definition for both situations
- The Part in the middle has now a large set of related documentation. Not only CAD data but also test information (how to test the product), compliance information and more.
- The Part in the middle on its own also has a deeper EBOM structure which we will explore in an upcoming post.
I reached my 1000 words and do not want to write a book, so I will conclude this post. For experienced PLM implementers this is probably known information; for people entering the domain of PLM, either as a new student or coming from a more CAD/PDM background, an interesting topic to follow. In the next post, I will continue towards the MBOM and ERP.
Let me know if this post is useful for you – and of course – enhancements or clarifications are always welcome. Note: some of the functionality might not be possible in every PLM system, depending on its origin and core data model.
Two weeks ago I got this message from WordPress, reminding me that I started blogging about PLM on May 22nd, 2008. In some of my spare time during weekends, I began to read my old posts again and started to fix links that had disappeared.
Initially, when I started blogging, I wanted to educate mid-market companies about PLM. A sentence with a lot of ambiguities: how do you define the mid-market, and how do you define PLM? Already a good start for a boring discussion. And as I do not want to go into a discussion, here are my “definitions”
Warning: This is a long post, full of generalizations and a conclusion.
PLM and Mid-market
The mid-market companies can be characterized as having a low level of staff for IT and strategic thinking. Mid-market companies are do-ers, and most of the time they are good in their domain, based on their IP and their flexibility to deliver to their customer base. I have not met mid-market companies with a business vision of 5 years and beyond. Mid-market companies buy systems. They bought an ERP system 25-30 years ago (the biggest trauma at that time). They renewed their ERP system for the Y2K problem/fear, and they switched from the drawing board to a 2D CAD system. Later they bought a 3D CAD system, introducing the need for a PDM system to manage all the data.
PLM is for me a vision, a business approach supported by an IT infrastructure that allows companies to share, discover and connect product-related information through the whole lifecycle. PLM enables companies to react earlier and better in the go-to-market process. Better, by involving customer input and experience from the start in the concept and design phases. Earlier, thanks to sharing and involving other disciplines/suppliers before crucial decisions are made, reducing the number of iterations and the higher costs of late changes.
Seven years ago I believed that a packaged solution, combined with a pre-configured environment and standard processes, would be the answer for mid-market companies. PLM vendors currently have the same thought with cloud-based solutions: take it, use it as it is, and enjoy.
Here I have changed my opinion in the past seven years. Mid-market companies consider PLM as a more complex extension of PDM and still consider ERP (and what comes with that system) as the primary system in the enterprise. PLM in mid-market companies is often seen as an engineering tool.
LESSON 1 for me:
The benefits of PLM are not well-understood by the mid-market
To read more:
Globalization and Education
In the past seven years, globalization became an important factor for all types of companies. Companies started offshoring labor-intensive work to low-labor-cost countries, introducing the need for sharing product data outside their local and controlled premises. Also, acquisitions by larger enterprises and by some of the dominant mid-market companies introduced a new era of rethinking. Acquisitions introduced discussions about: what are the real best practices for our organization? How can we remain flexible, and meanwhile adapt and converge our business processes to be future-ready?
Here I saw two major trends in the mid-market:
Lack of (PLM) Education
To understand and implement the value of PLM, you need to have skills and understanding of more than just a vendor-specific PLM system. You need to understand the basics of change processes (Engineering Change Request, Engineering Change Order, Manufacturing Change Order and more). And you need to understand the characteristics of a CAD document structure, a (multidisciplinary) EBOM, the MBOM (generic and/or plant specific) and the related Bill of Processes. This education does not exist in many countries and people are (mis-)guided by their PLM/ERP vendor, explaining why their system is the only system that can do the job.
Interestingly enough, the most-read posts on my blog are about the MBOM and the ETO, BTO and CTO processes. This illustrates there is a need for proper, vendor-independent and globally accepted terminology for PLM.
Some educational posts:
Bill of Materials for Dummies – ETO ranked #1
ECR/ECO for Dummies ranked #2
BOM for Dummies – CTO ranked #4
BOM for Dummies: BOM and CAD ranked #7
The dominance of ERP
As ERP systems were introduced long before PLM (and PDM), these systems are often considered the core by the management of a mid-market company. All other tools should preferably be seen as an extension of ERP and, if possible, let's implement the ERP vendor's functionality to support PLM – the Swiss army knife approach – one tool for everything. This approach is understandable, as at board level there are no PLM discussions. Companies want to keep their "Let's do it" spirit and not reshuffle or reorganize their company according to modern insights on sharing. Strangely enough, you see in many businesses the initiative to standardize on a single ERP system first, instead of standardizing on a single PLM approach first. PLM can bring the global benefits of product portfolio management and IP sharing, where ERP is much more about local execution.
PLM is not understood at the board level; it is still considered a tool
Some post related to PLM and ERP
Where is the MBOM ? ranked #3
The human factor
A lot of the reasons why PLM struggles to become successful have to do with its broad scope. PLM has an unclear definition and, most importantly, PLM forces people to share data and work outside their comfort zones. Nobody likes to share by default. Sharing makes day-to-day life more complicated; sharing might create visibility on what you actually contribute or fix. In many of my posts, I described these issues from various viewpoints: the human brain, the innovator's dilemma, the way the older generation (my generation) was raised and used to work. Combined with the fact that many initial PLM/PDM implementations have created so many legacies, the need to change has become a risk. In the discussion and selection of PLM, I have seen many times that in the end a company decides to keep the old status quo (with new tools) instead of really having the guts to move toward the future. Often this was a result of investors not understanding (or not willing to see) the long-term benefits of PLM.
PLM requires a long-term vision and understanding, which most of the time does not fit current executive understanding (lack of education/time to educate) and priority (shareholders)
Many recent posts are about the human factor:
The digital transformation
The final and most significant upcoming change is the fact that we are entering a completely new era: from linear and predictable towards fast and iterative, meaning that the classical ways we push products to the market will become obsolete. The traditional approach was based on lessons learned from mechanical products after the Second World War. Now, through globalization and the importance of embedded software in our products, companies need to deliver and adapt products faster than the classical delivery process allows, as their customers have higher expectations and a much larger range to choose from. The result of this global competitiveness is that companies will change from delivering products towards a more and more customer-related business model (continuous upgrades/services). This requires companies to revisit their business and organization, which will be extremely difficult. Business and human change require new IT concepts – platforms? cloud services? big data?
Older enterprises, mid-market and large enterprises will be extremely challenged to make this change in the upcoming 10 years. It will be a matter of survival and I believe the Innovator´s Dilemma applies here the most.
The digital transformation is apparent as a trend for young companies and strategic consultants. This message is not yet understood at the board level of many businesses.
Some recent post related to this fast upcoming trend:
ROI (Return On Investment)
I also wrote about ROI – a difficult topic to address, as in most discussions related to ROI, companies talk about the costs of the implementation, not about the tremendously larger impact a new business approach or model can have once enabled through PLM. Most PLM ROI discussions are related to efficiency and quality gains, which are significant and relevant. However, these benefits are relatively small and not comparable with the ability to change your business (model) to become more customer-centric and stay in business.
Some of the ROI posts:
A (too) long post this time; however, perhaps a good post to mark 7 years of blogging and to use as a reference for the topics I briefly touched here. PLM has many aspects. You can do further reading through the links.
From the statistics it is clear that the education part scores best – see the rankings. For future posts, let me know by leaving a comment what you are looking for in this blog: PLM Mid-Market, Education, PLM and ERP, Business Change, ROI, Digitalization, or …??
Also I have to remain customer centric – thanks for reading and providing your feedback
I was sitting outside in the garden during Ascension Day, which is (still) a national holiday in the Netherlands (thank God). It was again nice and warm, and it made me think about the parallels between global warming and PLM.
Climate change has always existed if we look at the history of our planet. We started to talk about global warming when scientists indicated that this time the climate change is caused by human intervention. As a result of vast amounts of carbon dioxide emissions, a greenhouse effect started to become visible. When the first signals of global warming came up, environmentalists started preaching that we have to act NOW, before it is too late. Meanwhile, on the other side, people began arguing that it was just a coincidence, an opinion.
There is no scientific proof, so why worry?
In the past ten years, the signs and proofs of global warming have become evident, and climate conferences – filled with people who want to act and, on the other side, the blockers – try to create progress in the battle against global warming. In particular in Europe, governments and companies are becoming aware that they can contribute to a more sustainable society.
Not enough, according to the environmentalists and scientists. As our brains still operate mostly in a prehistoric mode (day-to-day survival, food, home, social status), slow changes and sustainability for next generations are not part of most people's concerns. And the people who make us aware of this lack of priority for sustainability are considered annoying, as they disrupt our lives.
Companies that have invested (heavily) in sustainable business models often have a challenging path to survival against traditional businesses, as the majority of consumers want cheap. Some examples:
- Energy: most power plants are heated by burning coal, as this is the cheapest option. Shale gas extraction became attractive because we need cheap fuel. Alternatives like solar, wind and others cannot compete on price as long as we do not pay for the damage to nature.
- Food: produced in bio-farms, where animal wellness or health is not part of the plan. The goal is to deliver xx kilos of meat for the lowest price. Alternatives like more natural ways of raising animals, or even revolutionary ways (the lab-grown hamburger), cannot currently compete on price unless we are willing to pay for it.
- The fashion industry, where deep down in its supply chains human beings are treated like slaves. When you buy a cheap garment, you know somebody has been suffering.
Governments sometimes subsidize or push sustainable technologies as they realize that something has to happen (most of the time for public opinion – their voters), but there is no consistent strategy, as liberals believe every form of support is against open competition. And as long as we let our prehistoric brain run our choices, the earth gets warmer, with the consequences becoming more and more visible.
We know we have to act, but we do not act seriously
Now let's switch to PLM. The association started when I saw Chad Jackson's retweet from Lifecycle Insights related to top PLM challenges.
Clearly the message illustrates that costs, time and technology have priority – not what PLM can really establish (even in the context of global warming).
PLM started at the end of the previous century, initially invented by some of the major CAD vendors: Dassault Systemes, PTC and Siemens. Five years later it was taken more seriously, as enterprise software vendors like SAP and Oracle also started to work on their PLM offerings. And some years ago, even the company most skeptical about PLM, Autodesk, began to sell a PLM offering.
So like global warming we can conclude: PLM is recognized, and now we can act.
The early adopters of PLM are also in a challenging situation. Their first PLM implementations were very much focused on an IT infrastructure, allowing data to flow through a global organization without disrupting the day-to-day business model too much. These implementations are now a burden to many of them: costly and almost impossible to change. Look at the PLM stories from some of the major automotive companies, like Daimler, JLR, PSA, Renault, Volvo Cars and more.
They are all somehow held hostage by their old implementations (as business continues); however, due to changing ownership, business models and technology, they cannot benefit from modern PLM concepts, as it would be a disruption.
Meanwhile, PLM has evolved from an IT infrastructure into a business-driven approach to support global, more flexible and customer-driven business processes. Younger companies now starting in Asia do not suffer from this legacy and establish themselves faster, based on the know-how of the early adopters.
And this is not only happening in the automotive industry. In recent years, I have seen examples in the Oil & Gas industry, the High-Tech industry (which in theory is relatively young) and the Manufacturing industry.
Coming back to the 2015 PLM challenges tweeted by Chad Jackson: it looks like they are related to time and costs. Apparently it is not clear what value PLM can bring to a company beyond efficiency gains (ERP/lean thinking). Modern PLM allows companies to change their business model, as I wrote recently: from linear to fast and circular. The PLM mission is no longer to support companies with product information from cradle to grave, but from cradle to cradle. Sustainability and becoming connected to customers are new demands: operational services instead of selling products, linked with the need for IoT to understand what is happening.
In 2015, the PLM discussion with executives is about purchasing technology instead of about the need to change the business for long-term survival. Most investors do not like long-term visions, as their prehistoric brains are tuned to be satisfied in the short term.
Therefore, as long as the discussion about PLM is about IT and infrastructure and not about business change, there will be a stall, identical to what happens with addressing global warming. Short-term results are expected by the stakeholders, trying to keep up the current model. Strategists and business experts are all talking about the upcoming digital era, similar to global warming.
We know we have to act, but we do not act seriously
When I posted a short version of this post on LinkedIn on Ascension Day, I got some excellent feedback which I want to share here:
Dieter de Vroomen (independent advisor, interim manager & neighbor) wrote me an email. Dieter does not have a PLM-twisted brain. Therefore I like his opinion:
PLM and Global Warming are both assumptions, mental constructs that we can make plausible with technology and data. Both mindsets save us from disasters through the use of technology. And that's what both sell. But is that what they produce, what we want? Apple and associates think vice versa, first making what we want and explaining the underlying technology later. I miss that with global warming, but certainly with PLM. That's why it sells so badly to CxO's.
I think the point Dieter is making is interesting, as he is a non-PLM guy, showing the way CxOs might be thinking. As long as we (PLMers) do not offer a packaged solution, an end-to-end experience, it is hard to convince the C-level. This is one of the significant differences between ERP (its purpose is clearly tangible) and PLM (see my post PLM at risk! It does not have a clear target).
A more motivating comment came from Ben Muis, consultant and entrepreneur in the fashion industry. We met at the PI Apparel 2013 conference, and I like his passion for bringing innovation to the fashion industry. Read his full comments on my LinkedIn post, as he has combined sustainability and PLM in his career. Two quotes from Ben:
As you may know I did quite a bit of work on how the fashion industry could and should be more sustainable in its approach. This was at a time where only a handful of people at best were willing to even think about this. Knowing that in reality the decisions around cost and commercialism were driving the agenda, I drew the conclusion that by improving processes within the industry I could actually cause a sustainability improvement that was driven by commercial desire.
Explaining how you can become involved in the bigger picture and for Ben it is the possibility to keep on working on his passion in a real-time world. And finally:
So there you have it… my reasons for initially thinking your title was very close to the reason I shifted my focus from pure sustainability advice to PLM implementations to begin with. I could drive a real result much quicker. This, as I am sure you will agree, in itself supports the reason for taking PLM seriously
The topics PLM and global warming have a lot in common. The awareness exists; however, when it comes to action, we are blocked by our prehistoric brain, thinking about short-term benefits. This will not change in the next 1000 years. Therefore, we need organizations and individuals that, against all odds, take the steep path and have a vision of change, breaking the old models and silos. It will cost money, it will require sacrifice, and the reward will only be noticed by the next generations. What a shame!
A final quote before going back to standard PLM matter in upcoming posts:
“Everything is theoretically impossible, until it is done.”
Robert A. Heinlein
Did I choose the wrong job? Busy times still, and for the past 15 years I have focused on PLM, and every year I had the feeling there was progress in the understanding and acceptance of PLM. Although the definition of PLM is a moving target, there are probably thousands of PLM experts around the world. From my blog posts of the past two years, you might share my opinion that PLM is changing from an engineering, document-centric system towards a beyond-PLM approach, where a data-driven, federated platform leads to (yet unknown) benefits.
So where to draw the border of PLM?
Is there a possibility that somewhere a disruptive approach will redefine PLM again? PLM is considered complex (I don't think so). The complexity lies first of all in the rigidity of PLM systems, which fail to excite people. Next comes the desire of implementers to provide services that satisfy users and, as a result, make the solution more complicated. Finally, and most importantly, there is the lack of understanding that implementing PLM requires a business change.
Change (don´t mention the word), which does not happen overnight.
Oleg Shilovitsky wrote about PLM and organizational change. He is leaving it for further discussion if the difficulty is related to the PLM technology or the resistance towards change for people in business. Read his conclusion:
Change is hard. We should re-think the way we implement PLM and exclude process alignment from PLM implementation. Stop changing people and stop forcing people to take complicated decisions during PLM sales process. Future PLM products will become a foundation for agile change management that will be done by companies.
Edward Lopategui is even more provocative in his blog post: The PLM Old Fart Paradox. Have a read of his post, including the comments. Edward somehow shares the same belief, stating PLM has an identity crisis:
PLM has an identity crisis. Talking PLM at a random networking event tends to engender one of two reactions. The first is from anyone who recognizes the acronym, spent 5 years consulting for company X, and begins a vigorous head-nod that instills fear their neck may unhinge in agreement. The other reaction is quite the opposite; you can almost sense a capillary dilation of the so-called blush response. Fluctuation of the pupil… Involuntary dilation of the iris… it’s the Voight-Kampff test for interest expiring at the mere utterance of the acronym. You don’t get this kind of reaction when you talk Cloud or Internet of Things, which while overused, tend to at least solicit questions and interest among the uninitiated. There’s public relations work to be done.
Both Oleg and Edward believe that new technology is needed to overcome the old PLM implementation issues: a need for change, a need to break down the silos.
Meanwhile in Europe
Meanwhile in Europe, an international research foundation for PLM (http://www.plm-irf.org/) has been initiated and is making itself heard towards the United States. What is the mission of this research foundation? To define the future of PLM. Read the opening statement:
The PLM International Research Foundation (PLM-IRF) initiative aims to establish a central mechanism to support global research into the most advanced future capabilities of PLM.
This is the first initiative ever to ask the question:
"What research does the world need, to achieve the future PLM capabilities that the world wants?"
This simple question highlights the fact that the PLM industry needs a coherent view of the future. Without a clear sense of direction, PLM development is likely to fall far short of what it could be.
I consider this a mission impossible. In May this year I will have been blogging about PLM for seven years, and looking back at my early posts, the world was different. Interestingly, some of the predictions I made in the past (PLM in 2050 – predictions done in 2008) are still valid; however, for every right prediction there might be a wrong one too.
And now this International Research Foundation is planning to define what PLM should offer in the future?
What happens if companies do not agree and implement their business approach? It reminded me of a keynote speech given by Thomas Schmidt (Vice President, Head of Operational Excellence and IS – ABB’s Power Products Division) at PLM Innovation 2012 (my review here). Thomas was challenging the audience explaining what ABB needed. Quoting Thomas Schmidt:
“And if you call this PLM, it is OK for me. However, current PLM systems do not satisfy these needs.”
So you can imagine the feeling I got: PLM has an identity crisis.
Or do I have an identity crisis?
I believe we are in a transition state where companies have to redefine their business. I described this change in my earlier post: From Linear to fast and circular. Implementing this approach first of all requires a redefinition of how organizations work. Hierarchical and siloed organizations need to transform towards flat, self-adapting structures in order to become more customer-centric and reactive to ever faster-changing market needs.
For that reason, I was surprised by a presentation shared by Chris Armbruster that same week I read Oleg´s and Edward´s posts. In many ways, Chris and I come from the opposite sides of PLM.
My background: European, with a classical start in engineering and a focus on the mid-market. Chris, according to his SlideShare info: US-based, a Supply Chain Executive with a focus on the Fortune 500.
Have a look at Chris´s presentation – rethinking business for Exponential times. It is amazing that two persons not connected at all can come to the same conclusions.
This should be an indication there is a single version of the truth!
You might say PLM has an identity crisis. We do not need a better definition of PLM to solve this. We need to change our business model and then define what we need. PLM, ERP, SLM, MES, SCM, ….. There are enough unused TLAs for the future. And I am still happy with my job.
…and you? Looking for a new job or changing too?
This is the fifth year that MarketKey organized their vendor-independent conference in Europe around Product Innovation, where PLM is the major cornerstone. Approximately 100 companies attended this conference, coming from various industries. As there were most of the time two to four parallel tracks (program here), it will still take time for me to digest all the content. However, here is a first impression and a comparison to what has changed since the PI Conference in 2014 – you can read my review from that conference here.
First of all, the keynote speeches for this conference were excellent and a good foundation for attendees to discuss and open their minds. Secondly, I felt that this conference was actually dealing with the imminent shift from classic, centralized businesses towards a data-centric approach, connecting information coming from anyone / anything connected. Naturally, the Internet of Everything (IoE) and the Internet of Things (IoT) were part of the discussion, combined with changing business models: moving from delivering products towards offering services (CAPEX versus OPEX).
Some of the highlights here:
The first keynote speaker was Carlo Ratti, Director of the MIT Senseable City Lab. He illustrated through various experiments and examples how, being connected through devices, we can change and improve our world: tagging waste, mobile phone activity in a city and the Copenhagen Wheel. His main conclusion (not a surprise): for innovation there is a need to change collaboration. Instead of staying within company / discipline boundaries, solving problems through collaboration between different disciplines will lead to different thinking. How is your company dealing with innovation?
The second session I attended was given by John Housego from W.L. Gore & Associates, who explained the company's model for continuous growth and innovation. The company's future is not based on management but on the leadership of people working in teams in a flat organization. Every employee is an associate, directly involved and challenged to define the company's future. Have a read about the company's background here on Wikipedia.
Although the company is 50 years old, I realized that their cultural model is a perfect match with the future of many businesses. More and more companies need to be lean and flexible and support direct contact between the field, customers, market and experts inside the company. Implementing a modern PLM platform should be “a piece of cake” if the technology exists, as W.L. Gore’s associates will not block the change if they understand the value. No silos to break down.
My presentation, "The Challenge of PLM Upgrades as We See the Rules of Business Change", was based around two themes (perpetual software? / seamless upgrades?) and from there looked towards the future of what to expect in business. When we look back, we see that every ten years there is a major technology change, which makes the past incompatible to upgrade. Now we are dreaming that cloud-based solutions are the future that guarantees seamless upgrades (let's wait ten years). In my opinion, companies should not consider a PLM upgrade at this moment.
The changes in business models, people behavior and skills plus technology change, will enable companies to move towards a data-centric approach. Companies need to break with the past (a linear, mechanical-design-based, product development approach) and redesign a platform for the future (a business-innovation platform based on the data). In my upcoming blog post(s) I will give more background on this statement.
Trond Zimmerman from Volvo Group Trucks explained the challenges and the solution concept they are currently implementing to answer the challenge of working in a joint venture with Dongfeng Commercial Vehicles. In a joint venture you want to optimize the sharing of common parts, yet you cannot expect a single PLM solution for the total joint venture. For that reason, Volvo Group Trucks is implementing Share-A-Space from Eurostep as a controlled collaboration layer between the two joint venture partners.
This is, in my opinion, one of the examples of future PLM practices, where data will not be stored in a single monolithic system, but connected through information layers and services. The case is similar to what was presented last year at Product Innovation 2014, where Eurostep and Siemens Industrial Turbomachinery implemented a similar layer on top of their PDM environment to enable controlled sharing with their suppliers.
David Rowan from wired.co.uk closed the day with his keynote: Understanding the New Rules of Product Innovation. He touched on somewhat the same topic as John Housego from W.L. Gore: it is all about democratization. Instead of hierarchy we are moving to network-based activities, and this approach has a huge impact on businesses. David's message: prepare for constant change. Where in the past we lived in a "linear" century, with change according to Moore's law, we are now entering an exponential century where change goes faster and faster. Besides examples of the Internet of Things, David also gave some examples of the Internet of Stupid Things. He showed a quote from Steve Ballmer stating that nobody would pay $500 for a phone (the iPhone). The risk of calling inventions stupid is that some of them might succeed, turning the claim into such a quote in the future. I think the challenge is always to stay open-minded without judging, as in the end the market will decide.
PLM and ERP
I spent the evening networking with a lot of people, most of them excited about the future capabilities that had been presented. In parallel, the discussion was also about the conservative behavior of many companies. Topics that have been under discussion for ten years already – how to deal with and connect PLM and ERP, where is the MBOM, what are the roles of PLM and ERP in an organization – are still rewarding topics for a discussion, showing where most companies currently are with their business understanding.
In parallel to a product innovation conference apparently there is still a need to agree on basic PLM concepts from the previous century.
The second day opened with an excellent keynote speech from Dirk Schlesinger from Cisco. He talked about the Internet of Everything and provided examples of the main components of IoE: Connectivity, Sensors, Platform, Analytics, and Mobility. In particular, the Connectivity example demonstrated the future benefits modern PLM platforms can bring. Dirk talked about a project with Dundee Mining where everything in the mine was tagged with RFID devices (people, equipment, vehicles, and resources) and the whole mine was equipped with Wi-Fi.
Based on this approach, the execution and planning of what happened in the mine was done at their HQ through a virtual environment, giving planners immediate visibility of what happens and allowing them to decide on real data. This is exactly the message I have posted in my recent blog posts.
The most fascinating part was the reported results. This project has been running for three years now, and in the first year they achieved a production increase of 30 percent. This year they are aiming for a 400 percent production increase and a 250 percent efficiency increase. These are the numbers to imagine when you implement a digital strategy. It is no longer about making our classical processes more efficient; it is about everyone being connected and everyone collaborating.
Marc Halpern from Gartner gave a good presentation connecting the hype of the Internet of Things with the world of PLM again, talking about Product Innovation Platforms. Marc also touched on the (needed) upcoming change in engineering processes. More and more we will develop complex products, which need systems thinking – Systems of Systems – to handle this complexity. As Marc stated: "Product, process, culture is based on electro-mechanical products where the future trend is all about software." We should reconsider our Bill of Materials (mechanical) and probably think more about a Bill of Features (software). Much of Marc's presentation contained the same elements as I discussed in my PDT2014 blog post from October last year.
I was happy to see Jenni Ala-Mantila presenting the usage of a PLM system at Skanska Oy. Skanska is one of the largest construction companies operating globally. See one of their beautiful corporate videos here. I have always been an advocate of using PLM practices and PLM infrastructure to enhance, in particular, data continuity in a business where people work in silos with separate tools. There are so many benefits to gain from having end-to-end visibility of the project and its related data. Jenni's presentation confirmed this.
By implementing a PLM backbone with a focus on project management, supplier collaboration and risk management, she confirmed that PLM has contributed significantly to their Five Zero vision: Zero loss-making projects, Zero environmental incidents, Zero accidents, Zero ethical breaches and Zero defects. Skanska is really a visionary company, although it was frustrating to learn that there was still a need to build a SharePoint connection to their PLM environment. The data-centric future has not reached everyone in the organization yet.
The last two sessions of the conference, a panel discussion "Why is Process Innovation Challenging & What can be done about it" plus the final keynote "Sourcing Growth where Growth Takes Place", had some commonality, which I expressed in some Twitter quotes:
Where last year I had the impression that the PLM world was somehow in a static mode – not so much news in 2014 – it became clear at this 2015 conference that the change towards new business paradigms is really happening, and at a faster pace than expected. From mechanical development processes to software processes, from linear towards continuous change. More to come this year.
Currently, I am preparing my sessions for the upcoming Product Innovation conference in Düsseldorf. See: www.picongress.com. My first session will be about PLM upgrades and how to deal with them in the future. It is a challenging topic, as some PLM vendors claim that with their product there will be no upgrade problems, and that cloud-based solutions will also provide seamless upgrades in the future.
Don't cheer too early when you see these kinds of messages. I had the chance to look back at what happened with PLM over the past twenty years and tried to look forward to what might happen in the upcoming ten years.
In addition, this led to some interesting thoughts that I will share in detail during the conference. I will come back to this topic in this blog after the conference. Here are some unstructured thoughts that passed my mind recently while preparing this session.
Not every upgrade is the same!
First, there was an interesting blog post from Ed Lopategui from E(E) with the title There is No Upgrade, where he addresses the difference between consumer software and enterprise software. Where consumer software will be used by millions and tested through long alpha and beta cycles, PLM software often comes to the market in what you could consider a beta stage, with limited testing.
Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customer demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework.
In addition, when a solution is not 100 percent complete, the customer will adapt it, making later upgrades not 100 percent guaranteed or compatible. More details on PLM upgrades after the conference; for now, let's look into the near future.
The Future of PLM resides in Brussels!
Some weeks ago I was positively amused by some messages coming from Roger Tempest (PLM Interest Group) related to the future of PLM. Roger claims the PLM industry is effectively rudderless. For that reason, Roger announced the launch meeting for the PLM International Research Foundation,
“simple because such a platform does not yet exist.”
I checked if perhaps an ERP International Research Foundation existed, but I only found references to SAP. So what makes the PLM International Research Foundation unique?
According to Roger, the reason behind this initiative is the lack of clear targets for PLM. I quote:
The lack of detailed thought means that many future possibilities for PLM are just not being considered; and the lack of collective thought means that even the current initiatives to improve PLM remain fragmented and ineffective
As I mentioned in the previous paragraph, PLM vendors are in a kind of rat race to keep up with market demands, rapidly changing business, meanwhile building on their core technology. Not an easy game, as they cannot start from scratch, but for sure, and here I agree, they do not optimize their portfolio.
Who can and will take part in such a research forum?
This is the same for companies implementing PLM systems. They are looking for solutions in the market that improve their businesses. This might be a PLM system, but perhaps other components bring an even higher value. Is ALM or SLM part of PLM, for example? This is a challenge, as who defines what PLM is, and where are its boundaries?
This leaves the activity to the academics; for sure they will have the most advanced and futuristic vision of what is possible conceptually. From my observations, the main challenge currently with PLM is that even the vendors are ten years ahead in their capabilities compared to what most companies are asking for. For the academic approach, I still have to think about Monty Python's sketch related to soccer. See below.
Sorry for the generalization. I believe we should not focus on what PLM is and how PLM should be defined. What we now call PLM is entirely different from what we called PLM ten years ago; see my last year's post PLM is changing. I think the future focus should be on how we are going to deal with business platforms, which contain PLM facets.
The PLM future
Interestingly enough, we are on the brink of a new business paradigm due to globalization and digitization, as you might have read in my recent posts. There are analysts, consultancy firms and research foundations all describing this challenging future.
Have a look at Verdi Ogewell's article at Engineering.com: Product Innovation Platform: Plug'n'play next generation PLM. The article is a summary of the platform discussion during the PDT 2014 conference, which I consider one of the best conferences if you want to go into the details. See also my post: The weekend after PDT 2014.
The future is about innovation and/or business platforms where data is available based on a federated approach, not necessarily based on a single, monolithic PLM platform.
Focusing on standardization and openness of such a platform is for me the central mission we have.
Remember: Openness is a right, not a privilege.
Let PLM vendors and other application providers develop their optimized services for individual business scenarios; that will remove the borders of system thinking. Academic support will be needed to solve the interoperability and openness required for initiatives like Industry 4.0 and IDC's third platform.
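To make the idea of a federated platform more tangible, here is a minimal sketch, assuming invented system names, part numbers and record layouts (none of this reflects a real product's API). Each "system" remains the master of its own data, and a thin layer combines the views by a shared part number instead of copying everything into one monolithic silo:

```python
# Illustrative sketch only: PDM, ERP and SLM stand for separate systems,
# each mastering its own records. The dictionaries below mimic what a
# query against each system might return for a part number.
PDM = {"P-100": {"cad_model": "bracket_v3.sldasm", "revision": "C"}}
ERP = {"P-100": {"supplier": "Acme", "cost": 12.50}}
SLM = {"P-100": {"open_service_issues": 2}}

def federated_view(part_number):
    """Combine the views of several systems without moving the data."""
    view = {"part": part_number}
    for name, system in (("pdm", PDM), ("erp", ERP), ("slm", SLM)):
        record = system.get(part_number)
        if record is not None:  # a system may simply not know this part
            view[name] = record
    return view

print(federated_view("P-100"))
```

The design choice this illustrates: the combined view is assembled on demand, so each discipline keeps ownership of its data and no system has to be replaced to participate.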
I am looking forward to interesting discussions at the upcoming PI conference, but also with peers in my network.
The future is challenging and will it still be named PLM?
A year ago I wrote a blog post questioning if the construction industry would learn from PLM practices in relation to BIM.
In that post, I described several lessons learned from other industries. Topics like:
- Working on a single, shared repository of on-line data (the Digital Mock Up). Continuity of data based on a common data model – not only 3D
- It is a mindset. People need to learn to share data instead of owning it
- Early validation and verification based on a virtual model. Working in the full context
- Planning and anticipation for service and maintenance during the design phase. Design with the whole lifecycle in mind (and being able to verify the design)
The comments to that blog post already demonstrated that the worlds of PLM and BIM are not 100 percent comparable and that there are some serious inhibitors preventing them from coming closer. One year later, let's see where we are:
BIM moving into VDC (or BLM ?)
The first trend that becomes visible is that people in the construction industry start to use more and more the term Virtual Design and Construction (VDC) instead of BIM (Building Information Model or Building Information Management?).
The good news here is that there is less ambiguity with the term VDC than with BIM. Does this mean many BIM managers will change their job title? Probably not, as most construction companies are still in the learning phase of what a digital enterprise means for them.
Still, Virtual Design and Construction focuses a lot on the middle part of the full lifecycle of a construction. VDC does not necessarily connect to the early concept phase, and it almost neglects the operational phase. The last phase is often ignored, as construction companies are not thinking (yet) about Repair & Maintenance contracts (the service economy).
And surprisingly, last week I saw a blog post from Dassault Systemes, where Dassault introduced the word BLM (Building Lifecycle Management). Related to this blog post, some LinkedIn discussions started as well. BLM, according to Dassault Systemes, is the combination of BIM and PLM – read the post here.
The challenge, however, for construction companies is to determine which data sets they require and how to create this continuity of data. This brings us to one of the most important inhibitors.
Where in other industries a clear product data owner exists, the ownership of data in EPC (Engineering, Procurement, Construction) companies, typical for the construction and oil & gas industries, is most of the time vague on purpose.
First of all, the owner of a construction often does not know which data could be relevant to maintain. And secondly, as soon as the owner asks for more detailed information, he will have to pay for it, raising costs that do not flow back as benefits until later, during the FM (Facility Management) / operational stage.
And let's imagine the owner could get all the data required. Next, the owner is at risk, as having the information might make them liable for mistakes and claims.
From discussions with construction owners, I learned that their policy is not to aim for the full dataset related to a construction: it reduces the risk of being liable. Imagine if Boeing and Airbus followed this approach. This brings us to another important inhibitor.
A risk shifting business
The construction industry is still a risk-shifting business, where each party tries to pass the risk of the cost of failure to another stakeholder in the pyramid. The most powerful owners / operators in the construction industry quickly pass the risk down to their contractors and suppliers, and these companies then distribute the risk further down to their subcontractors.
If you do not accept the risk, you are no longer in the game. This is different from other industries and I have seen this approach in a few situations.
For example, I was dealing with an EPC company that wanted to implement PLM. The company expected that the PLM implementer would take a large part of the risk for the implementation, as they themselves always took the risk for their big customers when applying for a project. Here there was a clash of cultures: PLM implementers have learned that the risk of a successful PLM implementation is vague, as many soft values define the success. It is not a machine or platform that simply has to work after some time.
Another example was related to requirements management. Here the EPC company wanted to be clear and specific to their customer. However, their customer reacted very strangely. Instead of being happy that the EPC company invested in more upfront thinking and analysis, the customer got annoyed, as they were not used to being specific so early in the process. They told the EPC company, "if you have so many questions, probably you do not understand the business".
So everyone in the EPC business is pushed to accept higher risk and uncertainty than in other industries. However, the big reward is that you are allowed to have a cost of failure above 15 – 20 percent without feeling bad. With this percentage you would be out of business in other industries. And this brings us to another important inhibitor.
Accepted high cost of failure
As the industry accepts this high cost of failure, companies are not triggered to work differently or to redesign their processes in order to lower the inefficiencies. The UK government mandates BIM Level 2 for their projects starting in 2016 and beyond, to reduce the costs caused by inefficiencies.
But will the UK government invest to facilitate and aim for data ownership? Probably not, as the aim of governments is not to be extremely economical. Not being liable has a bigger value for governments than being more efficient, as I learned. Being more efficient is the message to the outside world to keep the taxpayer satisfied.
It is hard to change this way of thinking. It requires a cultural change through the whole value chain. And cultural change is the “worst” thing that can happen to a company. The biggest inhibitor.
Cultural change is a point that touches all industries, and there is no difference between the construction industry and, for example, a classical discrete manufacturing company. Because of global competition and comparable products, other industries have already been forced to work differently in order to survive (and are still challenged).
The cultural change lies in people. We (the older generation) were educated and brought up with classical engineering models that reflect post-Second World War best practices. Being important in a process is your job justification and job guarantee.
New paradigms, based on a digital world instead of a document-shifting world, need to be defined and matured and will make many classical data processing jobs redundant. Read this interesting article from the Economist: The Onrushing Wave
This is a challenge for every company. The highest need to implement this cultural change lies, ironically, with the countries with the highest legacy: Western Europe and the United States.
As these countries also have the highest labor costs, the impact of keeping on doing the old stuff will reduce their competitiveness. The impact for construction companies is less, as the construction industry is still a local business; in the end, resources will not travel the globe to execute projects.
However, cheaper labor becomes more and more available in every country. If companies want to utilize it, they need to change the process. They need to shift towards more thinking and knowledge in the early lifecycle to avoid the need for highly qualified people in the field to fix errors.
Sharing instead of owning
For me, the major purpose of PLM is to provide an infrastructure for people to share information in such a manner that others, not aware of the information, can still easily find and use it in the relevant context of their activities. The value: people will decide based on actual information and no longer be reactive, fixing errors caused by a lack of understanding of the context.
The problem for the construction industry is that I have not seen any vendor focusing on sharing the big picture. Perhaps the BLM discussion will be a first step. For the major tool providers, like Autodesk and Bentley, their business focus is on the continuity of their tools, not on the continuity of data.
Last week I noticed a cloud-based Issue Management solution delivered by Kubus. Issue Management is one of the typical and easy benefits a PLM infrastructure can deliver, in particular if issues can be linked to projects, construction parts, processes and customers. If this solution becomes successful, the extension might be to add more data elements to the cloud solution. The main question remains: who owns the data? Have a look:
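The value of linking issues to their contexts can be sketched in a few lines. This is a hypothetical illustration, not Kubus's or any PLM vendor's data model; every class and field name below is invented. The point is that one issue record carrying links to a project, a part or a customer can be found again from any of those contexts:

```python
# Hypothetical sketch of context-linked issues; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Issue:
    issue_id: str
    description: str
    links: dict = field(default_factory=dict)  # context type -> identifier

def issues_for(issues, context_type, identifier):
    """Find every issue linked to a given project, part, process or customer."""
    return [i for i in issues if i.links.get(context_type) == identifier]

issues = [
    Issue("I-1", "Leaking joint", {"project": "PRJ-7", "part": "PART-42"}),
    Issue("I-2", "Late delivery", {"project": "PRJ-7", "customer": "CUST-3"}),
]

# The same issue list can be sliced from any linked context:
print([i.issue_id for i in issues_for(issues, "project", "PRJ-7")])
print([i.issue_id for i in issues_for(issues, "part", "PART-42")])
```

The design point: because issues reference their contexts rather than living inside one silo, a project manager, a site engineer and an account manager each see the same issue from their own angle.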
For continuity of data, you need standards and openness – IFC is one of the many standards needed in the full scope of collaboration. Other industries are further developed in their standards, driven by end-user organizations instead of vendors. Companies should argue with their vendors that openness is a right, not a privilege.
A year ago, I was more optimistic about the construction industry adopting PLM practices. What I have learned this year, also based on feedback from others, is that we are not at the turning point yet. Change is difficult to achieve from one day to the next. Meanwhile, the parties in the construction industry's value chain have different objectives. Nobody will take the risk or can afford the risk.
I remain interested to see where the construction industry is heading.
What do you think: will 2015 be the year of a breakthrough?
In the past two years, I have been heavily involved in PLM Proof of Concepts sitting at both sides of the table. Supporting companies in their PLM selection, supporting a vendor explaining their value to the customer and supporting implementers assisting them with industry knowledge, all in the context of a PLM selection process.
The Proof of Concept is crucial in a PLM selection process as it is the moment where the first glimpse of reality comes to the table.
Companies of different sizes and different consultants all have a different view on the importance of the Proof of Concept. Let me share my thoughts after a quick recap of the PLM selection process.
The PLM selection process
1. Build a vision
It is important that a company understands what they want to achieve in the next five to ten years before starting a PLM selection process. Implementing PLM means a business transformation, even if you are a small company. If the management does not understand that a vision is required, there is a potential risk ahead, as PLM without a change in the way people work will not deliver the expected results.
2. Issue an RFI to potential candidates
Once you have a PLM vision, it is time to get in touch with potential suppliers. The RFI (Request for Information) phase is the phase where you can educate yourself better by challenging the suppliers to work with you on the future solutions.
3. Discuss with selected candidates
From the RFI responses you understand which companies are attractive because they match your vision, your budget or your industry. Have a first interaction with the selected companies and let them demo their standard environment, targeted at your vision.
4. Run a Proof of Concept (POC)
In this stage, you check with the preferred companies their ability to deliver and your ability to work together. The POC phase should give you an understanding of the scope of the upcoming PLM project and help you understand by whom and how the project can be executed. More details about this step below.
Although some companies start with an RFP before the POC, for me it makes most sense to verify the details after you have a proper understanding of the To-Be solution. The RFP is often the base for the contractual scope and therefore should be as accurate as possible.
In the past, I wrote in more detail about the PLM selection process in two posts: PLM selection: Don’t do this and PLM selection: Do this. Have a read if you want to understand this part in more depth. Now let’s focus on the POC.
As described before, the target of the Proof of Concept should be to get a better understanding of the potential To-Be processes and to obtain an impression of the capabilities of the implementer and the preferred PLM software. The result should be that you have more realistic expectations of what can be achieved and of the challenges your company will face.
From there, you can evaluate the risks, address them and build an achievable roadmap for implementation. It is important that the focus is not just on the cost of the implementation.
To sell PLM inside your company, you need to realign with the vision and explain, to all people involved, the value of “Why PLM”. Explaining the value is complex, as not everyone needs the same message. Management will focus on business benefits, whereas users will focus on how it impacts their daily life. If you forget to explain the value, the PLM project is considered again as just another software purchase.
Make sure the Proof of Concept is driven by validating future business scenarios, focusing on the To-Be solution. The high-level scenarios should be demonstrated and explained to the business people. In this stage, it is important that people realize the benefits and the value of the new processes.
The POC is also an internal sales event. The goal should be to get more enthusiastic and supportive business people in your company for the upcoming PLM project. Identify the champions you will need to lean on during the implementation.
Test the implementer. In my opinion, the success of a PLM implementation depends critically on the implementation team, not on the software. Therefore, the POC phase is the best moment to learn if you can work with the implementer. Do they know your business? Do they have experience with businesses like yours? The more you are aligned, the higher the chance you will be successful as a team.
Show commitment to engage. Often I have seen POC engagements where the company demanded a free Proof of Concept from the implementer or vendor. This creates an unbalanced situation, as without any commitment from the company the vendor or implementer cannot invest the expected time and resources in the process. By paying a certain fee for the POC, a company demonstrates to the implementer/vendor that the POC is valuable to it, and it can request the same commitment from them.
The Proof of Concept is not a detailed function/feature check to identify each mouse-click or option in the system. During the implementation, these details will come up. It is important in a Proof of Concept to understand the big picture and not to get lost in the details. As human beings, we tend to focus on what does not work, not realizing that probably eighty to ninety percent works according to the needs.
Do not expect the ultimate To-Be scenario to be demonstrated during the Proof of Concept. The Proof of Concept is a learning stage for both the company and the implementer to imagine the best possible scenario. PLM systems are generic, and out of the box they will likely not provide a configuration and functionality matching your environment. At this stage, validate whether the primary capabilities are there and where the gaps are.
Do not run a POC with a vendor (only). This might be one of the most critical points for a POC. A PLM software vendor’s target is to sell their software, and for that reason they often have dedicated presales teams that will show you everything in a smooth manner, overwhelming you with all the beauty of the software. However, after the POC this team is gone, and you will have to align again with the implementation partner, matching your business needs with their understanding.
Realize: you get what you ask for. This is more a Do-and-Don’t message packed together. A Proof of Concept phase is a point where companies get to know each other. If you are not focused, do not expect the implementer/vendor to be committed. A PLM implementation is not a product; it is a business transformation supported by products and services. Do not treat PLM implementers and vendors the same way your customers treat you (in case you deliver products).
There are many more thoughts about the Proof of Concept. Ideally, you run two POCs in parallel, either with two implementers of the preferred software (if possible) or with two different implementers representing different software.
Ideally, because I know it is a challenge, especially for small and medium-sized businesses, where people are busy keeping the business going.
Still, remember: PLM is a business transformation, targeting to improve your business over the upcoming five to ten years and to avoid running out of business.
Your thoughts?
As a bonus, a short anecdote that I posted in 2010 and that is still relevant:
Some time ago a Christian PLM sales professional died (let’s call him Jack), and according to his belief he faced Saint Peter at the gates of Heaven and Hell.
Saint Peter greeted Jack and said: “Jack, with the PLM Sales you have done good and bad things to the world. For that reason, I cannot decide if you should go to Heaven or to Hell. Therefore, I allow you to make the choice yourself”.
Jack replied: “But Saint Peter, how can I make such an important decision for the rest of my eternal life? It is too difficult!”
Saint Peter replied: “No problem Jack, take a look at Heaven and Hell, take your time and then tell me your decision.”
Jack entered Heaven and was surprised by the quietness and green atmosphere there. Angels were singing, people were eating from golden plates with the best food ever, people were reading poetry, and everything was as peaceful as you could imagine. In the distance, he could see God surrounded by some prophets talking about the long-term future. After some time, Jack had seen enough and went to Hell to have a look there.
And when he opened the gates of Hell, he was astonished. Everywhere he looked, there were people partying and having fun. It reminded him of the sales kick-offs he had attended in the past: exotic places with lots of fun. In the distance, he could see the Devil as DJ, playing the latest dance music – or was it DJ Tiësto?
Jack did not hesitate and ran back to Saint Peter, no time to lose. “Saint Peter,” he said, “I want to go to Hell, no doubt. A pity I did not know it before.”
“So be it, ” said Saint Peter “go for it.”
And then, once Jack entered Hell, suddenly there was fire all around him, people were screaming in pain and suffering, and Jack too felt the first flames.
“Devil!” he screamed, “what happened to what I saw before?”
With a sarcastic voice, the devil replied: “That? That was a proof of concept.”
Shaping the PLM platform of the Future
It was the first time I attended this event, and I was positively surprised by the audience and the content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) looked in more detail at the future of PLM. Themes like PLM platforms, the Circular Economy, Open Standards and longevity of data were presented and discussed.
The emergence of the PLM platform
1. The product lifecycle will become more and more circular due to changing business models, and in parallel the different usage/availability of materials will have an impact on how we design and deliver products
Can current processes and tools support today’s complexity? And what about tomorrow? According to a CIMdata survey, there is a clear difference in profit and performance between leaders and followers, and the gap is increasing fast. “Can you afford to be a follower?” is a question companies should ask themselves.
Rethinking the PLM platform does not bring a 2-3 % efficiency benefit; it can bring benefits of 20 % and more.
Peter sees a federated platform as a must for companies to survive. I particularly liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Steven Vetterman from ProSTEP talked about PLM in the automotive industry. He started by describing the change in the automotive industry, quoting Heraclitus: Τα πάντα ρεί – the only constant is change. Steven described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change of the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050, more than 50 % of the world population (an estimated almost 10 billion people by then) will live in Asia and 25 % in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical design. Neither are they open to a true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementations. There is a need for open PLM, not driven from a single PLM system, but based on a federated environment of information.
Yves Baudier spoke, on behalf of the aerospace industry, about the standardization effort of the Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE Systems and more. If you look at the ASD Radar, you get a feeling for the complexity of the standards that exist and are relevant for the Airbus group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. The conclusion from Yves, like that of all the previous speakers, was: The PLM Platform of the Future will be federative, and standards will enable PLM Interoperability
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of the current trends in their business and the role of PLM at Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive costs down and above all improve business agility, as the future is in flexibility. Shefali gave an overview of their PLM roadmap, covering the aspects of PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space). The PLM backbone provides connectivity of data between all lifecycle stages and external partners (customers/suppliers) based on the PLCS standard. Yet another session demonstrating that the future of PLM is in an open and federated environment.
The future PLM platform is a federated platform which adheres to standards, provides openness of interfaces that permits the platform to be reliable over multiple upgrade cycles, and is able to integrate third parties (Peter Bilello)
In the afternoon, I followed the Systems Engineering track. Peter Bilello gave an overview of Model-Based Systems Engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. Indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is confused with the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that systems engineering should be a part of PLM, and he gave a very decent, academic overview of how it all is related. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual, system view and a physical product view:
More Industry voices
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, of every 100 pieces of information/exchange to be managed during the full life cycle, only 5 are managed during the initial design phase (BIM), 20 during the construction phase (BAM) and finally 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes from the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrated with the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Amir Rashid described the need for PLM in the consumer product markets, naming the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the product, and everything is scrapped to landfill afterward. Interesting quote from Amir: Sustainability’s goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of its product development since 1984. The diagram below demonstrates how the circular economy can impact all businesses today when well-orchestrated.
Marc Halpern closed the tracks with his presentation on Product Innovation Platforms, describing how product design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understand and analyze big data), adaptability (flexible to integrate and maintain through an open, service-oriented architecture), reuse (identifying similarity based on metadata and geometry), discovery (the integration of search, analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner’s vision, there is still a lot of work for PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent the implementation of this vision:
- inadequate openness (pushing back open collaboration)
- incomplete standards (blocking implementation of openness)
- uncertain cloud performance (the future is in cloud services)
- a steep learning curve (it is a big mind shift for companies)
- cyber-terrorism (where is your data safe?)
After Marc’s session there was an interesting panel discussion with some of the speakers of the day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: what about change management?
A topic that could fill the rest of the week, but the PDT dinner was waiting – a good place to network and digest the day.
Day 2 started with two interesting topics. The first presentation was a joint presentation by Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones). The topic was the obsolescence of information systems: hardware and PLM applications. In the aerospace industry, some data needs to be available for 75 years, and you can imagine that during 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a Proof of Concept done with virtualization of different platforms supporting CATIA V4/V5, using Linux, Windows XP, W7 and W8, which is just a small part of all the data.
The conclusion from this session was:
To benefit from PLM of the future, the PLM of the past has to be managed. Migration is not the only answer. Look for solutions that exist to mitigate risks and reduce costs of PLM Obsolescence. Usage and compliance to Standards is crucial.
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: Interoperability is a right, not a privilege
In the Systems Engineering track, Kent Freeland talked about nuclear knowledge management and CM in systems engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management is not the same as storing information in a central place; it is about building and providing data in context, so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for systems engineering based on ISO 15926-11. His simplified approach, compared to ISO 15288, led to several discussions between supporters and opponents during lunchtime.
Master Data Management
Based on the types of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements, I also see more and more that the digital transformation leads to MDM questions: can we replace Excel files by mastered data in a database?
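The MDM question above can be made concrete with a minimal sketch. All part numbers, suppliers and attribute values below are hypothetical; the point is only to show the core MDM idea of building one golden record from conflicting departmental lists with a survivorship rule:

```python
# Minimal MDM illustration (hypothetical data): the same part lives in
# several departmental lists (in reality often Excel files) with
# conflicting attributes; a survivorship rule builds one golden record.

from collections import defaultdict

# Hypothetical departmental extracts
engineering = [{"part": "P-100", "description": "Hex bolt M8", "weight_kg": 0.012}]
purchasing  = [{"part": "P-100", "description": "Bolt M8",     "supplier": "ACME"}]
service     = [{"part": "P-100", "weight_kg": 0.013, "supplier": "ACME Corp."}]

# Source priority: which source is trusted most when attributes conflict
priority = ["engineering", "purchasing", "service"]
sources = {"engineering": engineering, "purchasing": purchasing, "service": service}

def build_golden_records(sources, priority):
    """Merge records per part key; the highest-priority source wins per attribute."""
    merged = defaultdict(dict)
    for name in reversed(priority):                # lowest priority first...
        for record in sources[name]:
            merged[record["part"]].update(record)  # ...so higher priority overwrites
    return dict(merged)

golden = build_golden_records(sources, priority)
print(golden["P-100"])
# engineering wins on description/weight; lower-priority sources fill the gaps
```

A real MDM platform adds governance, matching and lifecycle rules on top of this, but the sketch shows why a mastered database beats scattered Excel files: conflicts are resolved once, by an agreed rule, instead of in every report.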
Almost at the end of the day, I spoke about the PLM platform of the future, targeted at the people of the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to own and capture as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning (information) is not crucial for them, perhaps because the world is moving fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in it. PLM is no longer cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately, I could not attend all sessions, as there were several parallel sessions, nor have I written about all sessions I attended. The PDT Europe conference, a conference for people who care about the details of future PLM concepts and the usage of standards, is a must for future strategists.