This is, for the moment, the last post about the difference between files and a data-oriented approach. This time I will focus on the need for open exchange standards and their relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits, and I pointed to background information for those who want to learn more in detail. In my second post, I gave the example of dealing with specifications.
It demonstrated that the real value of a data-centric approach appears the moment information changes over time. For a specification that is right the first time and never changes, there is less value to gain from a data-centric approach. But then, aren't we still dreaming that we do everything right the first time?
The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits apply to diagrams and schematics (2D information) and to CAD models (3D information).
The challenge of a data-oriented approach is that information needs to be stored as data elements in a database, independent of an individual file format. For text, this might be easy to comprehend, as text elements are relatively simple to understand. Still, even the OpenDocument standard for office documents required a lot of technical know-how and experience in the background to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.
CAD vendors have various reasons not to store their information in a neutral format.
- First of all, and most important for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, reducing the potential market capture. You could say that, in a certain manner, Autodesk's 2D format DXF (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF format. So far DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements.
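As a minimal sketch of that idea, assuming a hypothetical schema (not Autodesk's actual format), storing 2D entities as queryable database rows instead of a monolithic file could look like this:

```python
import sqlite3

# Hypothetical schema: each 2D entity becomes a queryable row
# instead of a line inside a monolithic DXF file.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE entity (
    id INTEGER PRIMARY KEY,
    drawing TEXT, layer TEXT, kind TEXT,
    x REAL, y REAL)""")
conn.executemany(
    "INSERT INTO entity (drawing, layer, kind, x, y) VALUES (?, ?, ?, ?, ?)",
    [("pump-house", "walls", "LINE", 0.0, 0.0),
     ("pump-house", "annotations", "TEXT", 2.5, 1.0)])

# Any application can now work on individual elements, e.g. one layer only:
walls = conn.execute(
    "SELECT kind, x, y FROM entity WHERE layer = 'walls'").fetchall()
print(walls)  # [('LINE', 0.0, 0.0)]
```

The point is not the schema itself but that other applications can query and update individual elements without parsing a whole proprietary file.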
- This brings us to the second reason why neutral data formats are not that evident for CAD vendors: they reduce the flexibility to change the format and optimize it for maximum performance. Commercially, the significant and immediate disadvantage of working in neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any "intelligent" manipulations of the data are hard to achieve.
The same reasoning applies to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to define a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo does, which again differs from NX, SolidWorks, Solid Edge and Inventor, even though some of them may use the same CAD kernel.
However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing them. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.
PLM systems, ERP systems and the single source of truth
This brings us into the world of data management; in my world mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is available as metadata, as well as all the scheduling and interaction with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.
PLM systems have gradually become more and more data-centric, as their origin was around engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM and ERP side make it hard to exchange information from one system to the other. That is why there are so many discussions around PLM-ERP integration and the BOM.
In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide, actually an enterprise-wide or industry-wide, data definition of all information that is relevant for the business processes. This leads to Master Data Management, the new required skill for enterprise solution architects.
The data-centric approach creates the impression that you can achieve a single source of truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion this is more of a black-hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without the intention of ever being found again is the big challenge here.
Other PLM and ERP vendors have different approaches. Some choose a service-bus architecture, where applications in the background link and synchronize common data elements from each application. There is some redundancy, but everything is connected. More and more PLM vendors focus on building a platform of connected data elements on top of which applications will run, like the 3DExperience platform from Dassault Systèmes.
As users we are more and more used to platforms, as Google and Apple already provide them in the cloud for common use on our smartphones. The large number of apps runs on shared data elements (contacts, locations, …) and stores additional proprietary data.
Platforms, Networks and standards
And here we enter an interesting area of discussion. I think it is a given that a single-database concept is a utopia. Therefore, it will be all about how systems and platforms communicate with each other to provide, in the end, the right information to the user. These systems and platforms need to be data-centric, as we learned from the discussion around the document-centric (file-centric) versus data-centric approach.
In this domain, several companies have been active for years. Datamation from Dr. Kais Al-Timimi in the UK is such a company. Kais is a veteran in the PLM and data-modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:
“……. the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.
It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.
This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”
Another company in the same domain is Eurostep, which also focuses on business collaboration in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233, and has developed its Share-A-space platform to enable data-centric collaboration.
This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process industry and the construction industry are currently also focusing on discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure it will not come from the software vendors, as I discussed before.
If you have reached this line, it means the topic has been of in-depth interest to you. In the past three posts, starting from the future trend, an example and the data-modeling background, I have tried to describe what is happening in a simplified manner.
If you really want to dive into the PLM of the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15, where experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.
Two weeks ago I attended the Nobletek PLM forum in Belgium, where a group of experts, managers and users discussed topics related to my favorite theme: "Is PLM changing?"
Dick Terleth (ADSE) led a discussion titled "PLM and Configuration Management as a proper profession" or "How can the little man grow?". The context of the discussion was the question: "How is it possible that the benefits of PLM (and Configuration Management) are not understood at C-level?", or in other words: "Why is the value of Configuration Management and PLM not obvious?"
In my previous post, PLM is doomed unless …., I quoted Ed Lopategui (www.eng-eng.com), who commented that being a PLM champion (or a Configuration Management expert, as Dick Terleth would add) is bad for your career. Dick Terleth asked the same question, showing pictures of the self-assured accountant and of the Configuration Management or PLM professional. (Thanks for the pictures, Dick.) Which job would you prefer?
The PLM ROI discussion
A first attempt to understand the difference could be related to the ROI discussion, which seems to apply only to PLM. Apparently ERP and financial management systems are a must for companies; no ROI discussion there. People who can control and report the numbers seem to have the company under control. For the CEO and CFO the value of PLM is often unclear. And to make it worse, PLM vendors and implementers are fighting for their own unique definition of PLM, so we cannot blame companies for being confused. Clearly, if you haven't invested significant time in understanding PLM, it will be hard to see the big picture, and at C-level people do not invest significant time in understanding the topic. It is the C-level executive's education, background or work experience that makes him or her decide.
So if the C-level is not educated on PLM, somebody has to sell the value to them. Oleg Shilovitsky wrote about it recently in his post Why is it hard to sell PLM ROI, and another respected blogger, Joe Barkai, sees the sun come up from behind the clouds in his latest post, PLM Service Providers Ready To Deliver Greater Value. If you follow the posts of independent PLM bloggers (although who is 100% independent?), you will see a common understanding that implementing PLM currently requires a business transformation, as old processes were not designed for a modern infrastructure and digital capabilities.
PLM is about (changing) business processes
Back to the Nobletek PLM forum. Douglas Noordhoorn, the moderator of the forum, challenged the audience by stating that PLM has always been there (or not there, if you haven't discovered it). It is all about managing the product development processes in a secure way; not "best practices" but "good practices." Those who had a proper education in the aerospace industry learned that good processes are crucial to deliver planes that can fly and are reliable.
Of course, the aerospace industry is not the same as other industries. However, more and more other industries in my network, like nuclear new build, the construction industry and other Engineering, Procurement and Construction companies, want to learn from aerospace and automotive good practices. They realize they are losing market share because the cost of failure, combined with relatively high labor costs, makes them too expensive. But where do they get a proper education in good practices?
The PLM professional?
And this was an interesting point coming up from the Nobletek forum: there is no proper, product-agnostic education for PLM (anymore). If you study logistics, you learn a lot about various processes and how they can be optimized for a certain scenario. When you study engineering, there is a lot of focus on engineering disciplines and methods, but there is no time to educate engineers in-depth to understand the whole product development process and how to control it. Sometimes I give a guest lecture to engineering classes; it is never an important part of the education.
To become a PLM professional
For those who never had any education in standard engineering processes, there is Frank Watts' engineering control book, which would probably be a good base. But it is not only the PLM professional who should be aware of these good practices; all companies manufacturing products, plants or buildings should learn these basics. As a side step, it would also make a discussion around BIM clearer. At this time, manufacturing companies are discovering their good practices the hard way, over and over again.
And once this education exists, companies will be aware that it is not only about the tools; it is about the way information flows through the organization. There is even a chance that somewhere at C-level someone has been educated and understands the value. For ERP everyone agrees; for PLM, it remains a labyrinth of processes, currently designed by companies learning on the job, with vendors and implementers pushing what they have learned. Engineering is often considered a hard-to-manage discipline. As a SAP country manager once said to me: "Engineers are actually resources that do not want to be managed, but we will get them ….."
And then the future ……
I support the demand for better education in engineering processes, especially for industries outside aerospace and automotive. I doubt whether it will have a significant impact, although it might create visibility and understanding for PLM at C-level. No need anymore for the lone ranger who fights for PLM. Companies will have better-educated people who understand the need for the good practices that exist. These good practices will be the base for companies when talking with PLM vendors and implementers. Instead of vendors and implementers pushing their vision, you can articulate and follow your own vision.
However, we need a new standard book too. We are currently in the middle of a big change; thanks to modern technology and connectivity the world is changing. I wrote and spoke about it in: Did you notice PLM is changing?
This awareness needs to become visible at C-level.
Who will educate them?
Now back to soccer. Four years ago, Spain-The Netherlands was the last match, the final. Now it is their first match. Will the Dutch change the game?
This past month I had several discussions related to the complexity of PLM. Why is PLM perceived as complex? Why is it hard to sell PLM internally in an organization? Or, to phrase it differently: what makes PLM so difficult for normal human beings, as conceptually it is not so complex?
So what makes it complex? What's behind PLM?
The main concept behind PLM is that people share data, whether around a project, a product or a plant, through the whole lifecycle. In particular during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature. You could decide to wait until everything is mature before sharing it with others (the classical, sequential manner), but the chance of doing it right the first time is low; several iterations between disciplines will be required before the data is approved. The more sequentially a company works, the higher the cost of changes and the longer the time to market. Due to the rigidity of this sequential approach, it becomes difficult to respond rapidly to customer or market demands. Therefore, in theory (and it is not a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on not-yet-approved data.
PLM goes further: it is also about sharing data. As it started originally in the early phases of the lifecycle, the concept of PLM was often considered something related to engineering, and to be fair, most of the (CAD-related) PLM vendors have a high focus on the early stages of the lifecycle and strengthen this idea. However, sharing can go much further, e.g. early involvement of suppliers (still engineering) or support for after-sales/services (the new acronym SLM). In my recent blog posts I discussed the concepts of SLM and the data model required for it.
The complexity lies in the word "sharing". What does sharing mean for an organization where, historically, every person was rewarded for the knowledge he or she owned, instead of for the knowledge this person made available and shared? Many so-called PLM implementations have failed to reach the sharing target because the implementation focused on storing data per discipline, not on storing data so that it becomes shareable and usable by others. This is a huge difference.
Some famous (ERP) vendors claim that if you store everything in their system, you have a "single version of the truth". Sounds attractive. My garbage bin at home is also a place where everything ends up in a single place, but a garbage bin has not been designed for sharing, as another person has no clue about, and no time to analyze, what's inside. Even data in the same system can be hidden from others when the way to find it was never anticipated.
Data sharing instead of document deliverables
The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose but that, with some extra effort, makes the data usable and searchable for others. A typical example is drawing and document management, where the whole process for a person is focused on delivering a specific document. Fine for that purpose, but this document on its own becomes a legacy for the long term, as you need to know (or remember) what's inside it.
A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, providing specific views for different roles in the organization. Sharing becomes possible, and it can be online; nobody needs to consolidate and extract data from documents (Excel?) anymore.
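As a toy illustration (the element names and fields are invented, not any vendor's data model), sharing data elements instead of documents means each role pulls its own view from the same records:

```python
# Toy example: one shared set of data elements, several role-specific views.
elements = [
    {"id": "REQ-001", "type": "requirement", "text": "Max weight 120 kg"},
    {"id": "PRT-100", "type": "part", "mass_kg": 42.0, "supplier": "ACME"},
    {"id": "PRT-101", "type": "part", "mass_kg": 55.5, "supplier": "Globex"},
]

def view(elements, element_type):
    """A 'report' is just a filtered view on the shared data, not a new document."""
    return [e for e in elements if e["type"] == element_type]

# The engineer totals masses, the purchaser lists suppliers - same source data,
# no consolidation step, no document to keep in sync.
total_mass = sum(e["mass_kg"] for e in view(elements, "part"))
print(total_mass)  # 97.5
```

The dashboard or report is regenerated from the elements on demand, which is exactly what a document-based process cannot do without someone re-extracting the data.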
This does not fit older generations and departmentally managed business units that are rewarded only for their individual efficiency. Have a look at this LinkedIn discussion, where the two extremes are visible.
“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”
And on the other side, Kais states:
“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”
Readers of my blog will understand that I am very much aligned with Kais, and that PLM guys have a hard time convincing Joe of the benefits of PLM (I did not try).
Making the change happen
Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas about what would be possible. However, the outside world is hard to convince, and here it is about change management.
I read an interesting article in IndustryWeek by John Dyer with the title: What Motivates Blockers to Resist Change?
John describes the various types of blockers, and reading the article with my PLM-twisted brain, I understood once more why PLM is perceived as complex: you need to change, and there are blockers.
Blocker (noun) – Someone who purposefully opposes any change (improvement) to a process for personal reasons
“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.
The problem with blockers
The combination of business change and the existence of blockers is one of the biggest risks for companies going through a business transformation. By the way, this is not only related to PLM; it applies to any required change in business.
A company I have been working with was eager to study its path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.
A second example is Nokia. Nokia was famous for the way it was able to transform its business in the past. How come they did not see the smartphone and touch screens coming? Based on several articles published recently, it was apparently Nokia's internal culture and the feeling of superiority from dominating the market that made it impossible to switch. The technology was known, the concepts were there; however, the (middle) management was full of blockers.
Two examples where blockers had a huge impact on the company.
Staying in business and remaining competitive is crucial for companies. In particular, the changes currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.
In my previous post, I wrote about the different ways you could look at Service Lifecycle Management (SLM), which, I believe, should be part of the full PLM vision. The fact that this does not happen is probably because companies buy applications to solve issues instead of implementing a consistent company-wide vision (when and where to start is the challenge). Oleg Shilovitsky just referred one more time to this phenomenon: Why PLM is stuck in PDM.
I see PLM as the enterprise information backbone for product information. Here I will discuss the logical flow of data that might be required in a PLM data model to support SLM. Of course, everything should be interpreted in the context of the kind of business your company is in.
This post is probably not the easiest to digest, as it assumes you are somewhat aware of and familiar with the issues relevant to the ETO (Engineering To Order) / EPC (Engineering Procurement Construction) / BTO (Build To Order) business.
A collection of systems or a single device
The first significant differentiation I want to make is between managing an installation and managing a single device; I will focus only on installations.
An installation can be a collection of systems, subsystems, equipment and/or components, typically implemented by companies that deliver end-to-end solutions to their customers. Such a system can be an oil rig, a processing production line (food, packaging, …) or a plant (processing chemicals, nuclear materials), where maintenance and service can be performed on individual components, providing full traceability.
Most of the time a customer specific solution is delivered to a customer, either direct or through installation / construction partners. This is the domain I will focus on.
I will not focus on the other option: a single device (or system) with a unique serial number that needs to be maintained and serviced as a single entity, for example a car or a computer. Usually this is a product for mass consumption, not to be traced individually.
To support SLM at the end of the PLM lifecycle, we will see that a particular data model is required, one that has dependencies on the early design phases.
Let´s go through the lifecycle stages and identify the different data types.
The concept / sales phase
In the concept/sales phase the company needs to have a template structure to collect and process all the information shared and managed during their customer interaction.
In the implementations that I guided, this was often a kind of folder structure grouping information into a system view (what do we need), a delivery view (how and when can we deliver), a services view (who does what) and a contractual view (cost, budget, time constraints). Most of these folders initially had relations to documents. However, the system view was often already based on typical system objects representing the major systems, subsystems and components with metadata.
In the diagram, the colors represent various data types that are often available as standard in a rich PLM data model. Although it can be simplified by going back to the old folder/document approach shared on a server, you will recognize the functional grouping of the information and its related documents, which can be further detailed into individual requirements if needed and affordable. In addition, a first conceptual system structure can already exist, with links to potential solutions (generic EBOMs) that have been developed before. A PLM system provides the ideal infrastructure to store and manage all data in context of each other.
The Design phase
Before the design phase starts, there is an agreement on the solution to be delivered. In that situation, an as-sold system structure will be leading for the project delivery, and later this evolved structure will be the reference structure for the as-maintained and as-serviced environments.
A typical environment at this stage will support a work breakdown structure (WBS), a system breakdown structure (SBS) and a product breakdown structure (PBS). In cases where the location of the systems and subsystems is relevant for the solution, a geographical breakdown structure (GBS) can be used. This last method is often used in shipbuilding (sections / compartments) and plant design (areas / buildings / levels) and is relevant for any company that needs to combine systems and equipment in shared locations.
The benefit of having the system breakdown structure is that it manages the relations between all systems and subsystems. When a subsystem is delivered by a supplier, this environment supports the relationship with the supplier and the tracking of the delivery in relation to the full system / project.
Note: the system breakdown structure typically uses a hierarchical tag numbering system as the primary ID for system elements. In a PLM environment, the system breakdown elements should be data objects, providing the metadata describing the performance of the element, including the mandatory attributes required for exchange with MRO (Maintenance, Repair and Overhaul) systems.
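As a minimal sketch, such a breakdown element could be modeled as a data object like this (the attribute names and tag convention are illustrative, not taken from any standard or system):

```python
from dataclasses import dataclass, field

@dataclass
class BreakdownElement:
    tag: str                 # hierarchical tag number, e.g. "S1.2-M2"
    description: str
    children: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)  # e.g. attributes an MRO system requires

    @property
    def parent_tag(self):
        # The hierarchy is encoded in the tag itself: "S1.2-M2" sits under "S1.2".
        return self.tag.rsplit("-", 1)[0] if "-" in self.tag else None

system = BreakdownElement("S1", "Cooling system")
subsystem = BreakdownElement("S1.2", "Pump skid")
motor = BreakdownElement("S1.2-M2", "Drive motor",
                         metadata={"rated_power_kW": 15})
subsystem.children.append(motor)
system.children.append(subsystem)
print(motor.parent_tag)  # S1.2
```

The key point is that the element is a data object carrying metadata, not just a line in a numbering scheme, so downstream systems can query it.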
Working with a system breakdown structure is common for plant design or an asset maintenance project, and this approach will be very beneficial for companies delivering process lines, infrastructure projects and other solutions that need to be delivered as a collection of systems and equipment.
The delivery phase
During the delivery phase, the system breakdown structure supports the delivery of each component in detail. In the example below you can see the relation between the tag number, the generic part number and the serial number of a component.
The example below demonstrates the situation where two motors (same item, same datasheet) are implemented at two positions in a subsystem, each with a different tag number, a unique serial number and unique test certificates.
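The described situation, two positions using the same catalog item but each carrying its own tag and serial number, can be sketched as follows (all identifiers are invented for illustration):

```python
# Two positions in the subsystem use the same generic item (same datasheet),
# but each position has its own tag number, serial number and certificate.
item = {"part_number": "MTR-15KW", "datasheet": "DS-MTR-15KW.pdf"}

installed = [
    {"tag": "S1.2-M1", "item": item, "serial": "SN-84211", "certificate": "TC-0071"},
    {"tag": "S1.2-M2", "item": item, "serial": "SN-84212", "certificate": "TC-0072"},
]

# Same item definition everywhere, yet every physical occurrence stays traceable:
assert installed[0]["item"] is installed[1]["item"]
serials = [m["serial"] for m in installed]
print(serials)  # ['SN-84211', 'SN-84212']
```

The separation matters: the item answers "what is it?", the tag answers "where does it sit in the system?", and the serial number answers "which physical unit is it?".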
The benefit of a system breakdown structure here is that it supports the delivery of unique information per component that needs to be delivered and verified on-site. Each system element becomes traceable.
The maintenance phase
For the maintenance phase, the system breakdown structure (or a geographical breakdown structure) could be the placeholder to follow the evolution of an installation at a customer site.
Imagine that, in the previous example, the motor with tag number S1.2-M2 appears to be under-dimensioned and needs to be replaced by a more powerful one. The situation after implementing this change would look like the following picture:
Through the relationships with the BOM items (not all are shown in the diagram), there is the possibility to perform a where-used query and identify other customers with a similar motor at that system position. Perhaps a case for preventive maintenance?
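Such a where-used query is conceptually simple once the relations exist as data; a toy version over an invented installed base:

```python
# Which customers have the same item installed at a comparable system position?
installed_base = [
    {"customer": "Customer A", "tag": "S1.2-M2", "part_number": "MTR-15KW"},
    {"customer": "Customer B", "tag": "S1.2-M2", "part_number": "MTR-15KW"},
    {"customer": "Customer C", "tag": "S1.2-M2", "part_number": "MTR-22KW"},
]

def where_used(installed_base, part_number):
    """Return customers still running the given item: preventive-maintenance candidates."""
    return [r["customer"] for r in installed_base if r["part_number"] == part_number]

print(where_used(installed_base, "MTR-15KW"))  # ['Customer A', 'Customer B']
```

In a real PLM system this would be a query over the relations between breakdown elements and BOM items, but the principle is the same: the data model, not a document, answers the question.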
Note: the diagram also demonstrates that the system breakdown structure elements should have their own lifecycle in order to support changes through time (and provide traceability).
From my experience, this is a significant differentiator that PLM systems can bring in relation to an MRO system. MRO and ERP (Enterprise Resource Planning) systems are designed to work with the latest, actual data only. Bringing in versioning of assets and traceability back to the initial design intent is almost impossible for these systems (unless you invest in heavy customization).
In this post and my previous post, I tried to explain the value of having at least a system breakdown structure as part of the overall PLM data model. This structure supports the early concept phase and connects data from the delivery phase to the maintenance phase.
Where my mission in the past eight years was teaching non-classical PLM industries the benefits of PLM technology and best practices, in this situation you might say it is the classical BTO companies that can learn from the best practices of the process and oil & gas industries.
Note: Oleg just published a new blog post, PLM Best Practices and Henry Ford Mass Production System, in which he claims that PLM vendors, service partners and consultants like to sell best practices, yet during implementation discover that massive customization is needed to become customer-specific; therefore, the age of best practices is over.
I agree with that conclusion, as I do not believe in an out-of-the-box approach to lead a business change.
Still, best practices are needed to explain to a company what could be done, so that the discussion does not start from a blank sheet.
Therefore I have been sharing this best practice (for free).
Some weeks ago a vivid discussion started in a PLM LinkedIn group around the need for SLM (Service Lifecycle Management) besides PLM. Of course, the discussion was already simmering in the background in other LinkedIn groups and forums, triggered by PTC's announcement to focus on SLM and their "observation" that they were probably the only PLM vendor to see that need. With one pen stroke, the Internet of Things was connected to SLM. (Someone still using a pen?)
Of course, it is not that simple, and I will try to bring some logic into the thought process, the potential hype and the various approaches you could take related to SLM.
First, SLM as a TLA (Three Letter Acronym). If you Google the meaning of SLM, the most common result is a greeting: on IRC, SLM is short for "salaam", or hello.
In the context of PLM, it is a relatively new acronym, and the discussion on LinkedIn was also about whether we needed a new TLA at all. In general, what we try to achieve with SLM is the ability to trace and follow existing products at customers and to provide advanced or integrated services to them. In a basic manner, this could be providing documentation and service information (spare parts information). In an advanced manner, thinking about the Internet of Things, this could be products that connect to the home base and provide information for preventive maintenance, performance monitoring, enhancements, etc.
The topic is not new for companies around the world that have a "what can we do beyond PDM" vision. Already in 2001 I was involved in discussions with a large Swiss company providing solutions for the food processing industry. They wanted to leverage their internal customer-centric delivery process and extend it to customer support, using a web interface for relevant content: spare parts lists and documentation.
I am sure one or two readers of this blog post will remember "the spindle case" (the only part in the demo concept that had real data behind it at that time).
For many industries and businesses, customer services (and the margin on spare parts) are the main areas where they make a sustainable profit to secure the company's future. Most of the time, the initial sale and/or delivery of their products is done at a relatively low margin due to the competitive sales situation they are in during selling. And of course, the sale itself is surrounded with uncertainty, which vendors have to accept.
If they were to ask for more certainty, it would require more detailed research, which is costly for them or considered a disadvantage by their potential customer. As other competing vendors do not insist on further research, your company might be considered not "skilled" enough to estimate a product properly.
The above paragraph implicitly clarifies that we are mainly talking about companies whose primary process is Engineering to Order or Build to Order. For companies where the product is delivered through a Configure to Order or an off-the-shelf approach, there is no need to work in a similar manner. Buying a computer or a car no longer involves sales engineering. There is a clear understanding of the target price, and of course resellers will still differentiate themselves by providing adjacent services.
So for simplicity I will focus on companies with a BTO or ETO primary business process.
SLM and ETO
In a real Engineering to Order process, traditionally the company that delivers the solution will not really be involved in the follow-up of the lifecycle of the delivered products. The delivered product (small machinery, large machinery or even an installation or plant) is handed over to the customer, and with the commissioning and handover a lot of information is transferred, based on the customer's requirements.
Usually during this handover, much of the intelligence in the information is lost, as the customer does not have the same engineering environment and therefore requires information in "neutral" formats: paper (less and less), PDFs (the majority) and (stripped) CAD data combined with Excel files.
The information battle here is that the ETO-delivery company does not want to provide so much information that the customer becomes fully independent, as the service and spare parts business is the area where they can make their margin. The customer, however, often wants to have ownership of the majority of the data, but there is also the awareness that if they ask for too much, they will pay for it (as an engineering company will consider this extra work). So finding the right balance is the point.
However, the balance is changing, and this is where SLM comes in.
More and more, we see that companies that in the past purchased an Engineering to Order product (or even plant) are changing their business model towards using the product or running the plant, and ask the Engineering to Order company to provide the solution as a service. A kind of operational lease including resources. This means solutions are no longer sold as a collection of products, but as an operational model (40,000 chickens/day, 1 million liters/day, 100,000 tons/year, etc.).
The customer is no longer the owner of the equipment, but pays for the service to perform the business. Very similar to SaaS (Software as a Service) solutions: you do not own the software anymore; you pay for using it, no matter what kind of hardware/software architecture there is behind the offering.
In that case, the Engineering to Order company can provide much more advanced services when they extend their delivery process with capabilities for the operational phase of the product. A more integrated approach eliminates the need for the disruptive handover process: data does not need to be made "stupid" again; it becomes a continuous flow of information.
How this can be done, I will describe in an upcoming, more technical, blog post. This approach brings value to both the Engineering to Order company and the owner/operator of the product / plant.
As it is a continuous flow of information, I would like to conclude this topic by stating that, for Engineering to Order companies, there is no need to think about an extra SLM solution. You could label the last part of the PLM process the SLM domain.
As the customer data is already unique, it is just a normal continuation of the PLM process.
Two closing notes here:
- I have already seen Engineering to Order companies that provide the whole maintenance and service of the delivered product/plant to their customer, integrated in their data environment (so it is happening!).
- Engineering to Order companies are still discovering the advantages of PLM to get a cross-project, cross-discipline understanding and working methodology for their delivery process. Historically they were thinking in isolated projects, where the brain of experienced engineers was the connection between different projects. Now PLM practices are becoming the foundation for sharing and capitalizing on knowledge.
And with this last remark on capitalizing on knowledge, we move from the Engineering to Order industry to Build to Order.
SLM and BTO
In the Build to Order industry, the company that delivers a solution to their customer has tried, in a way, to standardize certain parts of their total solution. These parts can be standardized/configurable machinery or standardized/configurable equipment, or, a level higher, standardized systems and subsystems.
More configurable/modular standardization is what most companies are aiming for. The more you modularize your solution components, the clearer it becomes that there are two different main processes inside the same organization:
- One process, the main process for the company, fulfilling the customer need. In this process it is about combining existing solution components and engineering them together in a customer specific solution. This could be a PLM delivery model like ETO.
- One process to enhance, maintain and develop new solution components, which is a typical R&D process. Here I would state PLM is indisputably needed, to bring new technology and solutions to the main business process.
So within a company, there might be the need for two different PLM solution processes. From my observations over the past 10 years, companies invest in PDM for their R&D process and try to do a little PLM on top of this PDM implementation for their delivery process. This basic PLM process usually focuses again on the core engineering part of the delivery, starting somewhere from the specifications through to the delivery of the solution.
"Full" PLM is therefore very rare to find. The front end of the delivery process, systems engineering, is often considered complex, and often the customer does not want to engage fully in the front-end definition of the solution.
“You are the experts, you know best what we want” is often heard.
Ironically, an analogous situation often puts PLM implementations at risk: here the company expects the PLM implementer to know what they want, without being explicit or understanding what is needed.
To extend the discussion for PLM and SLM, I would like to change the question to a different dimension first:
Do we need two PLM implementations within one company?
One for R&D and one for the delivery process ?
Reasons to say No are:
- Simplicity – it is easier to have one system instead of two systems
- The amount of R&D activity is so low compared to the delivery process that the main PLM system can support it.
Reasons to say Yes are:
- The R&D process is extremely important as is the delivery process
- The R&D process is extremely important and we have a large customer base to serve
Reading these two options, it brings some clarity.
If the R&D process is a significant differentiator and you are aiming to serve many customers, it makes sense to have two PLM implementations.
Still, the two PLM implementations could be based on the same PLM infrastructure, and I would challenge readers of this post to explain why it should be a single instance of a PLM infrastructure.
Why two PLM systems?
- Based on the potentially huge amount of data, I believe a single instance would create a data monster, whereas connected systems (using big data) are the future.
- In other concepts, there is an enterprise PLM and local PDM systems, exactly because there is no single system that can do it all in an efficient manner.
Still, I haven't talked about SLM, which could be part of the delivery process, where you manage customer-specific data. For that (more detail in my next blog post) there are some data model constraints for the PLM system.
I would state you can only use a separate SLM system if you are not interested in data from the early phases of the delivery process. In the early phase, you use conceptual structures to define the product/installation/plant. These conceptual structures are, in my opinion, the connection between the concept phase and the service phase. Usually tag numbers are used to describe the functional usage of a product or system, and they are the ones referenced by service engineers to start a service operation.
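As a minimal sketch of this idea (all names and data are hypothetical), the tag number stays the stable, functional reference over the whole lifecycle, while the physical products installed under it change over time:

```python
# Functional (tag-based) structure from the concept phase:
# the tag number is the stable reference over the whole lifecycle.
tags = {
    "P-4711": {"function": "cooling water pump", "serials": []},
}

def install(tag: str, serial: str) -> None:
    """Link a physical product (serial number) to its functional position."""
    tags[tag]["serials"].append(serial)

def current_serial(tag: str) -> str:
    """The physical product currently fulfilling the function."""
    return tags[tag]["serials"][-1]

install("P-4711", "SN-1001")   # initial delivery
install("P-4711", "SN-2044")   # replacement during a service operation
```

A service engineer always starts from tag "P-4711", regardless of which serial number is currently installed; the structure keeps the history of both serials, connecting the concept phase to the service phase.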
Only when this view or need does not exist can I imagine SLM is needed, where, potentially based on serial numbers, services are tracked and monitored and fed back to the R&D environment. The R&D environment would then publish product data into the SLM system.
You might be confused at this time, as I did not bring the various information structures into this post to clarify the data flow for the delivery process. This I will do in my upcoming post.
Why not CTO and SLM?
I haven't discussed Configure to Order (CTO) here, as I consider CTO a logistical process, which is logically addressed by the ERP system. The definitions of the configurations and their related content will probably be delivered through a PDM/PLM system, so the R&D type of PLM system will exist in the company.
SLM would most logically be performed in this situation by the ERP system, as there is no PLM delivery layer. Having said this, a new religious discussion might come up: is SLM a separate discipline or is it part of the ERP system?
This topic is no discussion for the big ERP vendors – they do it all :) – but it is up to your company whether a Swiss army knife is the right tool to work with in your organization.
For the moment I would like to conclude:
- PLM and SLM –> No (only Yes in isolated cases)
- PLM and PLM –> Yes (as SLM requires the front end of PLM too)
Do we need SLM? Perhaps yes, as a way to describe a functional domain. No, when we are talking about another silo system. I believe the future is in the connectivity of data, and in the long term PLM, ERP and SLM will be functional domains describing how connected data serves particular needs.
Looking forward to your thoughts!
The Product Innovation conference in February has become one of my favorite events, mainly for networking. PLM vendors may try to give you the impression that we are in a fast-moving world; in reality, most companies are moving at a much slower pace than these vendors dream of. For an outsider, last year might have looked similar to what happened this year. In this post, I will describe the subtle differences that I noticed.
The event was in the same location as last year, with approximately 400 participants including 60 speakers. The conference had three main streams: keynotes, PLM and design. The PLM and design sessions were mostly parallel sessions – great if you are interested in one domain only, a little more challenging for people who enjoy both domains. However, the good news is that all participants will have access to the recorded sessions in a week or two. And from last year's experience I can say the recordings are good, so I am looking forward to an additional virtual conference two weeks from now.
Some remarks about the sessions that I was able to attend:
Going to Mars?
Bas Lansdorp told us about the Mars One mission and the drive and challenges behind establishing a permanent human settlement on Mars. It was an inspiring opening session that makes you think out of the box. Several interesting topics came up.
1. First of all, most of the mission's materials need to be basic, proven technology instead of modern, innovative concepts. As maintenance and the risk of issues need to be minimized, it is better to stick with proven technology.
2. The crew selection is a long process – the first crew will fly 10 years from now. So who are the individuals that want to take up the challenge to stay forever with 3 others, with some more people arriving every few years? Hard to escape, and there is no way back. Amazing!
3. Part of the funding can come from media rights. Bas explained that the revenues related to, for example, the Olympic Games are already stunning. Imagine having "Live on Mars" as a reality soap available all around the world. Programs like Big Brother demonstrate that it is in our nature to watch ordinary people and see how they behave. Will they fight? Will they have sex? Public voyeurism and eternal fame.
Although the keynote had no relation to PLM, I felt energized by the entrepreneurial thinking of Bas, following his passion and wanting to realize it. As Mars does not need entrepreneurs in its first centuries, it was clear Bas is not part of the first crew.
Managing complexity and volume
Next, Peter Smith from VF International presented the huge challenge his group of companies has in managing the complexity of their various products and their seasonal deliveries, up to 12 collection models per year. The group, with famous brands like The North Face, Lee, Wrangler, JanSport, Kipling and Timberland, has the challenge of delivering 500 million units/year, which means 16 units/second! For sure an execution engine. So where does PLM fit?
For Peter, PLM is part of the infrastructure, a glue for the innovation process, but not driving the innovation process. They try to standardize on a single PLM system, but some of the brands have such characteristics and history that this was not possible to realize. As the business must go on, a new PLM should not be disruptive to the business.
The two main challenges Peter sees for current PLM are:
- The software models available to them as consumers – changes here go too slowly.
- Organizational change implications. How to change when change is hard?
It was clear from Peter's experience that many of his points were made from the IT perspective. During the networking break, when I spoke with others, some of them mentioned that the business value of PLM was missing in Peter's analysis – too much tool/infrastructure.
The digital value chain
An interesting session from Michael Bitzer (Accenture) and Sebastien Handschuh (Daimler). After an introduction to the German initiative Digital Industry 4.0, the remaining part of the session was about Daimler's approach to use JT as a neutral, application-independent format for their 3D data. At this time, Daimler already has over 6 million JT files, and the format has proven to fulfill their process needs.
Where possible, Daimler aims to collaborate with suppliers in the JT format for 3D. In this manner, their suppliers are not forced to use exclusively CATIA or NX. In answer to a question from the audience about whether Daimler was supporting the Siemens-flavored JT or the truly neutral JT format, it was clear that Daimler was aiming for the neutral format. I believe this is an interesting move towards a more generic data approach, in this case for 3D CAD data instead of original file formats. Hopefully more standardization will follow.
PLM selection: Do's and Don'ts
I was moderating a discussion session for companies that were in the process of selecting a PLM system or wanted to share their experience. Unfortunately, the session was overpopulated, with many people not actually involved in a selection process. Due to the large audience, there was not really an opportunity to have an in-depth discussion. Still, it was amazing to see that there are still companies where the value of PLM is not clear at the management level, and the focus is therefore on quick ROI.
In a one-to-one discussion afterwards, I learned about a company where the shareholders/investors forced the PLM project to fail by pushing unrealistic deadlines and not understanding the human and business change required: unrealistic ROI expectations and a lack of understanding of where PLM really brings a competitive advantage. Worst case, due to their short-term focus, the company will slowly go out of business as competitiveness and margins reduce. For this type of situation, there is the excellent Dilbert cartoon below.
Secure data sharing in the extended enterprise
An interesting session was organized by Håkan Kårdén (Eurostep) and Kristofer Thoresson (Siemens Industrial Turbomachinery). Siemens had chosen to use the Eurostep Share-A-space environment between their internal data (their PDM system and other data sources) and the external data from suppliers, customers and field services. A pragmatic concept, and interesting to see Share-A-space Found-Its-place. PLM vendors would probably claim that their system could organize this secure and remote access without the need for a system in between. But the fact that a Siemens company decides to use Share-A-space demonstrates there is still a gap between a potentially safe, single PLM-based implementation and a pragmatic separation approach.
PLM is changing
In my session that afternoon, I focused on the visible change in PLM: from an IT infrastructure for file collaboration towards a more data-centric, business-driven approach. From there, looking into the future, I anticipate that moving towards a data-centric approach is crucial to be ready for advanced computing power and brain-matching algorithms. These, I believe, will be the game changers in the upcoming decade, in line with the Industry 4.0 ideas. My past two posts have been indicating this direction:
A Circular economy
Peter Bilello from CIMdata gave a good presentation related to the change in business we see and must make. No longer can we afford an economy where we waste raw materials. The circular economy is about supporting the product lifecycle from cradle-to-cradle instead of the classical cradle-to-grave. This matches the trend that companies will more and more deliver services to their customers instead of selling products to them. Instead of buying a fridge, you pay for cooling capacity, and your supplier replaces the current model with a new model after three years. The service or experience economy fits very nicely with the new generations that seem to prefer living and sharing in the moment over owning property.
Your digital shadow
The closing keynote from Stephanie Hankey was like the starting keynote: no relation to PLM, but interesting in the context of the effects of digitalization and mobility. She provided some insights into the data that is already collected about each individual (or device) and how this can all be combined into profiles – your digital shadow. And of course, your shadow might give the wrong impression. You can imagine that with the growing trend of smart devices and the Internet of Things, it will be hard to stay out of it. Companies will sell and buy data sets about their potential customers (victims). Scary, as it all happens in the background and you are not fully aware of it.
(At the point I was writing this paragraph, my computer crashed with a blue screen – coincidence?)
Cultured beef?
After a good burger and discussion in the evening, the opening keynote on day two was from Mark Post, with the title Cultured Beef – changing the way we eat and think about food forever. Another interesting keynote, in which Mark explained how we can feed the growing world population in a more sustainable way by creating animal products through cell culture and biofabrication instead of farming. The process is still in the early days of discovery, but by using cell culture you can assure you get the right meat, even without fat, and it is real meat. Currently still expensive: Mark estimates that with current technology and upscaling of the process, a price of $65 per kilo can be reached. Too expensive for consumers at this time, but a promising number for the future. Another (Dutch) keynote speaker that made us think differently for the rest of the day.
Next, Bjarne Nørgaard from MAN Diesel & Turbo gave the audience a good lecture on what it takes to design and build a ship: you build the engine and wrap the ship around it. The challenge for MAN is to follow, service and maintain the engine through its 30-year lifecycle and possibly longer. Next, Bjarne went into the details of their information architecture, and it was surprising to learn that their PDM system was from Siemens and that they used Aras on top of that for connecting data to the rest of the enterprise and the lifecycle of the engine. You would assume two PLM systems in-house for one company is overkill. Bjarne explained that they initially tried to achieve these goals with Teamcenter but failed due to lack of flexibility. Great marketing for Aras, bad for Siemens, although I am sure the cultural aspect has played a role. No one likes their first PLM or ERP system, as the first implementation in this domain is the moment you have the biggest internal culture shock.
Using search and semantic technology
The presentation from Moises Martines-Ablanado (Configuration Management, Airbus Group) and Thomas Kamps (Conweaver) was interesting, as they demonstrated one of the upcoming concepts I foresee will have a great future. Conweaver connects to existing enterprise systems (PLM, ERP, CRM and legacy) and creates a semantic mapping and linking of the data indexed from these systems. Through this network of data, it provides apps with a particular purpose: for example, directly identifying changes in the current EBOM and MBOM and potentially from there updating the MBOM based on EBOM changes. A concept I have seen with Exalead too, illustrating that once you are in a data-centric environment, combining data sources for particular purposes can be achieved quickly. No need for the classical approach of a single database that stores everything.
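A highly simplified sketch of the linking idea (illustrative records only, not Conweaver's actual technology): once data from two systems is indexed, a shared key is enough to report EBOM revisions that are not yet reflected in the MBOM:

```python
# Records indexed from two separate enterprise systems (illustrative data)
ebom = {"100-A": {"rev": "B"}, "100-B": {"rev": "A"}}  # engineering BOM (PLM)
mbom = {"100-A": {"rev": "A"}, "100-B": {"rev": "A"}}  # manufacturing BOM (ERP)

def out_of_sync(ebom: dict, mbom: dict) -> list:
    """Part numbers whose EBOM revision differs from the MBOM revision."""
    return [part for part in ebom
            if part in mbom and ebom[part]["rev"] != mbom[part]["rev"]]

changed = out_of_sync(ebom, mbom)  # → ["100-A"]
```

The point is not the trivial comparison itself, but that neither system had to be replaced or merged: the link layer sits on top of the indexed data, which is what makes this kind of app quick to build in a data-centric environment.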
A new TLA? CLM
Joy Batchelor gave a clear presentation on why, besides PLM and ERP, Jaguar Land Rover (JLR) needs a third system supporting the connectivity of product configurations and sales configurations. They are able to manage 58,000,000,000 combinations for 170 different markets, which means every person on this planet could have their own unique Jaguar Land Rover. Joy introduced CLM (Configuration Lifecycle Management) as the third domain needed to support these configurations. The system they are using is ConfigIT, and I assume all automotive vendors have their own toolsets to manage the product and marketing configurations. I hope to learn more in that area. Will CLM be a separate domain, or will it be absorbed by PLM or ERP vendors in the future? Time will tell.
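The size of such a configuration space is simply the product of the number of choices per option group. With purely illustrative numbers (not JLR's actual option list), a handful of grouped options plus some independent yes/no options already reaches tens of billions of combinations:

```python
from math import prod

# Illustrative option groups with the number of choices in each (hypothetical)
grouped_options = {"engine": 10, "body style": 5, "exterior color": 25,
                   "interior trim": 20, "wheels": 12}
yes_no_options = 18  # independent toggles such as tow bar, sunroof, ...

# Total = product of the grouped choices times 2^(number of yes/no options)
combinations = prod(grouped_options.values()) * 2 ** yes_no_options
print(f"{combinations:,} possible configurations")  # 78,643,200,000
```

This multiplicative explosion is exactly why a dedicated configuration engine is needed: neither a BOM in PLM nor an item master in ERP can enumerate such a space; it has to be managed as rules over options.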
A game changer?
Henk Jan Pels from the Eindhoven University of Technology took us back in time and explained how ERP became visible on the CFO's agenda, eliminating the discussion on ROI. Where ERP handles material flows, developing and delivering products also requires knowledge flows between the requirements, the functional definition and the physical definition of a product. Expanding these flows to a framework that covers the technology, the building blocks, the families and the individual products would be the ideal interaction Henk Jan is proposing, and a PLM system would be the environment to implement this concept. Henk Jan announced this as a game changer, and I agree: if company management spends time to understand the benefits, it will be a game changer. Somehow it remained an academic concept, and I believe we are all eager to learn whether companies will adopt this idea, knowing that changing to something that is not common or traditional is a cultural risk.
The German future?
The final presentation I could attend was from Martin Eigner, who first explained in some detail what the Industry 4.0 approach is about. From there, he took us into the world of model-based systems engineering – you could say an integration of PLM with more virtual system modeling and analysis as the front end of the development process. Somehow similar to last year's presentation, but understandable, as the world of PLM does not evolve that fast.
This is also my conclusion from this year's event. I was hoping to see some new sparks. For sure, the keynotes were inspiring, although less related to PLM. The case from Airbus and Conweaver was inspiring, as I believe search- and semantics-based applications are a logical extension for the challenges companies want to address with PLM. JLR's presentation explaining the need for Configuration Lifecycle Management strengthened my thought that in the future PLM and ERP as we know them will disappear: it is about a business platform with combined services, which might fall into one of the classical categories. I believe the German Industry 4.0 initiative should be studied and replicated by many, as it acknowledges exactly the future trend to remain competitive.
It was a pity for the public that Siemens PLM, Dassault Systèmes and Autodesk were not there. As the two largest PLM vendors and one of the largest PLM challengers, you would expect them to be there and allow prospects and PLM consultants to compare where each of the PLM companies differs. Still, it was a good conference: well organized and, as mentioned in the introduction, all presentations are recorded, giving everyone the opportunity to digest and review content again.
I am looking forward to the next Product Innovation conference, with perhaps some more PLM-related keynotes and big data practices.
I will be attending the annual Product Innovation conference in Berlin again next week. I am looking forward to this event, as it is one of the places where you have the chance to network and listen to presentations from people that are PLM-minded. A kind of relaxation, as, strangely enough, most of the companies I am visiting still consider PLM as something difficult, something related to engineering, not so much connected to the future of their business.
I believe one of the reasons is that people have founded their opinion on the past: an expensive implementation horror story, an engineering-focused implementation or other stories that have framed PLM in a certain manner.
However, PLM has changed and its significance has grown!
During the Product Innovation conference, I will present this topic related to the change of PLM in more depth, with more examples and a surprising projection into the future. Later, when time permits, I will share the more in-depth observations on my blog, hopefully extended based on discussions during the conference. And if you attend the conference, don't miss my session.
The term PLM (Product Lifecycle Management) was introduced as a logical extension of cPDM (collaborative Product Data Management). Where the initial focus was on global file sharing of mechanical CAD data, PLM extended the scope with multidisciplinary support, connecting manufacturing preparation and providing an infrastructure for change management.
In the nineties product data management was in transition.
In the early 90s, UNIX dominated, and installing a PDM system was the work of IT experts. Large enterprises, already operating globally, were pushing for standardization and control of data to connect their engineers in a more efficient manner. Connectivity was achieved through expensive leased lines; people like me had to connect to the internet through dial-up modems, and its usage was limited, providing static web pages with minimal graphics.
It was obvious that cPDM and the first PLM projects were extremely expensive. There was no experience; it was learning on the job. The costs were high and visible at the management level, giving management the impression that PLM was potentially the same challenge as ERP, but with a less clear scope. And the projects were executed by IT experts; end users were not really in the game.
At the end of the 90s, a small revolution started to take place. The power of the PC combined with Microsoft technology provided a much cheaper and more flexible alternative to a complex UNIX-based implementation.
Affordable 3D CAD emerged in the mid-market, leading to the need for Windows-based PDM systems and with Windows came Excel, the PDM/PLM killer application.
A person with some rudimentary Visual Basic skills could do magic with Excel and, although not an IT expert, would become the champion of the engineering department.
At that time, PLM conferences provided a platform on which industry could discuss and share tips and tricks on how best to implement a system. The focus was mainly on the IT side and on large enterprises. The scope was engineering-centric: connecting the various disciplines, including mechanical, electrical and simulation, in a database and connecting files and versions.
Most large enterprises had already started to implement a PLM system. The term PLM became an accepted acronym, associated with something that is needed for big companies and is complex and expensive – a logical conclusion based on the experiences of early adopters.
PLM was the infrastructure that could connect product information between disciplines and departments working from different locations. The NPI (New Product Introduction) process became a topic pushed by all enterprise PLM vendors and was a practice that demonstrated the value of providing visibility of information across a large, dispersed company, for better decision-making.
As this process was data-centric instead of CAD-centric, these capabilities promoted the recognition and introduction of PLM in non-traditional manufacturing industries like Consumer Packaged Goods, Pharmaceuticals and Apparel, where planning and coordination of information lead, instead of a Bill of Material.
In large enterprises, PLM still lay with the IT-architects as they were the ones deciding the standards and software to be used. PLM and ERP connectivity was an expensive topic.
For the mid-market, many PLM vendors were working on offers to standardize a PLM implementation; this usually involved a stripped-down or limited version of the full PLM system, a preconfigured system with templates, or something connected to SharePoint. Connectivity was much easier than 15 years ago, thanks to a better internet infrastructure and the deployment of VPNs.
For me, at that time, selling PLM to the mid-market was challenging: how do you explain the value and minimize the risk while current business is still running well? What was so wrong with the existing practices based on Excel? In summary, with good margins and a growing business, wasn't everything under control without the need for PLM? This was the time I started to share my experiences in my blog: A Virtual Dutchman's introduction
Mid-market PLM projects focused on departmental needs, with IT providing implementation support and guidance. As the number of IT-staff is usually limited in these companies and often organized around ERP and what they learned from its implementation, it was hard to find business experts for PLM in the implementation teams.
The financial crisis had started, and globalization had become real through worldwide connectivity – better infrastructure and Web 2.0. The world became an open space for consumers and competitors; the traditional offshore countries became consumers themselves and began to invest in developing products and services for their domestic markets, but also targeted the rest of the world. Large enterprises were still expanding their huge PLM implementations, though some were challenged because of a change of ownership. Capital investors no longer came only from the US or Europe but also from the BRIC (Brazil, Russia, India, China) countries, forcing some established companies to restructure and refocus.
In response to the crisis, mid-market companies started to reduce costs and focus on efficiency. Lots of discussions related to PLM began as it appeared to be THE strategy needed to survive, though a significant proportion of the investment in PLM was cancelled or postponed by management due to uncertainty and impact on the organization.
PLM conferences showed that almost all big enterprises and mid-market companies were still using PLM to connect departments, without fundamentally integrating them into one complete PLM concept. It is easier to streamline the sequential process (thinking lean) than to make it a concurrent process with a focus on market needs. PLM conferences were being attended by a greater mix of IT and business representatives from different industries, learning from each other.
Everyone in the world is connected, and consequently the amount of data is piling up. It is now more about data than about managing documents. The introduction of smart devices has had an impact on how people want to work; instead of sharing files and documents, we have started sharing and producing huge amounts of data. In addition, the upcoming "Internet of Things" demonstrates that we are moving to a world where connectivity through data becomes crucial.
Sharing data is the ideal strategy for modern PLM. PLM vendors and other leading companies in enterprise software are discovering that the classical method of storing all information in one database no longer works and will not work in the future.
In the future, a new generation of PLM systems will come, either as an evolution of existing systems or as a disruption of the current market. No longer will the target be to store all information in one system; the goal will be to connect and interpret data and make the right decisions based on it. This is what the new generation of workers is used to, and they will replace the (my) older generation in the upcoming decade.
Combined with more and more cloud-based solutions and platforms, the role of IT will diminish, and the importance of business people driving PLM will become ever more crucial.
PLM has become a business-driven strategy and requires people who are strong enough to develop, justify and implement this approach in their companies. New champions are needed!
The value of communities, blogs and conferences
Communities, blogs and conferences bring together global brainpower in social environments. Complemented with presentations, opinions and discussions from all different industries and domains, they form the ideal environment to grow new ideas. Here you can associate the information, question its relevance for your business and network with others – the perfect base for innovating and securing your future business.
Therefore, do not use communities or conferences to stick to your opinion but be open and learn.
One of my favorite quotes
This year I had several discussions with people working for construction companies. They shared their BIM dreams, and I tried to explain to them the PLM benefits and basics, as the two are much alike. The challenge in these discussions was that each of us came from a completely different background. The word PLM does not resonate well outside product-centric companies. In project-centric companies, people tend to focus more on the tools they are using than on the overall business process. Construction companies and EPC companies in Oil & Gas have always had a project-centric approach, and for them every project is unique.
Ten years ago
In 2004, AECbytes.com published the chart below, demonstrating that the construction industry is lagging behind other industries in productivity.
You find a link to the full article here.
Now it is BIM
It is an old graph, and I haven't seen a more recent one. However, I guess the trend has not changed significantly. What has changed is that construction companies are now talking about BIM. BIM stands for Building Information Modeling, a term which has a history with Autodesk. Read the wiki about BIM. There are many interpretations of BIM. One of the formal definitions is:
Building Information Modeling (BIM) is a digital representation of physical and functional characteristics of a facility. A BIM is a shared knowledge resource for information about a facility forming a reliable basis for decisions during its life-cycle; defined as existing from earliest conception to demolition.
This is a high-level definition, and BIM is characterized as a shared knowledge resource. Is it a 3D digital model? Is it a kind of DMU (Digital Mock-Up)? Is it a Building Lifecycle environment? The word "life-cycle" is in the definition.
I noticed many vendors and consultants in this industry talk about what BIM is. It is rare to find quantified values for implementing BIM; you find exactly the same values as PLM brings to manufacturing companies: better decisions, managing complex constructions and projects, early decisions that save costs later, etc.
Governments have been pushing BIM to the construction industry (both for the civil and building industry) as they believe this is a way to improve quality and better manage time and costs. And as they are usually the big spenders, the leading construction firms have to adapt to these standards to get these contracts.
Would any construction company begin with BIM without being pushed?
In product-centric companies, the global competition and the consumer are driving the need for PLM. Margins are under pressure, and they need to be competitive to stay in business. The construction industry is not (yet) that much driven by global influence and the choice of consumers.
The chart below illustrates the BIM ambition in the UK. At this time, companies are entering Level 2, and they struggle to understand what it means for them to be at BIM Level 2. I am sure other countries have their own, similar roadmaps.
The diagram illustrates the same path which other industries have been going through in the past twenty years.
BIM Levels and PDM / PLM
BIM Level 0 is focused on managing CAD. In other industries, this was the time when single disciplines managed their own CAD data; there was no sharing at that time.
Level 1 focuses on managing 2D and 3D CAD together, much like what other industries do with a PDM system. The PDM system manages the 2D and 3D data in one environment. This is still a departmental solution, but it can provide information from different disciplines in one environment. Here you find all the suppliers of 3D CAD systems offering their own PDM solution, not focusing on a core 3D model.
Level 2 is about sharing 3D BIM models between different disciplines to support 4D (construction planning based on 3D) and 5D (construction planning with 3D planning and costing integrated). This is what other industries, primarily automotive and aerospace, considered the early days of DMU (Digital Mock-Up) and PLM. Dassault Systemes and Siemens are leading here, and historically CATIA has been the base for the 3D model.
BIM Level 3 is what can currently be found in the asset-centric industries (Energy, Nuclear, Oil & Gas), where, working from a virtual plant model, all disciplines are connected throughout the whole lifecycle. This is the domain I have been advocating in previous posts, promoting PLM concepts and capabilities.
For example read: PLM for Asset Lifecycle Management.
Apparently the construction industry is still in the early phases of BIM Level 3. I would compare it to teenage sex; they all talk about it, but nobody does it. Or Hollywood BIM as Antonio Ruivo Meireles calls it in his AECbytes article: “Say “NO!” to Hollywood BIM”.
Antonio talks about the BIM implementation at Mota-Engil, briefly touching on a common topic for PLM implementations: "People and Cultural Change". However, most of the implementation report focused on tools, where even Excel and Visual Basic play a role.
Tools or Platform?
And this is the point where construction companies could learn from other industries. They have discovered (or are still discovering) that Excel and Visual Basic are like soft drugs. They take away the pain, but they do not provide the solution in the long term. Instead of that, legacy Excels start piling up in directories, and the Visual Basic code becomes the domain of an enthusiastic expert (till this expert moves to another company or retires). The risk is ending up with a legacy environment so hard to change that a costly revolution is needed at a certain moment.
Construction companies are still investing in selecting a set of tools/applications, each with its own proprietary data and format. And they use customizations or standardized information carriers, like the COBie spreadsheets, to exchange information between partners and disciplines. This is already a giant step forward, as COBie forces companies to focus on mandatory, standard content required at specific stages of the lifecycle, instead of searching for it when it is actually needed.
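As a rough illustration of what a standardized information carrier buys you, the sketch below checks whether rows of a COBie-style Component sheet carry the mandatory content before handover. This is a simplified assumption of mine, not the official mechanism: the field names approximate the COBie Component worksheet, the real specification defines the authoritative columns, and real exchanges happen through spreadsheet files rather than in-memory rows.

```python
# Mandatory fields assumed here for a COBie-style "Component" row.
# The actual COBie specification is the authoritative source.
REQUIRED_COMPONENT_FIELDS = ["Name", "CreatedBy", "CreatedOn", "TypeName", "Space"]

def missing_fields(row: dict) -> list:
    """Return the mandatory fields that are absent or empty in one row."""
    return [f for f in REQUIRED_COMPONENT_FIELDS if not str(row.get(f, "")).strip()]

def validate_components(rows: list) -> dict:
    """Map each offending component (by Name, or row index) to its missing fields."""
    issues = {}
    for i, row in enumerate(rows):
        missing = missing_fields(row)
        if missing:
            issues[row.get("Name") or f"row {i}"] = missing
    return issues

# Two example rows: the first is complete, the second lacks a TypeName.
components = [
    {"Name": "AHU-01", "CreatedBy": "j.doe@example.com",
     "CreatedOn": "2013-11-02", "TypeName": "AirHandler", "Space": "Plant-Room"},
    {"Name": "Pump-07", "CreatedBy": "j.doe@example.com",
     "CreatedOn": "2013-11-02", "TypeName": "", "Space": "Basement"},
]

print(validate_components(components))  # → {'Pump-07': ['TypeName']}
```

The point is not the code but the discipline it encodes: the exchange format, not each receiving party, decides which content is mandatory at which stage.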
Somehow the COBie approach is similar to the early days of PLM, when companies forced their disciplines to save information in the PLM system (as it became imperative). In these departments and disciplines, the work and interaction did not change much compared to before they had the PLM system. The cultural change was that designers and engineers had to enter more data upfront for higher quality downstream.
An intermediate conclusion might be that construction companies are following the same direction as early PLM: standardizing the data (model) to have a common understanding between stakeholders. Construction companies might not want to implement a PLM system, as ownership of the data is unclear: compared to manufacturing companies, every discipline or department in a PLM context might be a separate company in the construction industry.
Now let’s look into the future
The movie below from Airbus describes their current way of working in a multidisciplinary, multi-partner, multi-location online system. Airbus calls it their DMU. Please watch this movie before reading on, as the concept is crucial.
I want to highlight two statements in this movie.
Russ Brigham @ 5:39, talking about suppliers not participating in the DMU:
“They will be making decisions on out of date data or even incorrect data”
And @ 7:11
“DMU is a mind-set …….”
Although I am aware that the aerospace industry is not directly comparable to the construction industry, there are commonalities from which the construction industry can learn:
- Working on a single, shared repository of on-line data (the DMU) – a common data model, not only 3D
- It is a mind-set – people need to share data instead of owning it
- Early validation and verification based on a virtual model – working in the full context
- Planning and anticipating service and maintenance during the design phase – designing with the whole lifecycle in mind (and being able to verify the design)
Data ownership?
For the construction industry, the current difficulty might be that none of the parties involved wants to invest in owning the data. For Airbus, it is clear. As the manufacturer of the airplane, they remain responsible for the information throughout the whole lifecycle.
For a construction project, this might be different. The owner might be totally disconnected from the construction and the operations and, therefore, not willing to promote or invest in the DMU approach.
However, the owner should realize that it is not about ownership but about facilitating on-line collaboration around a construction, from the initial concept phase till maintenance and even decommissioning, connecting all the stakeholders. The benefits: better decisions at each stage of the lifecycle, leading to lower failure costs and less waste of materials, resources and time. The construction industry still accepts failure rates that are too high compared to the manufacturing industry. And as, in the end, the owner/operator bears most of these costs, they should be interested in this approach.
Major construction companies responsible for project execution and control might want to invest in a PLM platform, allowing them to execute projects better, learn from other connected projects and create a solid base for maintenance contracts.
My dream and wish for 2014 for the construction industry: Focus on the next step of integrating data on a PLM backbone instead of standardizing interfaces between applications. It is the future mind-set proven in other industries.
I wish you all a happy, healthy and successful 2014 full of change for the best
May BIM, BAM, BOOM become true
Over the last month, I haven't been able to publish much about my experiences, as I have been in the middle of several PLM selection processes in various industries. Now, looking back in a quiet moment, I understand why it is difficult for a company to choose a PLM solution for the future.
I hope this post will bring some clarity and may lead to further discussion with other experts in the audience. I wrote about the do's and don'ts of PLM selection in 2010, and most of it is still valid; however, there is more. Some of the topics explained:
Do you really need PLM?
This is where it starts. PLM is not Haarlemmerolie, an old Dutch medicine sold as a cure for everything since the 17th century. The first step is that you need to know what you want to achieve and how you aim to achieve it. Just because a competitor has a PLM system installed does not mean they use it properly, or that your company should do the same. If you do not know why your company needs PLM, stop reading and start investigating.
If you are still reading this, you are part of the happy few, as justifying the need for PLM is not easy. Numerous companies have purchased a PLM system just because they thought they needed PLM, or because someone was convinced that this software would bring PLM.
In most of these cases, there was confusion with PDM. Simply stated: PDM is more of a departmental tool (engineering, multidisciplinary), whereas PLM is a mix of software and infrastructure to connect all departments in a company and support the product through its entire lifecycle.
Implementing “real” PLM is a business change, as people have to start sharing data instead of pushing documents from department to department. And this business transformation is a journey. It is not a fun journey, nicely characterized in Ed Lopategui’s blog post, the PLM Trail.
Although I believe it is not always that dramatic, Ed set the expectations right. Be well prepared before you start.
Why do companies still want PLM, while it is so difficult to implement?
The main reason is to remain competitive. If margins are under pressure, you can try to be more efficient, get better and faster tools. But by working in the old way, you can only be a little better.
Moving from a sequential, information-pushing approach towards an on-line, global information-sharing manner is a change in business processes. It changes the interaction between all stakeholders. Doing things differently requires courage, understanding and trust that you made the right choice. When it goes wrong, there are enough people around you to point fingers at why it went wrong – hindsight is so easy.
Doing nothing and becoming less and less competitive is easier (the boiling frog again), as in that case the outside world will be blamed and there is nobody to point fingers at (although if you understand the issue, you should make the organization aware that its future is at stake).
Why is PLM so expensive?
Assuming you are still reading, and you and your management are aligned that there is a need for PLM, a first investigation into possible solutions will reveal that PLM is not cheap.
When you calculate the overall investment required for PLM, management often gets discouraged by the estimated costs. Yes, the benefits are much higher, but to realize these benefits, you need a clear understanding of your own business and a realistic idea of what the future will look like. The benefits are not in efficiency. The main benefits come from capabilities that allow you to respond better and faster than by just optimizing your departments. I recently read a clarifying post that addresses this issue: Why PLM should be on every Executive's agenda!
From my experience with PLM projects, it is surprising to learn that companies do not object to spending 5 to 20 times more money on an ERP implementation. It is related to the topic of management by results versus management by means.
PLM is not expensive compared to other enterprise systems. It can become expensive (like ERP implementations) if you lose control. Software vendors have a business in selling software modules, like car resellers have a business in selling you all the comfort beyond the basics.
The same holds for implementation partners: they have a business in selling services to your company, and they need to find the balance between making money and delivering explainable value. Squeezing your implementation partner will cause a poor delivery. But giving them a blank check means that, at a certain moment, someone will stand up and shut down the money drain, as the results are no longer justifiable. I often meet companies at this stage; the spirit has gone. It is all about the balance between costs and benefits.
This happens in all enterprise software projects, and the only cure is investing in your own people. Give your employees the time and priority to work on a PLM project. People with knowledge of the business are essential, and you need IT resources to implement. Do not make the mistake of leaving the business uncommitted to the PLM implementation. Management and middle management often do not take the time to understand PLM, as they are too busy or not educated or interested.
Make business owners accountable for the PLM implementation – you will see stress (it is not their daily job; they are busy), but over time you will see understanding and the readiness of the organization to achieve the expected results.
We are the largest – why select the largest?
When your assignment is to select a new enterprise system, life could be easy for you. Select a product or service from the largest business and your career is saved. Nobody gets blamed for selecting the largest vendor, although if you work for a small or mid-sized company, you might think twice.
Many vendors and implementers start their message with:
“…. Market leader in ABC, thought leader in XYZ, recognized by 123”
The only thing you should learn from this message is that this company has probably delivered a trustworthy solution in the past. Looking at the past gives you an impression of its readiness and robustness for the future. Many promising companies have been absorbed by larger ones and disappeared. As Clayton Christensen wrote in The Innovator's Dilemma:
“What goes up does not go down”.
Meaning that these large companies focus on their largest clients and less on the base of the business pyramid (where the majority is), making them vulnerable to disruptive innovation.
Related to this issue there is an interesting post (and its comments), written by Oleg Shilovitsky recently: How many PLM vendors disappear in disruption predicted by Gartner.
Still, when selecting a PLM vendor, it is essential to know whether they have the scale to support you in the future and whether they have the vision to guide you into it.
The future of PLM is towards managing data in a connected manner, not necessarily coming from a single database, and not necessarily using only structured data. If your PLM vendor or implementer is pushing you towards document and file management, they are years behind and not the best choice for your future.
PLM is a big elephant
PLM is considered a big elephant, and I agree, if you address in one shot everything that PLM can do. PLM has multiple directions to start from – I wrote about it: PLM at risk – it does not have a single job
PLM has a huge advantage compared to a transactional system like ERP and probably CRM: you can implement a PLM infrastructure and its functionality step by step in the organization, starting with areas that are essential and produce clear benefits. That is the main reason PLM implementations can take 2-3 years. You give the organization time to learn, to adapt and to extend.
We lose our flexibility?
Nobody in an organization likes to be pushed into a corporate way of working, which by definition is not as enjoyable and flexible as the way you currently work. This is still an area where PLM implementations can improve: provide the user with an environment that does not feel rigid. You see this problem with old, traditional, large PLM implementations, for example at automotive OEMs. For them, it is almost impossible to switch to a new PLM implementation, as everything has been built and connected in such a proprietary way that moving to more standard systems and technologies is nearly impossible. Newer PLM implementations should learn from these lessons.
PLM vendor A says PLM vendor B will be out of business
One of the things I personally dislike is FUD (Fear, Uncertainty and Doubt). It has become common practice in politics, and I have seen PLM vendors and implementers using the same tactics. The problem with FUD is that it works. Even if the message is not verifiable, the company looking for a PLM system might think there must be some truth in the statement.
My recommendation to a company confronted with FUD during a PLM selection process: be worried about the company spreading the FUD. Apparently, they have no stronger arguments to explain why they are the perfect solution; instead, they indirectly tell you they are the least bad.
Is the future in the cloud?
I think there are two different worlds. There is the world of smaller businesses that do not want to invest in an IT infrastructure and will try anything that looks promising – often tools-oriented. This is one of my generalizations of how US businesses work – sorry for that. They will start working with cloud-based systems and not be scared off by performance, scalability or security concerns, as long as everything is easy and does not disturb the business too much.
Larger organizations, especially those domiciled in Europe, are not embracing cloud solutions at this moment. They think more in terms of private or on-premise environments than cloud solutions, as security of information is still an issue. The NSA revelations prove that there is no moral limit for collecting information in the name of security; combined with the fear of IP theft from Asia, I think European companies have a natural resistance to storing data outside of their control.
For sure you will see cloud advocates, primarily coming from the US, claiming this is the future (and they are right), but there is still work to do and confidence to be built.
PLM selection often focuses on checking hundreds of requirements coming from different departments; they want a dream system. I hope this post has convinced you that there are many other considerations relevant to a PLM selection that you should take into account. And yes, you still need requirements (and a vision).
Your thoughts?