This is, for the moment, the last post about the difference between files and a data-oriented approach. This time I will focus on the need for open exchange standards and the relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits, and I pointed to background information for those who want to learn more in detail. In my second post, I gave the example of dealing with specifications.

It demonstrated that the real value of a data-centric approach comes the moment information changes over time. For a specification that is right the first time and never changes, there is less value to gain from a data-centric approach. Moreover, aren’t we still dreaming that we do everything right the first time?

The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits apply to diagrams and schematics (2D information) and CAD models (3D information).

1D, 2D, 3D …

The challenge for a data-oriented approach is that information needs to be stored as data elements in a database, independent of an individual file format. For text, this might be easy to comprehend, as text elements are relatively simple to understand. Still, the OpenDocument standard for office documents is based, in the background, on a lot of technical know-how and experience to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.

CAD vendors have various reasons not to store their information in a neutral format.

  • First of all, and most important for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, therefore reducing the potential market capture. You could say that, in a certain manner, the Autodesk DXF format (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF format. So far DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements (see the sketch after this list).
  • This brings us to the second reason why neutral data formats are not that evident for CAD vendors: they reduce the vendor's flexibility to change the format and optimize it for maximum performance. Commercially, the significant, immediate disadvantage of working with neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any “intelligent” manipulations of the data are hard to achieve.
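To make the idea of 2D data as elements tangible, here is a minimal sketch in Python. It is only an illustration of the concept, assuming a simplified entity model; it is not how any CAD vendor actually stores DXF.

```python
import sqlite3

# Minimal sketch: store 2D entities as addressable data elements
# instead of inside an opaque file. The schema and the entity
# encoding are invented for illustration; this is not real DXF.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE entity (
                  id INTEGER PRIMARY KEY,
                  drawing TEXT, etype TEXT, layer TEXT, geometry TEXT)""")

# Two entities that would normally be buried inside one DXF file
rows = [("pump-001", "LINE",   "piping", "0,0 -> 100,0"),
        ("pump-001", "CIRCLE", "piping", "center 50,50 r=10")]
conn.executemany(
    "INSERT INTO entity (drawing, etype, layer, geometry) VALUES (?,?,?,?)", rows)

# Any application can now query individual elements without opening
# (or re-versioning) the whole drawing file.
for etype, geometry in conn.execute(
        "SELECT etype, geometry FROM entity WHERE layer = 'piping'"):
    print(etype, geometry)
```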

The same reasoning can be applied to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to identify a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo does, which again differs from NX, SolidWorks, Solid Edge and Inventor, even though some of them use the same CAD kernel.

However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing them. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.

PLM systems, ERP systems and the single source of truth

This brings us into the world of data management, in my world mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is already available as metadata, as are the scheduling and the interaction with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.

PLM systems have gradually become more and more data-centric, as their origin was around engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM side and the ERP side make it hard to exchange information from one system to the other. This is why there are so many discussions around PLM and ERP integration and the BOM.

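To illustrate why these discussions are so persistent, here is a hedged sketch of what a PLM-to-ERP item mapping can look like. All field names are invented for the example; every attribute where the two data models differ (here the revision) forces an integration decision.

```python
# Hedged sketch: translating a PLM item into an ERP material record.
# All field names are hypothetical; each real system has its own model.
plm_item = {"number": "P-1042", "revision": "B",
            "name": "Housing", "state": "Released"}

def to_erp(item):
    # The ERP side of this sketch has no revision concept, so the
    # mapping must decide how to flatten it - a typical integration choice.
    return {"material": item["number"],
            "description": item["name"],
            # e.g. only released items become visible to manufacturing
            "active": item["state"] == "Released"}

print(to_erp(plm_item))
# {'material': 'P-1042', 'description': 'Housing', 'active': True}
```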

In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide, actually an enterprise-wide or even industry-wide, data definition of all information that is relevant for the business processes. This leads to Master Data Management, the new required skill for enterprise solution architects.

The data-centric approach creates the impression that you can achieve a single source of the truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion, this is more a black-hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without ever being designed to be found again is the big challenge here.

Other PLM and ERP vendors have different approaches. Some choose a service-bus architecture, where applications in the background link and synchronize common data elements from each application. There is some redundancy; however, everything is connected. More and more PLM vendors focus on building a platform of connected data elements, on top of which applications will run, like the 3DExperience platform from Dassault Systèmes.
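A minimal sketch of the service-bus idea, assuming a naive publish/subscribe mechanism (no specific vendor's middleware): applications keep their own, slightly redundant copy of common elements and stay synchronized through published changes.

```python
from collections import defaultdict

# Minimal sketch of a service bus: applications subscribe to changes
# of common data elements and keep their own copy in sync.
# Names are illustrative, not a specific vendor's middleware.
class ServiceBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

bus = ServiceBus()
erp_items = {}  # the ERP application's redundant, synchronized copy
bus.subscribe("item.updated",
              lambda item: erp_items.update({item["number"]: item}))

# PLM releases an item; the ERP copy is updated in the background
bus.publish("item.updated", {"number": "P-1042", "state": "Released"})
print(erp_items)
```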

As users, we are more and more used to platforms, as Google and Apple already provide these platforms in the cloud for common use on our smartphones. The large number of apps run on shared data elements (contacts, locations …) and store additional proprietary data.

Platforms, Networks and standards

And here we enter an interesting area of discussion. I think it is a given that a single database concept is a utopia. Therefore, it will be all about how systems and platforms communicate with each other to provide, in the end, the right information to the user. The systems and platforms need to be data-centric, as we learned from the discussion around the document-centric (file-centric) versus the data-centric approach.

In this domain, several companies have already been active for years. Datamation from Dr. Kais Al-Timimi in the UK is such a company. Kais is a veteran in the PLM and data-modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:

“… the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.

It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.

This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”

Another company in the same domain is Eurostep, who also focus on business collaboration in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233, and has developed the Share-A-space platform to enable data-centric collaboration.

This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process industry and the construction industry are currently also discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure it will not come from the software vendors, as I discussed before.

Conclusion

If you have reached this line, it means the topic has been interesting in depth for you. In the past three posts, starting from the future trend, followed by an example and the data-modeling background, I have tried to describe in a simplified manner what is happening.

If you really want to dive into the PLM of the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15, where experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.


 

Some more to read:

https://us.sogeti.com/wp-content/uploads/2014/04/PLM-Systems-White-Paper.pdf

In my previous post, I talked about the unstoppable trend towards digital information and knowledge based on data becoming the new business paradigm.

Building knowledge from information extracted from data, instead of working with documents and the people who need to manipulate these documents.

Moreover, the reason to move towards a digital, data-oriented approach is the immense business benefits it can bring to an organization. Having online visibility of information, in the context of other information from different stakeholders, allows companies to be more proactive.

A proactive company will react faster to the market or the customer. This will reduce the waste of resources (materials / people) and therefore, in the end, make the company more competitive. This is all described in my first post, with relevant links to various global references.

In this post, I want to describe, through an example, the differences between a document-oriented and a data-oriented approach and how they affect people and business. This might give you an impression of the expected business benefits.

The ultimate goal behind a data-oriented approach is to have a single version of the truth for a product, project or plant. This can be realized by treating information as data elements in various connected databases, where on-demand reports or dashboards can be created based on actual information, instead of documents generated by duplicating data into new systems and locations. Digital data will provide paperless processes, accessible almost anywhere around the world.


As an example, I will explain the difference between document-centric and data-centric when dealing with specifications.

The specification

Everyone knows the challenge with specifications: most of the time printed documents describing how a product or service should work from the client’s point of view. There are two principles behind specifications:

  • Complexity. The more complex a product or service is, the bigger the chance that specifications are not complete or not a hundred percent understood, leading to an iterative change process. The challenge here is to manage the change and the consistency of the full specifications.
  • Industry and margins. In a repetitive business, for example automotive or other mass consumer products, products can be quite complex and, once sold, hard to maintain and repair. In a competitive business, an error in the field can consume a lot of the expected profit. In the construction industry, where most of the time single projects are executed by a chain of disciplines, the industry (still) accepts the cost overruns and the high costs of fixing issues in the field, instead of being clearer upfront during design and planning.

Let’s stay with an example in the middle of complexity and industry volume. The various stages of the process are shown in color.

The document-based specification – 1

When the document-based specification arrives, the company has to gain an understanding of the content. The project manager has a first read through the document (100+ pages) and decides to send the document (it is a PDF) to sales, engineering, legal and planning. Engineering decides to distribute the document internally to mechanical, electrical and quality (for compliance).

The project manager chases everyone on a weekly basis to deliver their responses and tries to understand whether the answers will come in time. Some meetings with the stakeholders are needed, as the overall understanding needs to be consistent. After several iterations, a response is compiled.

The data-oriented approach – 1

When the document-based specification arrives, the project leader first stores the document as a reference in the PLM system and extracts all the customer requirements as data elements in the system. While extracting the requirements, the project manager groups them into digital folders (functional / non-functional, contractual, regulations, etc.) and assigns them to the relevant stakeholders, who get notified by the system. Each of the persons assigned can distribute further; the engineering manager, for example, has distributed the discipline-specific requirements internally.
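For those who want to picture what "requirements as data elements" means, here is a minimal sketch. The fields and groups are hypothetical, not the data model of an actual PLM system.

```python
from dataclasses import dataclass, field

# Hedged sketch: a requirement as a data element instead of a paragraph
# locked inside a 100+ page PDF. The fields are illustrative only.
@dataclass
class Requirement:
    rid: str
    text: str
    group: str                 # functional, contractual, regulations, ...
    assigned_to: str
    source_doc: str = "customer-spec.pdf"        # link to the original document
    links: list = field(default_factory=list)    # related requirements / design elements

requirements = [
    Requirement("REQ-001", "Max operating temperature 80 C", "functional", "engineering"),
    Requirement("REQ-002", "CE compliance required", "regulations", "quality"),
]

# The project manager's view: who works on what, generated on demand
for r in requirements:
    print(r.rid, "->", r.assigned_to)
```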

The project manager watches the progress of the requirements analyses, which are organized around a virtual model. There is still a need for meetings with the stakeholders to agree on the solution approach. Everything is stored and visible online in the system. This visibility has helped some of the stakeholders to be better aligned upfront. In the end, the response is generated and converted to the customer’s format.

Not much benefit for step 1

If you compare the two approaches, there is mainly one person happy: the project manager. Instead of spending time collecting the status of all information, direct visibility of the response helps him/her to prioritize and focus where attention is needed, instead of discovering it on a weekly basis.

There is some small benefit from the virtual model as other stakeholders can have a better understanding of the actual progress.

However, for the rest, all stakeholders are complaining. It is difficult to work with; (Fl)Excel was much easier. Moreover, thinking in terms of a virtual model takes time, as we are not used to working this way. Typically something for aerospace, you might think.

And now the benefits come – step 2

The customer has placed the order, and the project has started. The design has begun, and people start to discover discrepancies or ambiguous demands that need to be negotiated with the customer. Is it part of the project, and if not, should it become part of the project and at what cost (for whom)?

The document-oriented approach – 2

Several engineers are now discussing the detailed interpretation of the requirements with their counterparts at the customer, either through face-to-face meetings or emails. Changes are collected and sent to the project manager, who tries to understand what has changed and how to merge it into an ongoing specification document. To avoid many revisions, he/she tries to update the document on a bi-weekly basis, sends it to the internal stakeholders for review and, with their feedback, generates a specification document for the customer that is supposed to cover the latest agreements.

Unfortunately, not all changes have reached the document: some of the stakeholders were busy and forgot to include changes agreed with the customer, buried in a lost email. Also, a previous change of a requirement was overwritten, as an update from quality used the old data. Finally, some design solutions were changed, which raised the costs. And it is not certain whether the product, with all its changes, will be compliant after delivery. However, luckily nobody has noticed so far, not even the customer.

The data-oriented approach – 2

Thanks to the virtual model and the relations between all the requirements, any change in a requirement triggers a notification in the system. When a requirement is further clarified, it is updated in the system. When a requirement needs to be changed, it is clear what the impact of the change is. A change workflow assures that decisions are made visible and approved. Changes that lead to more work can be quoted to the customer for acceptance. Luckily, the compliance engineer noted that the change of materials used would lead to a compliance issue. On a bi-weekly basis, the project manager generates an agreed specification for the customer based on the data in the system.
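The claim that "it is clear what the impact of the change is" rests entirely on the links between the data elements. A small sketch, with invented link data: impact analysis then becomes a simple traversal of those links.

```python
# Hedged sketch: impact analysis as a traversal of the links between
# data elements. The link data is invented for the example.
links = {
    "REQ-001":   ["DESIGN-77", "TEST-12"],
    "DESIGN-77": ["PART-501"],
}

def impacted(element, seen=None):
    """Collect every element reachable from a changed element."""
    seen = set() if seen is None else seen
    for nxt in links.get(element, []):
        if nxt not in seen:
            seen.add(nxt)
            impacted(nxt, seen)
    return seen

# Changing REQ-001 flags the design solution, its part and the test
print(impacted("REQ-001"))   # {'DESIGN-77', 'TEST-12', 'PART-501'}
```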

Benefits are growing.

The project manager remains the happiest person and is even happier as less discussion is needed about who changed what and why, or about changes that should exist but cannot be found. The time saved could be used to collaborate even better with the teams (without annoying them), or perhaps to manage a second project in parallel.

Other stakeholders start to enjoy the data-oriented approach too: less ambiguity on their side and fewer iterations caused by changes that were not apparent. As all information is related to the virtual model online, the actual status is clear when making a decision. Less fixing afterwards, and luckily still project meetings between the stakeholders to synchronize. The PLM system does not eliminate communication; it provides a reliable baseline of the truth. No need (and no option) to look in your own archives.

At this stage, the benefits start to become clear. Fewer iterations and better decisions will have an impact on costs and project stress. A remaining complaint from the engineers might be that they need to do too much upfront thinking, although some years later they might discover that this has become their main job. Fixing issues from the past has diminished.

And then the ultimate benefits

Now the project has reached the physical stage. It is manufactured or under commissioning.

The document-oriented approach – 3

In the document-oriented approach, many issues might pop up because they were not considered in the early phase, or they got lost during document exchanges. Does the product work as specified? Is the building certified as specified?

The customer is king, and for manufacturing companies this might lead to product recalls or launch delays. In the construction world, people in the field will fix the issues, consuming skilled resources and creating a waste of materials and/or resources.

Data handover to the owner is a nightmare for a project-centric delivery. Several people have been searching through documents, specifications and emails to build and compile the required handover documentation.

The data-oriented approach – 3

In the data-centric approach, the physical product behaves as expected, as most of the issues have been solved in the virtual model. When testing the product, it works as specified, as the specifying requirements have always been linked to the product. Moreover, they have been agreed and approved by the relevant stakeholders. Where relevant, the customer has paid for the extra work specified.

The handover process was not as stressful as it was with the document-oriented approach. As the required information was known and specified upfront, related to the requirements, the maturity process of the virtual model assured that this data exists in the system. Now the as-built information matches the as-specified information. What a relief.

Conclusion

It is clear that the most significant benefits can be found in step 3. I wrote the comparison in an extreme manner, knowing that reality lies in the middle. Excellent people can comprehend and fix more upfront because of their experience. Building the ultimate virtual model is not yet an easy achievement either.

The savings in materials and required resources are significant in a data-oriented approach. The time savings and the quality enhancements might change your company into a market leader. The cost savings achieved through a proactive approach will make your margin grow (unless the competition does the same) and enable you to innovate.

One final remark on business change

If business change could be achieved by just selecting the right tool or system, you would never get a competitive advantage, as your competitors can buy these tools too.

However, if you change to a data-centric approach, it will be a tough change process, and therefore, once implemented, you will leave behind the competitors that keep hanging on to the past.

My holidays are over. After reading and cycling a lot, it is time to focus again on business and the future. Those of you who have followed my blog the past year must have noticed that I have been talking on a regular basis about business moving to a data-oriented approach instead of a document / file-based approach. I wrote an introduction to this topic at the beginning of this year: Did you notice PLM has been changing?

It is part of a bigger picture, which some people might call the Second Machine Age, Industry 4.0, The Third Wave or, even more disturbing, The onrushing wave.

This year I have had many discussions around this topic with companies acting in various industries: manufacturing, construction, oil & gas, nuclear and general EPC-driven companies. There was some commonality in all these discussions:

  • PLUS: Everyone believes it is a beautiful story and it makes sense
  • MINUS: Almost nobody wants to act upon it, as it is an enormous business change, and to change the way a company works you need C-level understanding
  • PLUS: Everyone thinks the concept is clear to them
  • MINUS: Few understand what it means to work data-oriented and what the impact on their business would be

Therefore, what I will try to do in the upcoming blog posts (two, three, four?) is to address the two negative observations and make them more precise.

What is data / information / knowledge?

Data, for me, is a collection of small artifacts (numbers, characters, lines, sound bits, …) which have no meaning at all. These could be bundled together as a book, a paper drawing or a letter, but also in a digital format like an eBook, a CAD file or an email; even the transmission bytes of a network / internet provider can be considered data.

Data becomes significant once provided in the context of other data. At that point, we start calling it information. For that reason, a book or a drawing provides information, as the data has been structured in such a manner that it becomes meaningful. The data sent through the network cable only becomes information when it is filtered and stripped of the irrelevant parts.

Information is used to make decisions based on knowledge. Knowledge is the interpretation of information which, combined in a particular way, helps us to make decisions. And the more decisions we make, and the more information we have about the results of these decisions, made either by us or by others, the more our knowledge increases.

 

Data and big data

Now we have some feeling for data, information and knowledge. For academics, there is room to discuss and enhance the definitions. I will leave it at this simple definition.

Big data is the term for all digital data that is too large to handle in a single data management system, yet available and searchable through various technologies. Data can come from any source around the world, as through the internet an infrastructure exists to filter and search for particular data.

By analyzing and connecting the data coming from these various sources, you can generate information (placing the data in context) and build knowledge. As it is an IT-driven activity, this can be done in the background, giving almost real-time data to any person. This is a big difference with information handling the old way, where people had to collect and connect the data manually.

The power of big data applies to many business areas. If you know how your customers are thinking and how they associate their needs with your products, you can make the products better and more targeted to your potential market. Or, if you know how your products are behaving in the field during operation (Internet of Things), you can provide additional services and instant feedback and be more proactive. Plus, the field data, once analyzed, provides actual knowledge, helping you to make better products or offer more accurate services.

Wasn’t there big data before?

Yes, before the big data era there was also a lot of information available. This information could be stored in “analogue” formats (microfiche, paper, clay tablets, papyrus) or in digital formats, better known as files or collections of files (doc, pdf, CAD files, ZIP, …).

Note the difference: here I am speaking about information, as the data is contained in these formats.

You have to open, or be in front of, the information container first before seeing the data. In the digital world, this is often called document management or content management. The challenge with these information containers is that you need to version the whole container once you modify one single piece of data inside it. And each information container holds information duplicated from a data element. Therefore, it is hard to manage a “single version of the truth” approach.
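A toy illustration of the difference, nothing more than that: when data lives inside a container, one changed value forces a new version of the whole container; when it lives as separate elements, only the changed element gets a new version.

```python
# Toy comparison, illustrative data only.

# Document-oriented: one changed value revs the whole container.
document = {"version": 3, "material": "steel", "weight": "12 kg", "pages": 120}
document["material"] = "aluminium"
document["version"] += 1        # every consumer now sees a "new document"

# Data-oriented: only the affected element revs; the other elements
# keep their own history and stay uniquely referenceable.
elements = {
    "material": {"version": 2, "value": "steel"},
    "weight":   {"version": 1, "value": "12 kg"},
}
elements["material"] = {"version": 3, "value": "aluminium"}

print(document["version"])      # 4 - the container moved on as a whole
print(elements["weight"])       # unchanged: {'version': 1, 'value': '12 kg'}
```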

And here comes the data-oriented approach

The future is about storing all these pieces of data inside connected data environments, instead of storing a lot of data inside a (versioned) information container (a file / a document).

Managing these data elements in the context of each other allows people to build information from any viewpoint: project-oriented, product-oriented, manufacturing-oriented, service-oriented, etc.

The data remains unique, therefore supporting the single-version-of-the-truth approach much more closely. Personally, I consider the single version of the truth a utopia; however, reducing the amount of duplicated data through a data-oriented approach will bring a lot more efficiency.

In my next post, I will describe an example of a data-oriented approach and how it impacts business, both from the efficiency point of view and from the business transformation point of view. The data-oriented approach can have immense benefits. However, they do not come easy: you will have to work differently.

Some more details

An important point to discuss is that this data-oriented approach requires a dictionary, describing the primary data elements used in a certain industry. The example below demonstrates a high-level scheme for a plant engineering environment.

[Image: high-level plant data model]
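To give a feel for what such a dictionary could contain, here is a hedged sketch with a few invented plant-engineering element types; the real definitions would come from an industry standard such as ISO 15926, not from this example.

```python
# Hypothetical fragment of an industry dictionary: each element type
# has a fixed set of attributes that all participating tools agree on.
DICTIONARY = {
    "Plant":     ["name", "location"],
    "System":    ["name", "plant", "discipline"],
    "Equipment": ["tag", "system", "supplier", "weight_kg"],
}

def validate(etype, element):
    """Accept only elements that match the shared definition."""
    missing = [attr for attr in DICTIONARY[etype] if attr not in element]
    if missing:
        raise ValueError(f"{etype} is missing {missing}")
    return element

pump = validate("Equipment", {"tag": "P-101", "system": "cooling",
                              "supplier": "ACME", "weight_kg": 430})
print(pump["tag"])
```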

Data standards exist, or are emerging, in almost any industry, and they are crucial for the longevity and usage of the data. I will touch on this briefly in one of the upcoming posts; however, for those interested in this topic in relation to PLM, I recommend attending the upcoming PDT Europe. If you look at the agenda, there is a place to learn and discuss a lot about the future of PLM.


I hope to see you there.

Last week I attended the PI Apparel conference in London. It was the second time this event was organized, and approximately 100 participants were there for two full days of presentations and arranged network meetings. Last year I was extremely excited about this event, as the audience, compared to classical PLM events, was different and much more business-focused.

Read my review from last year here: The weekend after PI Apparel 2013

This year I had the feeling that the audience was somewhat smaller, missing some of the US representatives, and perhaps there was a slightly more visible influence from the sponsoring vendors. Still an enjoyable event, and hopefully next year, when it will be hosted in New York, it will be as active as last year.

Here are some of my observations.

Again the event had several tracks in parallel besides the keynotes, and I look forward in the upcoming month to seeing the sessions I could not attend. Obviously, where possible, I followed the PLM-focused sessions.


The first keynote came from Micaela le Divelec Lemmi, Executive Vice President and Chief Corporate Operations Officer of Gucci. She talked us through the areas she supervises and gave some great insights. She talked about how Gucci addresses sustainability through risk and cost control: which raw materials to use, how to ensure the brand’s reputation is not at risk, price volatility and the war on talent. As Gucci is a brand in the high-end price segment, image and reputation are critical, and they have the margins to assure these are managed. Micaela also spoke about the short-term financial goals that a company like Gucci has towards its investors. The topics she mentioned (I did not write them down, as I was tweeting when I heard them) were certainly worthwhile to consider and discuss in detail with a PLM consultant.


Micaela further described Gucci’s corporate social responsibility program, with a focus on taking care of people, environment and culture. Good to learn that humane working conditions and rights are a priority, even for their supply chain, although it might be noted that 75% of Gucci’s supply chain is in Italy. One of the few brands that still has the “Made in Italy” label.

My conclusion was that Micaela did an excellent PR job for Gucci, which you would expect for a brand with such a reputation. Later during the conference we discussed whether other brands, with less exclusivity and operating more in the mass consumer domain, would be able to come even close to such programs.


Next, Göktug and Hakan gave us their insights from deploying their first PLM system at the AYDINLI group.

The company is successful in manufacturing and selling licensed products from Pierre Cardin, Cacharel and US Polo Association, mainly outside the US and Western Europe.

Their primary focus was to provide access to the most accurate and up-to-date information from one source. In parallel, standardization of codes and tech packs was a driver. Through standardization, quality and (re)use could be improved, and people would better understand the details. Additional goals were typical PLM goals: following the product development stages along the timeline, notifying relevant users about changes in the design, working on libraries and reuse, and integrating with SAP.

Interestingly, Hakan mentioned that in their case SAP did not recommend using their system for the PLM-related part, due to a lack of knowledge of the apparel industry. A wise decision that would deserve follow-up in other industries.

In general, the PLM implementation described by Göktug and Hakan was well phased, with a top-down push to ensure there was no escape from making the change. As with all PLM implementations in apparel, they went live with their first phase rather fast, as the complex CAD integrations of classical PLM implementations were not needed here.


Next I attended the Infor session with the title Work the Way you Live: PLM built for the User. A smooth marketing session with a function / feature demo demonstrating the flexibility and configuration capabilities of the interface. Ease of use is crucial in the apparel industry, where Excel is still the biggest competitor. Excel might satisfy the needs of the individual, but it lacks the integration and collaboration aspects a PLM system can offer.


More interesting was the next session I attended, from Marcel Oosthuis, who was responsible as Process Re-Engineering Director (read: PLM leader) at Tommy Hilfiger. Marcel described how they had implemented PLM there, and it was an excellent story (perhaps too good to be true).

I believe larger companies with the right focus and investment in PLM resources can achieve these kinds of results. The target for Tommy Hilfiger’s PLM implementation was beyond 1000 users, therefore a serious implementation.

Upfront, the team first defined what they expected from the PLM system to be selected (excellent!). As the fashion industry is fast, demanding and changing all the time, the PLM system needs to be swift, flexible and prepared for change. This was not a classical PLM requirement.

In addition, they were looking for a highly configurable system, providing best practices, and a vendor with a roadmap they could influence. Here I got a little more worried, as highly configurable and best practices do not always match the prepared-for-change approach. A company might be tempted to automate the way it used to work (best practices from the past).

It was good to hear that Marcel did not have to go through the classical ROI approach for the system. His statement, which I fully endorse, was that it is about the capability to implement new and better processes. These are often not comparable with the past (and nobody measured the past).

Marcel described how the PLM team (eight people plus three external people from the PLM vendor) made sure that the implementation was done with the involvement of the end users. End-user adoption was crucial, as was key-user involvement when building and configuring the system.

It was one of the few PLM stories where I heard how all levels of the organization were connected and involved.


Next, Sue Butler, director at Kurt Salmon, described how to maximize the ROI from your PLM investment. It is clear that many PLM consultants are aligned, and Sue brought up all the relevant points and angles you need to look at for a successful PLM implementation.

Main points: PLM is about changing the organization and processes, not about implementing a tool. She made the point that piloting the software is necessary as part of the learning and validation process. I agree with that, under the condition that it is an agile pilot that does not take months to define and perform. Otherwise, you might already be locked into the tool vision too much; focus on the new processes you want to achieve.

Moreover, because Sue was talking about maximizing the ROI from a PLM implementation, the topics of focusing on business areas that support evolving business processes and of measuring (make sure you have performance metrics) came up.


The next session, Staying Ahead of the Curve through PLM Roadmap Reinvention, conducted by Austin Mallis, VP Operations, Fashion Avenue Sweater Knits, beautifully completed the previous sessions related to PLM.

Austin talked nicely about setting the right expectations for the future (there is no perfect solution / success does not mean stop / keeping the PLM vision / no true end). In addition, he described the human side of the implementation: how to get everyone on board (if possible), and admitting that you cannot get everyone on board for the new way of working.


Next in row was my presentation, with potentially the longest title: “How to transform your Business to ensure you Benefit from the Value PLM can deliver”.

Luckily, the speakers before me that day had already addressed many of the relevant topics, and I could focus on three main thoughts completing the story:

1. Who decides on PLM and Why?

I published the results of a small survey I held a month ago via my blog (A quick PLM survey). See the main results below.

[Image: survey results]

It was interesting to observe that the management and the users in the field together form the majority demanding PLM. Consultants have some influence, and PLM vendors even less. The big challenge for a company is that the management and consultants often talk about PLM from a strategic point of view, whereas the PLM vendor and the users in the field are more focused on the tool(s).

From the expectations, you can see that the majority of PLM implementations are about improving collaboration, followed by time to market, increasing quality, and centralizing and managing all related information.

2. Sharing data instead of owning data

(You might have read about it several times in my blog.) The trend is that we move to platforms with connected data instead of file repositories. This should have an impact on your future PLM decisions.

3. Choosing the right people

The third and final thought was about choosing the right people and understanding the blockers. I elaborated on that topic before in my recent blog post: PLM and Blockers.

My conclusions for the day were:

A successful PLM implementation requires a connection in communication and explanation between all these levels. This, to get a company aligned and to have an anchored vision before even starting to implement a system (with the best partner).


The day was closed by the final keynote from Lauren Bowker, heading T H E U N S E E N. She and her team are exploring combinations of chemistry and materials to create new fashion artifacts: clothes and materials that change color based on air flow, air pollution or brain patterns. New and inspiring directions for fashion lovers.

Have a look here: http://seetheunseen.co.uk/


The morning started with Suzanne Lee, heading BioCouture, who is working on various innovative methodologies to create materials for the apparel industry using all kinds of living micro-organisms, like bacteria, fungi and algae, and materials like cellulose, chitin and protein fibers, which can all provide new possibilities for sustainability, comfort, design, etc. Suzanne’s research is about exploring these directions, perhaps shaping some new trends 5 to 10 years into the future. Have a look into the future here:


Renate Eder took us on the journey of visualization within Adidas with her session Utilizing Virtualization to Create and Sell Products in a Sustainable Manner.

It was interesting to learn that ten years ago she started the process of getting more 3D models into the sales catalogue. Where classical manufacturing companies nowadays start from a 3D design, at Adidas 3D starts at the end of the sales cycle. Logical, if you see the importance and value 3D can have for mass-market products.

Adidas was able to get 16,000 3D models into their catalogue thanks to the work of 60 of their key suppliers, who were fully integrated into the catalogue process. The benefit of this 3D catalogue is that their customers, often the large stores, need fewer samples, and the savings are significant here (plus a digital process instead of transferring goods).

An interesting discussion during the Q&A was that the virtual product might even look more perfect than the real product, demonstrating how lifelike virtual products can be.

And now Adidas is working further backwards, from production patterns (using 3D) to, ultimately, 3D design. Although a virtual 3D product cannot 100% replace the fit and material feeling, Renate believes that introducing 3D during design can also reduce the work done during pilots.


Finally, for those who stayed till the end, there was something entirely different: Di Mainstone elaborating on her project Merging Architecture & the Body in Transforming the Brooklyn Bridge into a Playable Harp. If you want something entirely different, watch here:

Conclusion

The apparel industry remains an exciting industry to follow. Some of the concepts (being data-centric, insanely flexible, continuous change and rapid time to market) are crucial here.

This might lead the development of PLM vendors for the future, including delivering PLM based on cloud technology.

On the other side, the PLM market in apparel is still very basic and learning; see this card that I picked up from one of the vendors. Focus on features and functions, not touching the value (yet).

[Image: vendor card with features and functions]


Friends, this is the first evening in two weeks without soccer on television, so I have time to write something.

Currently, I am preparing my session for PI Apparel 2014 in London on 15/16 July. Last year’s PI Apparel was a discovery for me, as the audience was so different compared to classical PLM conferences. Of course, the products might not be as complex, but the time-to-market needs and, therefore, the need to work as fast and concurrently as possible are a huge differentiator. See my post from last year’s conference here: The weekend after PI Apparel 2013

In a way, PLM for apparel companies is more data-centric than in some of the original industries where PLM was born, where file management and document sharing were the initial drivers.

At this year’s conference, I will talk about the change in the way people work that comes with a PLM implementation. I will share the full story plus my observations in my next post, at the end of July.

Before that, I have a request for all readers of this blog who are working for a company that has implemented or is starting to implement PLM: answer the two questions in the survey below. The answers will help me to confirm my prejudgments or change my mind.

So if you have some time between the soccer matches, please respond to the survey below if you qualify:

 

Click here to answer a quick survey before PI Apparel

 

Thanks and enjoy the upcoming matches



Two weeks ago I attended the Nobletek PLM forum in Belgium, where a group of experts, managers and users discussed topics related to my favorite theme: “Is PLM changing?”

Dick Terleth (ADSE) led a discussion with the title “PLM and Configuration Management as a proper profession” or “How can the little man grow?”. The context of the discussion was the question: “How is it possible that the benefits of PLM (and Configuration Management) are not understood at C-level?”, or in other words: “Why is the value of Configuration Management and PLM not obvious?”

In my previous post, PLM is doomed unless …, I quoted Ed Lopategui (www.eng-eng.com), who commented that being a PLM champion (or a Configuration Management expert, as Dick Terleth would add) is bad for your career. Dick Terleth asked the same question, showing pictures of the self-assured accountant and of the Configuration Management or PLM professional (thanks, Dick, for the pictures). Which job would you prefer?

[Image: the accountant versus the CM/PLM professional]

The PLM ROI discussion

A first attempt to understand the difference could be related to the ROI discussion, which seems to apply only to PLM. Apparently, ERP and financial management systems are a must for companies; no ROI discussion there. Persons who can control and report the numbers seem to have the company under control. For the CEO and CFO, the value of PLM is often unclear. And to make it worse, PLM vendors and implementers are fighting for their own unique definition of PLM, so we cannot blame companies for being confused. This makes it clear that if you haven’t invested significant time in understanding PLM, it will be hard to see the big picture. And at C-level, people do not invest significant time in understanding the topic. It is the C-level person’s education, background or work experience that makes him/her decide.

So if the C-level is not educated on PLM, somebody has to sell the value to them. Oleg Shilovitsky wrote about it recently in his post Why is it hard to sell PLM ROI, and another respected blogger, Joe Barkai, sees the sun come up from behind the cloud in his latest post PLM Service Providers Ready To Deliver Greater Value. If you follow the posts of independent PLM bloggers (although who is 100% independent?), you will see a common understanding that implementing PLM currently requires a business transformation, as the old processes were not designed for a modern infrastructure and digital capabilities.

PLM is about (changing) business processes

Back to the Nobletek PLM forum. Douglas Noordhoorn, the moderator of the forum, challenged the audience, stating that PLM has always been there (or not there, if you haven’t discovered it). It is all about managing the product development processes in a secure way. Not talking about “best practices” but “good practices”. Those who had a proper education in the aerospace industry learned that good processes are crucial to deliver planes that can fly and are reliable.

Of course, the aerospace industry is not the same as other industries. However, more and more other industries in my network, like nuclear new-build, the construction industry or other Engineering, Procurement and Construction companies, want to learn from aerospace and automotive good practices. They realize they are losing market share, as the cost of failure combined with relatively high labor costs makes them too expensive. But where do they get their proper good-practices education?

The PLM professional?

And this was an interesting point coming up from the Nobletek forum: there is no proper, product-agnostic education for PLM (anymore). If you study logistics, you will learn a lot about various processes and how they can be optimized for a certain scenario. When you study engineering, there is a lot of focus on engineering disciplines and methods, but there is no time to educate engineers in depth to understand the whole product development process and how to control it. Sometimes I give a guest lecture to engineering classes. It is never an important part of the education.

To become a PLM professional

For those who never had any education in standard engineering processes, there is Frank Watts’ engineering control book, which would probably be a good base. But it is not only the PLM professional who should be aware of the good practices. Moreover, all companies manufacturing products, plants or buildings should learn these basics. As a side step, it would make the discussion around BIM clearer. At this time, manufacturing companies are discovering their good practices the hard way, every time again.

And when this education exists, companies will be aware that it is not only about the tools; it is about the way the information flows through the organization. There is even a chance that somewhere at C-level someone has been educated and understands the value. For ERP everyone agrees; for PLM, it remains a labyrinth of processes, currently designed by companies learning on the job, with vendors and implementers pushing what they have learned. Engineering is often considered a hard-to-manage discipline. As a SAP country manager once said to me: “Engineers are actually resources that do not want to be managed, but we will get them …”

And then the future …

I support the demand for better education in engineering processes, especially for industries outside aerospace or automotive. I doubt it will have a significant impact, although it might create visibility and understanding for PLM at C-level. No need anymore for the lone ranger who fights for PLM. Companies will have better-educated people that understand the need for the good practices that exist. These good practices will be the base for companies when discussing with PLM vendors and implementers. Instead of vendors and implementers pushing their vision, you can articulate, and follow, your own vision.

However, we need a new standard book too. We are currently in the middle of a big change. Thanks to modern technology and connectivity, the world is changing. I wrote and spoke about it in: Did you notice PLM is changing?

This is a change of generations and concepts which has not been foreseen by Frank Watts and others. What will be the new standard for data-centric companies, instead of document-based control?

The digital revolution is here (Industry 4.0), and here (digital revolution), and here (the third industrial revolution).

 

This awareness needs to become visible at C-level.
Who will educate them?

 


Now back to soccer. Four years ago, Spain – The Netherlands was the last match, the final. Now it is their first match. Will the Dutch change the game?

Human beings are a strange kind of creature. We think we make decisions based on logic, and we think we act based on logic. In reality, however, we do not like to change if it does not feel good, and we are lazy in changing our habits.

Disclaimer: this is a generalization, valid for 99% of the population. So if you feel offended by the previous statement, be happy, as you are one of the happy few.

Our inability to change can be seen in the economy (only the happy few share). We see it in relation to global climate change. We see it in territorial fights all around the world.

Owning instead of sharing?

The cartoon below gives an interesting insight into how personal interests are perceived as more important than the general interest.

[Cartoon: personal interest versus general interest]

It is our brain!

More and more I realize that the success of PLM is also related to this human behavior: we like to own and find it difficult to share. PLM is primarily about sharing data through all stages of the lifecycle. A valid reason why sharing is rare is that current PLM systems and their infrastructures are still too complex to deliver shared information with ease. However, the potential benefits are clear when a company is able to transform its business into a sharing model and can therefore react to and anticipate the outside world much faster.

But sharing is not in our genes, as:

  • In current business, knowledge is power. Companies fight for their IP; individuals fight for their job security by keeping some specific IP to themselves.
  • As biological organisms, composed of a collection of cells, we are focused on the survival of our genes. Own body / family first is our biological message.

Breaking these habits is difficult, and I will give some examples that I noticed over the past few weeks. Of course, this is not a complete surprise for readers of my blog, as a large number of my recent posts are related to the complexity of change. Some are related to human behavior:

August 2012: Our brain blocks PLM acceptance
April 2014: PLM and Blockers

Ed Lopategui, an interesting PLM blogger (see http://eng-eng.com), wrote a long comment on my PLM and Blockers post. The (long) quote below describes exactly what makes PLM difficult to implement in a company full of blockers:

“I also know that I was focused on doing the right thing – even if cost me my position; and there were many blockers who plotted exactly that. I wore that determination as a sort of self-imposed diplomatic immunity and would use it to protect my team and concentrate any wrath on just myself. My partner in that venture, the chief IT architect admitted on several occasions that we wouldn’t have been successful if I had actually cared what happened to my position – since I had to throw myself and the project in front of so many trains. I owe him for believing in me.

But there was a balance. I could not allow myself to reach a point of arrogance; I would reserve enough empathy for the blockers to listen at just the right moments, and win them over. I spent more time in the trenches than most would reasonably allow. It was a ridiculously hard thing and was not without an intellectual and emotional cost.

In that crucible, I realized that finding people with such perspective (putting the ideal above their own position) within each corporation is *exceptionally* rare. People naturally don’t like to jump in front of trains. It can be career-limiting. That’s kind of a problem, don’t you think? It’s a limiting factor without a doubt, and not one that can be fulfilled with consultants alone. You often need someone with internal street cred and long-earned reputation to push through the tough parts”

Ed concludes that it is exceptionally rare to find people putting the ideal above their own position. This refers again to the opening statement that only a (happy) few are advocates for change.

Now let’s look at some facts explaining why it is so exceptionally rare, so we can feel less guilty.

On Intelligence

Last month I read the book On Intelligence by Jeff Hawkins, written with Sandra Blakeslee. (Thanks to Joost Schut from KE-Works for pointing me to this book.)

Although it was not the easiest book to read during a holiday, it was well written considering the complexity of the topic discussed. Jeff describes how the information architecture of the brain could work, based on the layering of the neocortex.

In his model, he describes how the brain processes information from our senses, first in a specific manner but then more and more in an invariant approach. You have to read the book to get the full meaning of this model. The eye-opener for me was that Jeff describes the brain as a prediction engine. All the time, the brain anticipates what is going to happen, based on years of learning. That is why we need to learn and practice: to build and enrich this information model.

And the more specialized you are in a particular topic, whether it is knowledge or a motoric skill, the deeper in the neocortex this pattern is anchored. This makes it hard to change (bad) practices.

The book goes much further, and I was reading it more in the context of how artificial intelligence or brain-like intelligence could support the boring PLM activities. I got nice insights from it. However, the main side observation was: it is hard to change our patterns. So if you are not aware of it, your subconscious will always find reasons to reject a change. Follow the predictions!

Thinking, Fast and Slow

And this is exactly the connection with another book I read before: Thinking, Fast and Slow by Daniel Kahneman. Daniel explains that our brain runs its activities on two systems:

System 1 makes fast and automatic decisions based on stereotypes and emotions. System 1 is what we use most of the time, often running in subconscious mode. It does not cost us much energy to run in this mode.

System 2 takes more energy and time; therefore, it is slow and pushes us to be conscious and alert. Still, System 2 can be influenced by various external, subconscious factors.

Thinking, Fast and Slow nicely complements On Intelligence: System 1, as described by Daniel Kahneman, is similar to what Jeff Hawkins describes as the prediction engine. It runs in subconscious mode, with optimal energy consumption, allowing us to survive most of the time.

Fast thinking leads to boiling frogs

And this links again to the boiling frog syndrome. If you are not familiar with the term, follow the link. In general, it means that people (and businesses) do not react to (life-threatening) outside change when it comes slowly, but would react immediately if they were confronted with the end result (no more business / no more competitive position).

Conclusion: our brain by default wants to keep business in predictive mode, so implementing a business change is challenging, as all changes are painful and go against our subconscious system.

So PLM is doomed, unless we change our brain behavior?

The fact that we are not living in caves anymore illustrates that there have always been those happy few who took a risk and a next step into the future by questioning and changing comfortable habits. Daniel Kahneman’s System 2, and also Jeff Hawkins, talk about the energy it takes to change habits and to learn new predictive mechanisms. But it can be done.

I see two major trends that will force the classical PLM to change:

  • The amount of connected data has become so huge that it no longer makes sense to store and structure all the information in a single system. The time required to structure data does not deliver enough ROI in a fast-moving society. The old “single system that stores all” concept is dying.
  • The newer generations (generation Y and beyond) grew up with the notion that it is impossible to learn, capture and own all specific information. They developed different skills to interpret data available from various sources, not necessarily owning and managing it all.

These two trends lead to the point where it becomes clear that the future of single-system thinking is obsolete. It will be about connectivity and the interpretation of connected data, used by apps running on a platform. The openness of a platform towards other platforms is crucial and will be the weakest link.

Conclusion:

The PLM vision is not doomed, and with a new generation of knowledge workers the “brain change” has started. The challenge is to implement the vision across systems and silos in an organization. For that, we need to be aware that it can be done, and to find the “happy few” in your company to enable it.

 


What do you think?

The past month I had several discussions related to the complexity of PLM. Why is PLM perceived as complex? Why is it hard to sell PLM internally in an organization? Or, to phrase it differently: what makes PLM so difficult for normal human beings, as conceptually it is not so complex?

So what makes it complex? What is behind PLM?

The main concept behind PLM is that people share data. It can be around a project, a product or a plant, through the whole lifecycle. In particular during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature. You could decide to wait till everything is mature before sharing it with others (the classical sequential manner); however, the chance of doing it right the first time is low. Several iterations between disciplines will be required before the data is approved. The more a company works sequentially, the higher the costs of changes are and the longer the time to market. Due to the rigidness of this sequential approach, it becomes difficult to respond rapidly to customer or market demands. Therefore, in theory (and it is not a PLM theory only), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on not-yet-approved data.

PLM goes further. It is also about sharing data, and as it started originally in the early phases of the lifecycle, the concept of PLM was often considered something related to engineering. And to be fair, most of the (CAD-related) PLM vendors have a high focus on the early stages of the lifecycle and strengthen this idea. However, sharing can go much further, e.g. early involvement of suppliers (still engineering) or support for after-sales/services (the new acronym SLM). In my recent blog posts I discussed the concepts of SLM and the required data model for that.

Anticipated sharing

The complexity lies in the word "sharing". What does sharing mean for an organization where historically every person was rewarded for the knowledge he/she owned, instead of being rewarded for the knowledge this person made available and shared? Many so-called PLM implementations have failed to reach the sharing target, as the implementation focus was on storing data per discipline and not necessarily on storing data so that it becomes shareable and usable by others. This is a huge difference.

Some famous (ERP) vendors claim that if you store everything in their system, you have a "single version of the truth". Sounds attractive. My garbage bin at home is also a single place where everything ends up, but a garbage bin has not been designed for sharing, as another person has no clue about, and no time to analyze, what's inside. Even data in the same system can remain hidden from others when the way to find it was not anticipated.

Data sharing instead of document deliverables

The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose but, with some extra effort, makes the data usable and searchable for others. A typical example is drawing and document management, where the whole process for a person is focused on delivering a specific document. Fine for that purpose, but this document on its own becomes a legacy for the long term, as you need to know (or remember) what's inside the document.

A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can be connected and restructured easily through reports and dashboards, thereby providing specific views for different roles in the organization. Sharing becomes possible, and it can be online. Nobody needs to consolidate and extract data from documents (Excels?) anymore.
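To make this tangible, here is a minimal sketch of the idea (illustrative Python, all names hypothetical, not any real PLM API): the same shared data elements, stored once, are filtered into role-specific views instead of being copied into documents.

```python
# Minimal sketch: shared data elements instead of documents.
# Each element carries typed properties; a "view" is just a filter over the same data.

elements = [
    {"id": "P-100", "kind": "3D model",    "owner": "design",   "status": "released"},
    {"id": "R-201", "kind": "requirement", "owner": "systems",  "status": "approved"},
    {"id": "L-310", "kind": "logistical",  "owner": "planning", "status": "draft"},
]

def view_for(role: str) -> list:
    """Return the subset of shared elements relevant for a given role."""
    relevant = {
        "designer": {"3D model", "requirement"},
        "planner": {"logistical"},
    }
    return [e for e in elements if e["kind"] in relevant.get(role, set())]

print(view_for("designer"))  # same data, design-oriented view
print(view_for("planner"))   # same data, planning-oriented view
```

No consolidation step is needed: each role queries the shared elements directly instead of waiting for someone to extract them into a document.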

This does not fit older generations and departmentally managed business units that are rewarded only for their individual efficiency. Have a look at this LinkedIn discussion, where the two extremes are visible.

Joe stating:

“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”

And at the other side Kais stating:

“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”

Readers of my blog will understand that I am very much aligned with Kais, and PLM people will have a hard time convincing Joe of the benefits of PLM (I did not try).

Making the change happen

Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas about what would be possible. However, the outside world is hard to convince, and here it is about change management.

I read an interesting article in IndustryWeek by John Dyer with the title: What Motivates Blockers to Resist Change?

John describes the various types of blockers, and reading the article with my PLM-twisted brain, I understood once more why PLM is perceived as complex – you need to change, and there are blockers:

Blocker (noun): someone who purposefully opposes any change (improvement) to a process for personal reasons.

“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.

The problem with blockers

The combination of business change and the existence of blockers is one of the biggest risks for companies going through a business transformation. By the way, this is not only related to PLM; it applies to any required change in business.

Some examples:

A company I have been working with was eager to study their path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers of the change. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.

A second example is Nokia. Nokia was famous for the way it was able to transform its business in the past. How come they did not see smartphones and touch screens coming? Apparently, based on several articles published recently, it was Nokia's internal culture and the feeling of superiority from dominating the market that made it impossible to switch. The technology was known, the concepts were there; however, the (middle) management was full of blockers.

Two examples where blockers had a huge impact on the company.

Conclusion:

Staying in business and remaining competitive is crucial for companies. In particular, the changes currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.


In my previous post, I wrote about the different ways you could look at Service Lifecycle Management (SLM), which, I believe, should be part of the full PLM vision. The fact that this does not happen is probably because companies buy applications to solve issues instead of implementing a consistent company-wide vision (when and where to start is the challenge). Oleg Shilovitsky just referred one more time to this phenomenon – Why PLM is stuck in PDM.

I see PLM as the enterprise information backbone for product information. I will discuss the logical flow of data that might be required in a PLM data model to support SLM. Of course, all of it should be interpreted in the context of the kind of business your company is in.

This post is probably not the easiest to digest, as it assumes you are somewhat aware of and familiar with the issues relevant to the ETO (Engineering To Order) / EPC (Engineering Procurement Construction) / BTO (Build To Order) businesses.

A collection of systems or a single device

The first significant differentiation I want to make is between managing an installation and managing a single device; I will focus only on installations.

An installation can be a collection of systems, subsystems, equipment and/or components, typically implemented by companies that deliver end-to-end solutions to their customers. A system can be an oil rig, a processing production line (food, packaging, ...) or a plant (processing chemicals, nuclear materials), where maintenance and service can be performed on individual components, providing full traceability.

Most of the time a customer-specific solution is delivered, either directly or through installation/construction partners. This is the domain I will focus on.

I will not focus on the other option: a single device (or system) with a unique serial number that needs to be maintained and serviced as a single entity, for example a car or a computer. This is usually a product for mass consumption, not to be traced individually.

In order to support SLM at the end of the PLM lifecycle, we will see that a particular data model is required, one that has dependencies on the early design phases.

Let's go through the lifecycle stages and identify the different data types.

The concept / sales phase

(Diagram: the concept/sales phase information structure)

In the concept/sales phase, the company needs a template structure to collect and process all the information shared and managed during the customer interaction.

In the implementations that I guided, this was often a kind of folder structure grouping information into a system view (what do we need), a delivery view (how and when can we deliver), a services view (who does what) and a contractual view (cost, budget, time constraints). Most of these folders initially had relations to documents. However, the system view was often already based on typical system objects representing the major systems, subsystems and components with metadata.

In the diagram, the colors represent various data types that are often available as standard in a rich PLM data model. Although it can be simplified by going back to the old folder/document approach shared on a server, you will recognize the functional grouping of the information and its related documents, which can be further detailed into individual requirements if needed and affordable. In addition, a first conceptual system structure can already exist, with links to potential solutions (generic EBOMs) that have been developed before. A PLM system provides the ideal infrastructure to store and manage all this data in context.
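As a rough illustration of such a template structure, here is a sketch in Python with invented names (this is my simplification, not how any specific PLM system stores it):

```python
from dataclasses import dataclass, field

@dataclass
class View:
    """One functional grouping in the concept/sales template."""
    name: str
    documents: list = field(default_factory=list)  # related documents
    elements: list = field(default_factory=list)   # system objects with metadata

# The four views described above, instantiated per customer interaction.
# The system view already holds conceptual system objects that may link
# to previously developed solutions (generic EBOMs).
template = [
    View("system view",
         elements=[{"tag": "S1", "type": "subsystem", "ebom_ref": "EBOM-0815"}]),
    View("delivery view", documents=["delivery-plan.pdf"]),
    View("services view", documents=["resource-plan.xlsx"]),
    View("contractual view", documents=["budget.xlsx", "contract-terms.pdf"]),
]
```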

The Design phase

Before the design phase starts, there is an agreement on the solution to be delivered. In that situation, an as-sold system structure will be leading for the project delivery, and later this evolved structure will be the reference structure for the as-maintained and as-serviced environments.

A typical environment at this stage will support a work breakdown structure (WBS), a system breakdown structure (SBS) and a product breakdown structure (PBS). In cases where the location of the systems and subsystems is relevant for the solution, a geographical breakdown structure (GBS) can be used. This last method is often used in shipbuilding (sections/compartments) and plant design (areas/buildings/levels), and it is relevant for any company that needs to combine systems and equipment in shared locations.

(Diagram: the design phase breakdown structures)

The benefit of having the system breakdown structure is that it manages the relations between all systems and subsystems. When a subsystem is to be delivered by a supplier, this environment also supports the relationship with the supplier and the tracking of the delivery in relation to the full system/project.

Note: the system breakdown structure typically uses a hierarchical tag numbering system as the primary ID for system elements. In a PLM environment, the system breakdown elements should be data objects, providing the metadata describing the performance of the element, including the mandatory attributes that are required for exchange with MRO (Maintenance Repair Overhaul) systems.
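A sketch of what such a data object could look like (illustrative Python; the attribute set and tag values are my assumptions, not a standard):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SystemElement:
    """A system breakdown structure element as a data object."""
    tag: str                          # hierarchical tag number, the primary ID
    description: str
    parent_tag: Optional[str] = None  # position in the SBS hierarchy
    mro_attributes: dict = field(default_factory=dict)  # mandatory data for MRO exchange

# A small SBS fragment built on hierarchical tag numbers
sbs = [
    SystemElement("S1", "Processing line 1"),
    SystemElement("S1.2", "Mixing subsystem", parent_tag="S1"),
    SystemElement("S1.2-M2", "Drive motor", parent_tag="S1.2",
                  mro_attributes={"power_kW": 15, "maintenance_interval_h": 4000}),
]
```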

Working with a system breakdown structure is common in plant design and asset maintenance projects, and this approach will be very beneficial for companies delivering process lines, infrastructure projects and other solutions that need to be delivered as a collection of systems and equipment.

The delivery phase

During the delivery phase, the system breakdown structure supports the delivery of each component in detail. In the example below you can see the relation between the tag number, the generic part number and the serial number of a component.

The example below demonstrates the situation where two motors (same item – same datasheet) are implemented at two positions in a subsystem, each with a different tag number, a unique serial number and unique test certificates.

The benefit of a system breakdown structure here is that it supports the delivery of unique information per component, information that needs to be delivered and verified on-site. Each system element becomes traceable.

(Diagram: the delivery phase – tag numbers, part numbers and serial numbers)
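In code, the same situation could be sketched like this (illustrative only; all identifiers are invented):

```python
# Two motors: the same generic item (same datasheet) at two positions,
# each with its own tag number, serial number and test certificate.

item = {"part_number": "MOT-15kW", "datasheet": "DS-MOT-15kW.pdf"}

installed = [
    {"tag": "S1.2-M1", "item": item, "serial": "SN-88101", "certificate": "TC-88101.pdf"},
    {"tag": "S1.2-M2", "item": item, "serial": "SN-88102", "certificate": "TC-88102.pdf"},
]

# Each system element is traceable: tag -> generic part -> unique physical unit
for motor in installed:
    print(motor["tag"], "->", motor["item"]["part_number"], "->", motor["serial"])
```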

The maintenance phase

For the maintenance phase, the system breakdown structure (or a geographical breakdown structure) can be the placeholder to follow the evolution of an installation at a customer site.

Imagine that, in the previous example, the motor with tag number S1.2-M2 appears to be under-dimensioned and needs to be replaced by a more powerful one. The situation after implementing this change would look like the following picture:

(Diagram: the maintenance phase – the updated system breakdown structure)

Through the relationships with the BOM items (not all are shown in the diagram), there is the possibility to perform a where-used query and identify other customers with a similar motor at that system position. Perhaps a case for preventive maintenance?

Note: the diagram also demonstrates that the system breakdown structure elements should have their own lifecycle in order to support changes through time (and provide traceability).
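A minimal sketch combining both ideas, the where-used query and the element lifecycle (illustrative Python over invented installed-base data):

```python
# Each record links a tag at a customer site to the BOM item fulfilling it,
# with a revision number so the history of the element is preserved.
installed_base = [
    {"customer": "A", "tag": "S1.2-M2", "part_number": "MOT-15kW", "revision": 1},
    {"customer": "B", "tag": "S1.2-M2", "part_number": "MOT-15kW", "revision": 1},
    {"customer": "A", "tag": "S1.2-M2", "part_number": "MOT-22kW", "revision": 2},  # upgrade
]

def where_used(part_number: str) -> list:
    """Find customers whose latest revision of a tag still uses the given item."""
    latest = {}
    for rec in installed_base:  # keep only the highest revision per (customer, tag)
        key = (rec["customer"], rec["tag"])
        if key not in latest or rec["revision"] > latest[key]["revision"]:
            latest[key] = rec
    return [r for r in latest.values() if r["part_number"] == part_number]

# Customer B still runs the 15 kW motor: a candidate for preventive maintenance
print(where_used("MOT-15kW"))
```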

From my experience, this is a significant differentiator that PLM systems can bring compared to an MRO system. MRO and ERP (Enterprise Resource Planning) systems are designed to work with the latest, actual data only. Bringing in versioning of assets and traceability back to the initial design intent is almost impossible to achieve in these systems (unless you invest in heavy customization).

Conclusion

In this post and my previous post, I tried to explain the value of having at least a system breakdown structure as part of the overall PLM data model. This structure supports the early concept phase and connects data from the delivery phase to the maintenance phase.

Where my mission in the past 8 years was teaching non-classical PLM industries the benefits of PLM technology and best practices, in this situation you might say it is the other way around: classical BTO companies can learn from best practices in the process and oil & gas industries.

Note: Oleg just published a new blog post, PLM Best Practices and Henry Ford Mass Production System, in which he claims that PLM vendors, service partners and consultants like to sell best practices, yet during implementation discover that a lot of customization is needed to make them customer-specific; therefore, the age of best practices is over.

I agree with that conclusion, as I do not believe an out-of-the-box approach can lead a business change.

Still, best practices are needed to explain to a company what could be done, and to provide that context without starting from a blank sheet.

Therefore I have been sharing this best practice (for free).


Some weeks ago, a vivid discussion around the need for SLM (Service Lifecycle Management) besides PLM started in a PLM LinkedIn group. Of course, the discussion was already simmering in the background in other LinkedIn groups and forums, triggered by PTC's announcement to focus on SLM and their "observation" that they were probably the only PLM vendor to see that need. The Internet of Things is in one pen stroke connected with SLM. (Someone still using a pen?)

Of course it is not that simple, and I will try to bring some logic into the thought process, the potential hype and the various approaches you could take related to SLM.

SLM

First, SLM as a TLA (Three Letter Acronym). If you Google the meaning of SLM, the most common result is a greeting: often used on IRC, it is short for "salaam", or hello.

In the context of PLM it is a relatively new acronym, and the discussion on LinkedIn was also about whether we needed a new TLA at all. In general, what we try to achieve with SLM is the ability to trace and follow existing products at customers and to provide advanced or integrated services around them. In a basic manner, this could be providing documentation and service information (spare parts information). In an advanced manner, thinking about the Internet of Things, it could be products that connect to the home base and provide information for preventive maintenance, performance monitoring, enhancements, etc.

The topic is not new for companies around the world that have a "what can we do beyond PDM" vision. Already in 2001, I was involved in discussions with a large Swiss company providing solutions for the food processing industry. They wanted to leverage their internal customer-centric delivery process and extend it to customer support, using a web interface for relevant content: spare parts lists and documentation.

I am sure one or two readers of this blog post will remember "the spindle case" (the only part in the demo concept that had real data behind it at that time).

For many industries and businesses, customer services (and the margin on spare parts) are the main areas where they make a sustainable profit to secure the company's future. Most of the time, the initial sale and/or delivery of their products is done at a relatively low margin due to the competitive sales situation they are in while selling. And of course the sale itself is surrounded by uncertainty, which vendors have to accept.

If they asked for more certainty, it would require more detailed research, which is costly for them or considered a disadvantage by their potential customer. As competing vendors do not insist on further research, your company might be considered not "skilled" enough to estimate a product properly.

The above paragraph implicitly clarifies that we are mainly talking about companies whose primary process is Engineering to Order or Build to Order. For companies where the product is delivered through a Configure to Order or off-the-shelf approach, there is no need to work in a similar manner. Buying a computer or a car involves no sales engineering anymore. There is a clear understanding of the target price, and of course resellers will still focus on differentiating themselves by providing adjacent services.

So, for simplicity, I will focus on companies with a BTO or ETO primary business process.

SLM and ETO

In a real Engineering to Order process, the company that delivers the solution is traditionally not really involved in the follow-up of the lifecycle of the delivered products. The delivered product (small machinery, large machinery, or even an installation or plant) is handed over to the customer, and with the commissioning and handover a lot of information is transferred, based on the requirements of the customer.

Usually during this handover, a lot of the intelligence in the information is lost, as the customer does not have the same engineering environment and therefore requires the information in "neutral" formats: paper (less and less), PDFs (the majority) and (stripped) CAD data combined with Excels.

The information battle here between the ETO-delivery company and the customer is that the ETO-delivery company does not want to provide so much information that the customer becomes fully independent, as the service and spare parts business is the area where they make their margin. The customer, however, often wants ownership of the majority of the data, but is also aware that asking for too much means paying for it (as an engineering company will consider this extra work). So finding the right balance is the point.

However, the balance is changing, and this is where SLM comes in.

More and more we see that companies that in the past purchased an Engineering to Order product (or even a plant) are changing their business model towards using the product or running the plant, and ask the Engineering to Order company to provide the solution as a service – a kind of operational lease, including resources. This means solutions are no longer sold as a collection of products, but as an operational model (40,000 chickens/day, 1 million liters/day, 100,000 tons/year, etc.).

The owner of the equipment is no longer the owner, but pays for the service to perform the business. This is very similar to SaaS (Software as a Service) solutions: you do not own the software anymore; you pay for using it, no matter what kind of hardware/software architecture is behind the offering.

In that case, the Engineering to Order company can provide much more advanced services by extending its delivery process with capabilities for the operational phase of the product, as a more integrated approach eliminates the need for this disruptive handover process. Data does not need to be made "stupid" again; it is a continuous flow of information.

How this can be done, I will describe in an upcoming, more technical, blog post. This approach brings value to both the Engineering to Order company and the owner/operator of the product/plant.
As it is a continuous flow of information, I would like to conclude this topic by stating that, for Engineering to Order companies, there is no need to think about an extra SLM solution. You could simply label the last part of the PLM process the SLM domain.

As the customer data is already unique, it is just a normal continuation of the PLM process.

Two closing notes here:

  • I have already seen Engineering to Order companies that provide the whole maintenance and service of the delivered product/plant to their customer, integrated in their data environment (so it is happening!).
  • Engineering to Order companies are still discovering the advantages of PLM for gaining a cross-project, cross-discipline understanding and working methodology in their delivery process. Historically they thought in isolated projects, where the brains of experienced engineers were the connection between different projects. Now PLM practices are becoming the foundation for sharing and capitalizing on knowledge.

And with this last remark on capitalizing on knowledge, we move from the Engineering to Order industry to Build to Order.

SLM and BTO

In the Build to Order industry, the company that delivers a solution to its customer has tried, in a way, to standardize certain parts of the total solution. These parts can be standardized/configurable machinery or equipment or, one level higher, standardized systems and subsystems.

More configurable/modular standardization is what most companies are aiming for, as the more you modularize your solution components, the clearer it becomes that there are two different main processes inside the same organization:

  • One process, the main process for the company, fulfills the customer need. Here it is about combining existing solution components and engineering them together into a customer-specific solution. This could be a PLM delivery model like ETO.
  • One process to enhance, maintain and develop new solution components, which is a typical R&D process. Here I would state that PLM is indisputably needed, to bring new technology and solutions to the main business process.

So within a company there might be the need for two different PLM solution processes. From my observations in the past 10 years, companies invest in PDM for their R&D process and try to do a little PLM on top of this PDM implementation for their delivery process. This basic PLM process usually focuses again on the core engineering part of the delivery, starting somewhere from the specifications and running until the delivery of the solution.


So "full" PLM is very rare to find. The front end of the delivery process, systems engineering, is often considered complex, and often the customer does not want to engage fully in the front-end definition of the solution.

“You are the experts, you know best what we want” is often heard.

Ironically, an analogous situation is often seen in PLM implementations at risk: the company expects the PLM implementer to know what they want, without being explicit or understanding what is needed themselves.

To extend the discussion of PLM and SLM, I would first like to take the question to a different dimension:

Do we need two PLM implementations within one company ?

One for R&D and one for the delivery process ?

Reasons to say No are:

  • Simplicity – it is easier to have one system instead of two.
  • The amount of R&D activity is so low compared to the delivery process that the main PLM system can support it.

Reasons to say Yes are:

  • The R&D process is extremely important, as is the delivery process.
  • The R&D process is extremely important and we have a large customer base to serve.

Reading these two options brings some clarity.

If the R&D process is a significant differentiator and you are aiming to serve many customers, it makes sense to have two PLM implementations.

Still, the two PLM implementations could be based on the same PLM infrastructure, and I would challenge readers of this post to explain why it should be a single instance of a PLM infrastructure.

Why two PLM systems?

  • I believe that, given the potentially huge amount of data, a single instance would create a data monster, whereas connected systems (using big data) are the future.
  • In other concepts there are an enterprise PLM and local PDM systems, exactly because there is no single system that can do it all in an efficient manner.

Still, I haven't talked about SLM, which could be part of the delivery process, where you manage customer-specific data. For that (more detail in my next blog post), there are some data model constraints for the PLM system.

I would state that you can only use a separate SLM system if you are not interested in data from the early phases of the delivery process. In the early phase, you use conceptual structures to define the product/installation/plant. These conceptual structures are, in my opinion, the connection between the concept phase and the service phase. Usually tag numbers are used to describe the functional position of a product or system, and these are the identifiers referenced by service engineers when starting a service operation.

Only when this view or need does not exist can I imagine SLM being needed, where, potentially based on serial numbers, services are tracked and monitored and fed back to the R&D environment. The R&D environment would then publish product data into the SLM system.

You might be confused at this point, as I did not bring the various information structures into this post to clarify the data flow for the delivery process. This I will do in my upcoming post.

Why not CTO and SLM?

I haven't discussed Configure to Order (CTO) here, as I consider CTO a logistical process, which is logically addressed by the ERP system. The definitions of the configurations and their related content will probably be delivered through a PDM/PLM system, so the R&D type of PLM system will exist in the company.

SLM would most logically be performed in this situation by the ERP system, as there is no PLM delivery layer. Having said this, a new religious discussion might come up: is SLM a separate discipline or is it part of the ERP system?

This topic is not a discussion for the big ERP vendors – they do it all :-) – but it is up to your company to decide whether a Swiss army knife is the right tool to work with in your organization.

Conclusion

For the moment I would like to conclude:

  • PLM and SLM –> No (only Yes in isolated cases)
  • PLM and PLM –> Yes (as SLM requires the front end of PLM too)

Do we need SLM? Perhaps yes, as a way to describe a functional domain; no, when we are talking about yet another silo system. I believe the future lies in connectivity of data, and in the long term PLM, ERP and SLM will be functional domains describing how connected data serves particular needs.

Looking forward to your thoughts.
