Last week I attended the PI Apparel conference in London. It was the second time this event was organized, and approximately 100 participants were there for two full days of presentations and arranged networking meetings. Last year I was extremely excited about this event, as the audience was different compared to classical PLM events and much more business focused.

Read my review from last year here: The weekend after PI Apparel 2013

This year I had the feeling that the audience was somewhat smaller, missing some of the US representatives, and perhaps there was a slightly more visible influence from the sponsoring vendors. Still, it was an enjoyable event, and hopefully next year, when it will be hosted in New York, it will be as active as last year.

Here are some of my observations.

Again the event had several tracks in parallel besides the keynotes, and I look forward in the upcoming month to watching the sessions I could not attend. Obviously, where possible, I followed the PLM-focused sessions.


The first keynote came from Micaela le Divelec Lemmi, Executive Vice President and Chief Corporate Operations Officer of Gucci. She talked us through the areas she supervises and gave some great insights. She talked about how Gucci addresses sustainability through risk and cost control: which raw materials to use, how to ensure the brand's reputation is not at risk, price volatility and the war on talent. As Gucci is a brand in the high-end price segment, image and reputation are critical, and they have the margins to ensure they are managed. Micaela spoke about the short-term financial goals that a company like Gucci has towards its investors. The topics she mentioned (I did not write them down as I was tweeting when I heard them) were certainly worthwhile to consider and discuss in detail with a PLM consultant.


Micaela further described Gucci's corporate social responsibility program, with a focus on taking care of people, environment and culture. Good to learn that humane working conditions and rights are a priority even in their supply chain, although it should be noted that 75% of Gucci's supply chain is in Italy. Gucci is one of the few brands that still carries the “Made in Italy” label.

My conclusion was that Micaela did an excellent PR job for Gucci, which you would expect for a brand with such a reputation. Later during the conference we discussed whether other brands with less exclusivity, operating more in the mass-consumer domain, would be able to come even close to such programs.


Next, Göktug and Hakan gave us their insights from deploying their first PLM system at the AYDINLI group.

The company is successful in manufacturing and selling licensed products from Pierre Cardin, Cacharel and US Polo Association mainly outside the US and Western Europe.

Their primary focus was to provide access to the most accurate and most up-to-date information from one source. In parallel, standardization of codes and tech packs was a driver. Through standardization, quality and (re)use could be improved, and people would better understand the details. Additional goals were typical PLM goals: following the product development stages along the timeline, notifying relevant users about design changes, working with libraries and reuse, and integrating with SAP.

Interestingly, Hakan mentioned that in their case SAP did not recommend using its own system for the PLM-related part, due to a lack of knowledge of the apparel industry. A wise decision, which would deserve follow-up in other industries.

In general, the PLM implementation described by Göktug and Hakan was well phased, with a top-down push to ensure there was no escape from making the change. As with most PLM implementations in apparel, they went live with their first phase rather fast, as the complex CAD integrations of classical PLM implementations were not needed here.


Next I attended the Infor session with the title Work the Way you Live: PLM built for the User. A smooth marketing session with a function/feature demo demonstrating the flexibility and configuration capabilities of the interface. Ease of use is crucial in the apparel industry, where Excel is still the biggest competitor. While Excel might satisfy the needs of the individual, it lacks the integration and collaboration a PLM system can offer.


More interesting was the next session I attended, from Marcel Oosthuis, who, as Process Re-Engineering Director (read: PLM leader), was responsible for the PLM implementation at Tommy Hilfiger. Marcel described how they had implemented PLM, and it was an excellent story (perhaps too good to be true).

I believe larger companies with the right focus and investment in PLM resources can achieve these kinds of results. The target for Tommy Hilfiger's PLM implementation was beyond 1000 users; therefore, a serious implementation.

Upfront, the team first defined what they expected from the PLM system to be selected (excellent!). As the fashion industry is fast, demanding and changing all the time, the PLM system needs to be Swift, Flexible and Prepared for Change. Not a classical PLM requirement.

In addition, they were looking for a highly configurable system providing best practices, and a vendor with a roadmap they could influence. Here I got a little more worried, as highly configurable systems and best practices do not always match the prepared-for-change approach. A company might be tempted to automate the way it should work based on the past (best practices from the past).

It was good to hear that Marcel did not have to go through the classical ROI approach for the system. His statement, which I fully endorse, was that it is about the capability to implement new and better processes. These are often not comparable with the past (and nobody measured the past).

Marcel described how the PLM team (eight people plus three external from the PLM vendor) made sure that the implementation was done with the involvement of the end users. End-user adoption was crucial, as was key-user involvement when building and configuring the system.

It was one of the few PLM stories where I heard how all levels of the organization were connected and involved.


Next, Sue Butler, director at Kurt Salmon, described how to maximize the ROI from your PLM investment. It is clear that many PLM consultants are aligned, and Sue brought up all the relevant points and angles you need to look at for a successful PLM implementation.

Main points: PLM is about changing the organization and processes, not about implementing a tool. She made the point that piloting the software is necessary as part of the learning and validation process. I agree, under the condition that it is an agile pilot that does not take months to define and perform. Otherwise, you might already be locked into the tool vision too much – focus on the new processes you want to achieve.

Moreover, because Sue was talking about maximizing the ROI from a PLM implementation, topics such as focusing on business areas that support evolving business processes and measuring (make sure you have performance metrics) came up.


The next session, Staying Ahead of the Curve through PLM Roadmap Reinvention, conducted by Austin Mallis, VP Operations at Fashion Avenue Sweater Knits, beautifully complemented the previous sessions related to PLM.

Austin talked nicely about setting the right expectations for the future (There is no perfect solution / Success does not mean stop / Keeping the PLM vision / No True End). In addition, he described the human side of the implementation: how to get everyone on board (if possible), while admitting you cannot get everyone on board with the new way of working.


Next in line was my presentation, with potentially the longest title: “How to transform your Business to ensure you Benefit from the Value PLM can deliver”.

Luckily, the speakers before me that day had already addressed many of the relevant topics, so I could focus on three main thoughts completing the story:

1. Who decides on PLM and Why?

I published the results from a small survey I did a month ago via my blog (A quick PLM survey). See the main results below.

[Survey results chart]

It was interesting to observe that management and the users in the field together form the majority demanding PLM. Consultants have some influence, and PLM vendors even less. The big challenge for a company is that management and consultants often talk about PLM from a strategic point of view, whereas the PLM vendor and the users in the field are more focused on the tool(s).

From the expectations, you can see that the majority of PLM implementations are about improving collaboration, followed by time to market, increasing quality, and centralizing and managing all related information.

2. Sharing data instead of owning data

You might have read about this several times in my blog: the trend that we are moving to platforms with connected data instead of file repositories. This should have an impact on your future PLM decisions.

3. Choosing the right people

The third and final thought was about choosing the right people and understanding the blockers. I elaborated on that topic before in my recent blog post: PLM and Blockers.

My conclusions for the day were:

A successful PLM implementation requires connected communication and explanation between all these levels, to get a company aligned and its vision anchored before even starting to implement a system (with the best partner).


The day was closed by the final keynote, from Lauren Bowker, heading T H E U N S E E N. She and her team are exploring combinations of chemistry and materials to create new fashion artifacts: clothes and materials that change color based on airflow, air pollution or brain patterns. New and inspiring directions for fashion lovers.

Have a look here: http://seetheunseen.co.uk/


The morning started with Suzanne Lee, heading BioCouture, who is working on various innovative methods to create materials for the apparel industry using all kinds of living micro-organisms, like bacteria, fungi and algae, and materials like cellulose, chitin and protein fibers, which can all provide new possibilities for sustainability, comfort, design, etc. Suzanne's research is about exploring these directions, perhaps shaping some new trends 5 to 10 years ahead. Have a look into the future here:


Renate Eder took us on the journey of visualization within Adidas, with her session: Utilizing Virtualization to Create and Sell Products in a Sustainable Manner.

It was interesting to learn that ten years ago she started the process of getting more 3D models into the sales catalogue. Where classical manufacturing companies nowadays start from a 3D design, at Adidas 3D starts at the end of the sales cycle. Logical, if you see the importance and value 3D can have for mass-market products.

Adidas was able to get 16,000 products into their 3D catalogue, thanks to the work of 60 of their key suppliers, who were fully integrated into the catalogue process. The benefit of this 3D catalogue is that their customers, often the large stores, need fewer samples, and the savings here are significant (plus a digital process instead of shipping goods).

An interesting discussion during the Q&A was that the virtual product might even look more perfect than the real product, demonstrating how lifelike virtual products can be.

And now Adidas is working further backwards, from production patterns (using 3D) to, eventually, 3D design. Although a virtual 3D product cannot 100% replace the fit and feel of the material, Renate believes that introducing 3D during design can also reduce the work done during pilots.


Finally, for those who stayed till the end, there was something entirely different: Di Mainstone elaborating on her project, Merging Architecture & the Body in Transforming the Brooklyn Bridge into a Playable Harp. If you want something entirely different, watch here:

Conclusion

The apparel industry remains an exciting industry to follow. Some of the concepts – being data-centric, insanely flexible, continuously changing and with rapid time to market – are crucial here.

This might guide the development of PLM vendors for the future, including offerings based on cloud technology.

On the other side, the PLM market in apparel is still very basic and learning; see this card that I picked up from one of the vendors. The focus is on features and functions, not touching the value (yet).

[Vendor card image]


Friends, this is the first evening in two weeks that there is no soccer on television, so I have time to write something.

Currently, I am preparing my session for PI Apparel 2014 in London on 15/16 July. Last year's PI Apparel was a discovery for me, as the audience was so different compared to classical PLM conferences. Of course, the products might not be as complex, but the time-to-market pressure, and therefore the need to work as fast and concurrently as possible, is a huge differentiator. See my post from last year's conference here: The weekend after PI Apparel 2013

In a way, PLM for apparel companies is more data-centric than some of the original industries where PLM was born. In the traditional way, file management and document sharing were the initial drivers.

At this year's conference, I will talk about the change in the way people work that comes with a PLM implementation. I will share the full story plus my observations in my next post at the end of July.

Before that, I have a request for all readers of this blog who are working for a company that has implemented or is starting to implement PLM: answer the two questions in the survey below. The answers will help me confirm my prejudgments or change my mind.

So if you have some time between the soccer matches, please respond to the survey below if you qualify:

 

Click here to answer a quick survey before PI Apparel

 

Thanks and enjoy the upcoming matches


Two weeks ago I attended the Nobletek PLM forum in Belgium, where a group of experts, managers and users discussed topics related to my favorite theme: “Is PLM changing? “

Dick Terleth (ADSE) led a discussion with the title “PLM and Configuration Management as a proper profession” or “How can the little man grow?”. The context of the discussion was the topic: “How is it possible that the benefits of PLM (and Configuration Management) are not understood at C-level?” or, in other words: “Why is the value of Configuration Management and PLM not obvious?”

In my previous post, PLM is doomed unless …., I quoted Ed Lopategui (www.eng-eng.com), who commented that being a PLM champion (or a Configuration Management expert as Dick Terleth would add) is bad for your career. Dick Terleth asked the same question, showing pictures of the self-assured accountant and the Configuration Management or PLM professional. (Thanks Dick for the pictures). Which job would you prefer?


The PLM ROI discussion

A first attempt to understand the difference could be related to the ROI discussion, which seems to apply only to PLM. Apparently ERP and financial management systems are a must for companies. No ROI discussion here. Persons who can control/report the numbers seem to have the company under control. For the CEO and CFO, the value of PLM is often unclear. And to make it worse, PLM vendors and implementers are fighting for their unique definition of PLM, so we cannot blame companies for being confused. This makes it clear that if you haven't invested significant time in understanding PLM, it will be hard to see the big picture. And at C-level, people do not invest significant time in understanding the topic. It is the C-level executive's education, background or work experience that makes him/her decide.

So if the C-level is not educated on PLM, somebody has to sell the value to them. Oleg Shilovitsky wrote about it recently in his post Why is it hard to sell PLM ROI and another respected blogger, Joe Barkai, sees the sun come up behind the cloud, in his latest post PLM Service Providers Ready To Deliver Greater Value. If you follow the posts of independent PLM bloggers (although who is 100 % independent), you will see a common understanding that implementing PLM currently requires a business transformation as old processes were not designed for a modern infrastructure and digital capabilities.

PLM is about (changing) business processes

Back to the Nobletek PLM forum. Douglas Noordhoorn, the moderator of the forum, challenged the audience, stating that PLM has always been there (or not there, if you haven't discovered it). It is all about managing the product development processes in a secure way. Not talking about “Best Practices” but “Good Practices”. Those who had a proper education in the aerospace industry learned that good processes are crucial to delivering planes that can fly and are reliable.

Of course, the aerospace industry is not the same as other industries. However, more and more other industries in my network, like nuclear new build, the construction industry or other Engineering, Procurement and Construction companies, want to learn from aerospace and automotive good practices. They realize they are losing market share because the cost of failure, combined with relatively high labor costs, makes them too expensive. But where do they get their proper good-practices education?

The PLM professional?

And this was an interesting point coming up from the Nobletek forum. There is no proper, product-agnostic education for PLM (anymore). If you study logistics, you will learn a lot about various processes and how they can be optimized for a certain scenario. When you study engineering, there is a lot of focus on engineering disciplines and methods. But there is no time to educate engineers in depth to understand the whole product development process and how to control it. Sometimes I give a guest lecture to engineering classes. It is never an important part of the education.

To become a PLM professional

For those who never had any education in standard engineering processes, there is Frank Watts' engineering control book, which would probably be a good base. But it is not only the PLM professional who should be aware of the good practices. Moreover, all companies manufacturing products, plants or buildings should learn these basics. As a side step, it would also make the discussion around BIM clearer. At this time, manufacturing companies keep discovering their good practices the hard way.

And when this education exists, companies will be aware that it is not only about the tools, but about the way information flows through the organization. There is even a chance that somewhere at C-level someone has been educated and understands the value. For ERP, everyone agrees. For PLM, it currently remains a labyrinth of processes designed by companies learning on the job, with vendors and implementers pushing what they have learned. Engineering is often considered a hard-to-manage discipline. As an SAP country manager once said to me: “Engineers are actually resources that do not want to be managed, but we will get them…”

And then the future ……

I support the demand for better education in engineering processes, especially for industries outside aerospace and automotive. I doubt it will have a significant impact, although it might create the visibility and understanding for PLM at C-level. No need anymore for the lone ranger who fights for PLM. Companies will have better-educated people who understand the need for the good practices that exist. These good practices will be the base for companies when talking with PLM vendors and implementers. Instead of vendors and implementers pushing their vision, you can articulate and follow your own vision.

However, we need a new standard book too. We are currently in the middle of a big change. Thanks to modern technology and connectivity the world is changing. I wrote and spoke about it in: Did you notice PLM is changing?

This is a change of generations and concepts that was not foreseen by Frank Watts and others. What will be the new standard for data-centric companies, instead of document-based control?

The digital revolution is here (Industry 4.0), and here (digital revolution), and here (the third industrial revolution).

 

This awareness needs to become visible at C-level.
Who will educate them ??

 


Now back to soccer – 4 years ago Spain-The Netherlands was the last match – the final. Now it is the first match for them – will the Dutch change the game ?

Human beings are strange creatures. We think we make decisions based on logic, and we think we act based on logic. In reality, however, we do not like to change if it does not feel good, and we are lazy in changing our habits.

Disclaimer: It is a generalization which is valid for 99 % of the population. So if you feel offended by the previous statement, be happy as you are one of the happy few.

Our inability to change can be seen in the economy (only the happy few share). We see it in relation to global climate change. We see it in territorial fights all around the world.

Owning instead of sharing?

The cartoon below gives an interesting insight into how personal interests are perceived as more important than the general interest.

[Cartoon]

It is our brain !

More and more, I realize that the success of PLM is also related to this human behavior: we like to own and find it difficult to share. PLM is primarily about sharing data through all stages of the lifecycle. One valid reason why sharing is rare is that current PLM systems and their infrastructures are still too complex to deliver shared information with ease. However, the potential benefits are clear when a company is able to transform its business into a sharing model and therefore react to and anticipate the outside world much faster.

But sharing is not in our genes, as:

  • In current business knowledge is power. Companies fight for their IP; individuals fight for their job security by keeping some specific IP to themselves.
  • As a biological organism, composed of a collection of cells, we are focused on survival of our genes. Own body/family first is our biological message.

Breaking these habits is difficult, and I will give some examples that I noticed over the past few weeks. Of course, it is not a complete surprise for readers of my blog, as a large number of my recent posts are related to the complexity of change. Some are related to human behavior:

August 2012: Our brain blocks PLM acceptance
April 2014: PLM and Blockers

Ed Lopategui, an interesting PLM blogger (see http://eng-eng.com), wrote a long comment on my PLM and Blockers post. The (long) quote below describes exactly what makes PLM difficult to implement in a company full of blockers:

“I also know that I was focused on doing the right thing – even if cost me my position; and there were many blockers who plotted exactly that. I wore that determination as a sort of self-imposed diplomatic immunity and would use it to protect my team and concentrate any wrath on just myself. My partner in that venture, the chief IT architect admitted on several occasions that we wouldn’t have been successful if I had actually cared what happened to my position – since I had to throw myself and the project in front of so many trains. I owe him for believing in me.

But there was a balance. I could not allow myself to reach a point of arrogance; I would reserve enough empathy for the blockers to listen at just the right moments, and win them over. I spent more time in the trenches than most would reasonably allow. It was a ridiculously hard thing and was not without an intellectual and emotional cost.

In that crucible, I realized that finding people with such perspective (putting the ideal above their own position) within each corporation is *exceptionally* rare. People naturally don’t like to jump in front of trains. It can be career-limiting. That’s kind of a problem, don’t you think? It’s a limiting factor without a doubt, and not one that can be fulfilled with consultants alone. You often need someone with internal street cred and long-earned reputation to push through the tough parts”

Ed concludes that it is exceptionally rare to find people putting the ideal above their own position. Again referring to the opening statement that only a (happy) few are advocates for change

Now let's look at some facts about why it is exceptionally rare, so we feel less guilty.

On Intelligence

Last month I read the book On Intelligence by Jeff Hawkins, well written with Sandra Blakeslee. (Thanks, Joost Schut from KE-Works, for pointing me to this book.)

Although it was not the easiest book to read during a holiday, it was well written considering the complexity of the topic discussed. Jeff describes how the information architecture of the brain could work based on the neocortex layering.

In his model, he describes how the brain processes information from our senses, first in a specific manner but then more and more in an invariant way. You have to read the book to get the full meaning of this model. The eye-opener for me was that Jeff describes the brain as a prediction engine. All the time, the brain anticipates what is going to happen, based on years of learning. That is why we need to learn and practice, building and enriching this information model.

And the more specialized you are in a particular topic (it can be knowledge, but it can also be a motor skill), the deeper in the neocortex this pattern is anchored. This makes it hard to change (bad) practices.

The book goes much further, and I was reading it more in the context of how artificial or brain-like intelligence could support the boring PLM activities. I got nice insights from it. However, the main side observation was: it is hard to change our patterns. So if you are not aware of it, your subconscious will always find reasons to reject a change. Follow the predictions!

Thinking Fast and Slow

And this is exactly the connection with another book I read before: Thinking, Fast and Slow by Daniel Kahneman. Daniel explains that our brain runs its activities on two systems:

System 1: makes fast and automatic decisions based on stereotypes and emotions. System 1 is what we are using most of the time, running often in subconscious mode. It does not cost us much energy to run in this mode.

System 2: takes more energy and time; therefore, it is slow and pushes us to be conscious and alert. Still system 2 can be influenced by various external, subconscious factors.

Thinking, Fast and Slow nicely complements On Intelligence: system 1 as described by Daniel Kahneman is similar to what Jeff Hawkins describes as the prediction engine. It runs in subconscious mode, with optimal energy consumption, allowing us to survive most of the time.

Fast thinking leads to boiling frogs

And this links again to the boiling frog syndrome. If you are not familiar with the term, follow the link. In general, it means that people (and businesses) do not react to (life-threatening) outside change when it happens slowly, but would react immediately if they were confronted with the end result (no more business / no longer competitive).

Conclusion: our brain by default wants to keep business in predictive mode, so implementing a business change is challenging, as all changes are painful and against our subconscious system.

So PLM is doomed, unless we change our brain behavior ?

The fact that we no longer live in caves illustrates that there have always been those happy few who took a risk and a next step into the future by questioning and changing comfortable habits. Daniel Kahneman's system 2 and also Jeff Hawkins speak of the energy it takes to change habits and to learn new predictive mechanisms. But it can be done.

I see two major trends that will force the classical PLM to change:

  • The amount of connected data is becoming so huge that it no longer makes sense to store and structure the information in a single system. The time required to structure data does not deliver enough ROI in a fast-moving society. The old “single system that stores all” concept is dying.
  • The newer generations (generation Y and beyond) grew up with the notion that it is impossible to learn, capture and own all specific information. They developed different skills to interpret data available from various sources, not necessarily owning and managing it all.

These two trends lead to the point where it becomes clear that thinking in single systems is becoming obsolete. The future will be about connectivity and interpretation of connected data, used by apps running on a platform. The openness of a platform towards other platforms is crucial and will be the weakest link.

Conclusion:

The PLM vision is not doomed, and with a new generation of knowledge workers the “brain change” has started. The challenge is to implement the vision across systems and silos in an organization. For that, we need to be aware that it can be done and to find the “happy few” in your company to enable it.

 


What do you think???

The past month I had several discussions related to the complexity of PLM. Why is PLM perceived as complex? Why is it hard to sell PLM internally in an organization? Or, to phrase it differently: “What makes PLM so difficult for normal human beings, as conceptually it is not so complex?”

So what makes it complex? What's behind PLM?

The main concept behind PLM is that people share data, be it around a project, a product or a plant, through the whole lifecycle. In particular during the early lifecycle phases, there is a lot of information that is not yet 100 percent mature. You could decide to wait until everything is mature before sharing it with others (the classical sequential manner); however, the chance of doing it right the first time is low. Several iterations between disciplines will be required before the data is approved. The more sequentially a company works, the higher the cost of changes and the longer the time to market. Due to the rigidity of this sequential approach, it becomes difficult to respond rapidly to customer or market demands. Therefore, in theory (and it is not a PLM theory), concurrent engineering should reduce the number of iterations and the total time to market by working in parallel on data that is not yet approved.

PLM goes further; it is also about sharing data. As it originally started in the early phases of the lifecycle, the concept of PLM was often considered something related to engineering. And to be fair, most of the PLM (CAD-related) vendors have a strong focus on the early stages of the lifecycle and reinforce this idea. However, sharing can go much further, e.g. early involvement of suppliers (still engineering) or support for after-sales/services (the new acronym SLM). In my recent blog posts, I discussed the concepts of SLM and the required data model for it.

Anticipated sharing

The complexity lies in the word “sharing”. What does sharing mean for an organization where, historically, every person was rewarded for the knowledge he/she owned, instead of for the knowledge he/she made available and shared? Many so-called PLM implementations have failed to reach the sharing target, as the implementation focus was on storing data per discipline and not necessarily on storing data so that it becomes shareable and usable by others. This is a huge difference.

Some famous (ERP) vendors claim that if you store everything in their system, you have a “single version of the truth”. Sounds attractive. My garbage bin at home is also a single place where everything ends up, but a garbage bin has not been designed for sharing, as another person has no clue about, and no time to analyze, what's inside. Even data in the same system can stay hidden from others if the way to find it has not been anticipated.

Data sharing instead of document deliverables

The complexity of PLM is that data should be created and shared in a manner that is not necessarily the most efficient for a single purpose, but that, with some extra effort, makes the data usable and searchable for others. A typical example is drawing and document management, where the whole process for a person is focused on delivering a specific document. OK for that purpose, but this document on its own becomes a legacy in the long term, as you need to know (or remember) what is inside the document.

A logical implication of data sharing is that, instead of managing documents, organizations start to collect and share data elements (a 3D model, functional properties, requirements, physical properties, logistical properties, etc.). Data can easily be connected and restructured through reports and dashboards, therefore providing specific views for different roles in the organization. Sharing becomes possible, and it can be online. Nobody needs to consolidate and extract data from documents (Excel files?) anymore.
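To make this tangible, below is a minimal sketch in Python (all class names, attributes and values are hypothetical illustrations, not taken from any particular PLM system) of product data stored as connected data elements instead of being locked inside a document, so that each role can assemble its own view:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """One shareable piece of product data (instead of a chapter in a document)."""
    name: str
    kind: str                                   # e.g. "requirement", "3d-model", "property"
    attributes: dict = field(default_factory=dict)
    links: list = field(default_factory=list)   # names of related elements

# A tiny connected data set instead of one monolithic specification document
elements = [
    DataElement("REQ-001", "requirement", {"text": "Max weight 2.5 kg"}),
    DataElement("MOTOR-3D", "3d-model", {"format": "STEP"}, links=["REQ-001"]),
    DataElement("MOTOR-MASS", "property", {"value": 2.3, "unit": "kg"},
                links=["MOTOR-3D", "REQ-001"]),
]

def view_for(kinds, elements):
    """Build a role-specific view by selecting only the relevant data kinds."""
    return [e.name for e in elements if e.kind in kinds]

# An engineering dashboard and a compliance dashboard, from the same shared data
print(view_for({"3d-model", "property"}, elements))     # ['MOTOR-3D', 'MOTOR-MASS']
print(view_for({"requirement", "property"}, elements))  # ['REQ-001', 'MOTOR-MASS']
```

The point of the sketch is not the code itself, but that nobody has to open and interpret a document to build these views; they are just different selections of the same connected data.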

This does not fit older generations and departmentally managed business units that are rewarded only on their individual efficiency. Have a look at this LinkedIn discussion, where the two extremes are visible.

Joe stating:

“The sad thing about PLM is that only PLM experts can understand it! It seems to be a very tight knit club with very little influence from any outside sources.
I think PLM should be dumped. It seems to me that computerizing engineering documentation is relatively easy process. I really think it has been over complicated. Of course we need to get the CAD vendors out of the way. Yes it was an obvious solution, but if anyone took the time to look down the road they would see that they were destroying a well established standard that were so cost effective and simple. But it seems that there is no money in simple”

And at the other side Kais, stating:

“If we want to be able to use state-of-the art technology to support the whole enterprise, and not just engineering, and through-life; then product information, in its totality, must be readily accessible and usable at all times and not locked in any perishable CAD, ERP or other systems. The Data Centric Approach that we introduced in the Datamation PLM Model is built on these concepts”

Readers of my blog will understand I am very much aligned with Kais, and that PLM people have a hard time convincing Joe of the benefits of PLM (I did not try).

Making the change happen

Besides this LinkedIn discussion, I had discussions with several companies where my audience understood the data-centric approach. It was nice to be in the room together, sharing ideas about what would be possible. However, the outside world is hard to convince, and here it is about change management.

I read an interesting article in IndustryWeek from John Dyer with the title: What Motivates Blockers to Resist Change?

John describes the various types of blockers, and reading the article with my PLM-twisted brain, I understood again that this is one of the reasons PLM is perceived as complex – you need to change, and there are blockers:

Blocker (noun): someone who purposefully opposes any change (improvement) to a process for personal reasons

“Blockers” can occupy any position in a company. They can be any age, gender, education level or pay rate. We tend to think of blockers as older, more experienced workers who have been with the company for a long time, and they don’t want to consider any other way to do things. While that may be true in some cases, don’t be surprised to find blockers who are young, well-educated and fairly new to the company.

The problem with blockers

The combination of business change and the existence of blockers is one of the biggest risks for companies going through a business transformation. By the way, this is not only related to PLM; it applies to any required change in business.

Some examples:

A company I have been working with was eager to study its path to the future, which required more global collaboration, a competitive business model and a more customer-centric approach. After a long evaluation phase, they decided they needed PLM, which was new for most of the people in the company. Although the project team was enthusiastic, they were not able to get past the blockers of the change. Ironically enough, they lost a significant part of their business to companies that had implemented PLM. Defending the past is not a guarantee for the future.

A second example is Nokia. Nokia was famous for the way it had transformed its business in the past. How come they did not see smartphones and touch screens coming? Apparently, based on several articles published recently, it was Nokia's internal culture and the superior feeling of dominating the market that made it impossible to switch. The technology was known, the concepts were there; however, the (middle) management was full of blockers.

Two examples where blockers had a huge impact on the company.

Conclusion:

Staying in business and remaining competitive is crucial for companies. In particular, the changes currently happening require people to work differently in order to stay competitive. Documents will become reports generated from data. People handling and collecting documents to generate new documents will become obsolete, as a modern data-centric approach makes them redundant. Keeping the old processes might destroy a company. This should convince the blockers to give up.


In my previous post, I wrote about the different ways you could look at Service Lifecycle Management (SLM), which, I believe, should be part of the full PLM vision. The fact that this does not happen is probably because companies buy applications to solve issues instead of implementing a consistent company-wide vision (when and where to start is the challenge). Oleg Shilovitsky just referred once more to this phenomenon – Why PLM is stuck in PDM.

I see PLM as the enterprise information backbone for product information. Here I will discuss the logical flow of data that might be required in a PLM data model to support SLM. Of course, all of this should be interpreted in the context of the kind of business your company is in.

This post is probably not the easiest to digest, as it assumes you are somewhat aware of and familiar with the issues relevant to the ETO (Engineering To Order) / EPC (Engineering, Procurement and Construction) / BTO (Build To Order) business.

A collection of systems or a single device

The first significant differentiation I want to make is between managing an installation and managing a single device; I will focus only on installations.

An installation can be a collection of systems, subsystems, equipment and/or components, typically implemented by companies that deliver end-to-end solutions to their customers. A system can be an oil rig, a processing production line (food, packaging, …) or a plant (processing chemicals, nuclear materials), where maintenance and service can be performed on individual components, providing full traceability.

Most of the time, a customer-specific solution is delivered, either directly or through installation/construction partners. This is the domain I will focus on.

I will not focus on the other option: a single device (or system) with a unique serial number that needs to be maintained and serviced as a single entity, for example a car or a computer. Usually this is a product for mass consumption, not traced individually.

In order to support SLM at the end of the PLM lifecycle, we will see that a particular data model is required, one with dependencies on the early design phases.

Let´s go through the lifecycle stages and identify the different data types.

The concept / sales phase

[Concept/sales phase diagram]

In the concept/sales phase, the company needs a template structure to collect and process all the information shared and managed during the customer interaction.

In the implementations I guided, this was often a kind of folder structure grouping information into a system view (what do we need), a delivery view (how and when can we deliver), a services view (who does what) and a contractual view (cost, budget, time constraints). Initially, most of these folders had relations to documents. However, the system view was often already based on typical system objects representing the major systems, subsystems and components, with metadata.

In the diagram, the colors represent various data types that are often available as standard in a rich PLM data model. Although it could be simplified by going back to the old folder/document approach shared on a server, you will recognize the functional grouping of the information and its related documents, which can be further detailed into individual requirements if needed and affordable. In addition, a first conceptual system structure can already exist, with links to potential solutions (generic EBOMs) that have been developed before. A PLM system provides the ideal infrastructure to store and manage all this data in context of each other.
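As an illustration only (the view names below mirror the grouping described above; everything else is a hypothetical simplification, not a real template), such a concept/sales template could be sketched as a typed structure rather than a loose folder tree:

```python
# Hypothetical sketch of a concept/sales template structure.
# Each view groups references to documents or to system objects with metadata.
concept_template = {
    "system_view": [                                   # what do we need
        {"object": "Packaging line", "type": "system",
         "generic_ebom": "EBOM-PACK-STD-02"},          # link to an existing solution
        {"object": "Filling unit", "type": "subsystem",
         "generic_ebom": None},                        # still to be engineered
    ],
    "delivery_view": ["delivery-schedule-draft.pdf"],  # how and when can we deliver
    "services_view": ["responsibility-matrix.xlsx"],   # who does what
    "contractual_view": ["budget-estimate.xlsx"],      # cost, budget, time constraints
}

# Because the system view is data (not just files), it can already be queried:
reusable = [o["object"] for o in concept_template["system_view"] if o["generic_ebom"]]
print(reusable)   # ['Packaging line'] - candidates reusing a generic EBOM
```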

The Design phase

Before the design phase starts, there is an agreement on the solution to be delivered. In that situation, an as-sold system structure will be leading for the project delivery, and later this evolving structure will be the reference structure for the as-maintained and as-serviced environment.

A typical environment at this stage will support a work breakdown structure (WBS), a system breakdown structure (SBS) and a product breakdown structure (PBS). In cases where the location of the systems and subsystems are relevant for the solution, a geographical breakdown structure (GBS) can be used. This last method is often used in shipbuilding (sections / compartments) and plant design (areas / buildings / levels) and is relevant for any company that needs to combine systems and equipment in shared locations.

[Design phase diagram]

The benefit of having the system breakdown structure is that it manages the relations between all systems and subsystems. When a subsystem is to be delivered by a supplier, this environment also supports the relationship to the supplier and the tracking of that delivery in relation to the full system/project.

Note: the system breakdown structure typically uses a hierarchical tag numbering system as the primary id for system elements.  In a PLM environment, the system breakdown elements should be data objects, providing the metadata describing the performance of the element, including the mandatory attributes that are required for exchange with MRO (Maintenance Repair Overhaul) systems.

Working with a system breakdown structure is common in plant design or asset maintenance projects, and this approach will be very beneficial for companies delivering process lines, infrastructure projects and other solutions that need to be delivered as a collection of systems and equipment.

The delivery phase

During the delivery phase, the system breakdown structure supports the delivery of each component in detail. In the example below you can see the relation between the tag number, the generic part number and the serial number of a component.

The example demonstrates the situation where two motors (same item, same datasheet) are implemented at two positions in a subsystem, each with a different tag number, a unique serial number and unique test certificates.

The benefit of a system breakdown structure here is that it supports the delivery of unique information per component that needs to be delivered and verified on-site. Each system element becomes traceable.

[Delivery phase diagram]
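A minimal sketch of this example (hypothetical Python classes and part numbers, not any vendor's data model): the same generic item is used at two tag positions, each with its own serial number and test certificates:

```python
from dataclasses import dataclass

@dataclass
class Item:
    """Generic part definition: one datasheet, potentially many physical instances."""
    part_number: str
    description: str

@dataclass
class BreakdownElement:
    """A position in the system breakdown structure, identified by its tag number."""
    tag: str
    item: Item              # which generic part fulfils this position
    serial_number: str      # the physical unit installed at this position
    certificates: list

motor = Item("MTR-7503", "Electric motor 4 kW")

positions = [
    BreakdownElement("S1.2-M1", motor, "SN-001877", ["test-cert-001877.pdf"]),
    BreakdownElement("S1.2-M2", motor, "SN-001912", ["test-cert-001912.pdf"]),
]

# Same item (same datasheet), yet each delivered position is individually traceable
for p in positions:
    print(p.tag, p.item.part_number, p.serial_number)
```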

The maintenance phase

For the maintenance phase, the system breakdown structure (or a geographical breakdown structure) can be the placeholder to follow the evolution of an installation at a customer site.

Imagine that, in the previous example, the motor with tag number S1.2-M2 turns out to be under-dimensioned and needs to be replaced by a more powerful one. The situation after implementing this change would look like the following picture:

[Maintenance phase diagram]

Through the relationships with the BOM items (not all are shown in the diagram), there is the possibility to perform a where-used query and identify other customers with a similar motor at that system position. Perhaps a case for preventive maintenance?

Note: the diagram also demonstrates that the system breakdown structure elements should have their own lifecycle in order to support changes through time (and provide traceability).
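Continuing the hypothetical sketch above, a where-used query across installations, combined with a simple revision on the breakdown element, could look like this (again just an illustration of the concept, not a vendor API):

```python
from dataclasses import dataclass

@dataclass
class InstalledPosition:
    """One tag position at one customer installation, with a revision counter."""
    installation: str
    tag: str
    part_number: str        # the generic item currently installed here
    revision: int = 1

fleet = [
    InstalledPosition("Customer A - Line 1", "S1.2-M2", "MTR-7503"),
    InstalledPosition("Customer B - Line 3", "S1.2-M2", "MTR-7503"),
    InstalledPosition("Customer C - Line 2", "S1.2-M2", "MTR-9001"),
]

def where_used(part_number, fleet):
    """Find every installation running a given generic item at any tag position."""
    return [p.installation for p in fleet if p.part_number == part_number]

# The under-dimensioned motor at Customer A is replaced: new item, new revision,
# while the history of the position stays traceable through the revision counter.
fleet[0].part_number = "MTR-9001"
fleet[0].revision += 1

# Candidates for preventive maintenance: who is still running the old motor?
print(where_used("MTR-7503", fleet))   # ['Customer B - Line 3']
```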

From my experience, this is a significant differentiator that PLM systems can bring compared to an MRO system. MRO and ERP (Enterprise Resource Planning) systems are designed to work with the latest, actual data only. Bringing in versioning of assets and traceability back to the initial design intent is almost impossible to achieve with these systems (unless you invest in heavy customization).

Conclusion

In this post and my previous post, I tried to explain the value of having at least a system breakdown structure as part of the overall PLM data model. This structure supports the early concept phase and connects data from the delivery phase to the maintenance phase.

Where my mission in the past eight years was teaching non-classical PLM industries the benefits of PLM technology and best practices, in this case you might say classical BTO companies can learn from best practices in the process and oil & gas industries.

Note: Oleg just published a new blog post, PLM Best Practices and Henry Ford Mass Production System, where he claims that PLM vendors, service partners and consultants like to sell best practices, yet during implementation discover that mass customization is needed to become customer specific; therefore, the age of best practices is over.

I agree with that conclusion, as I do not believe in an out-of-the-box approach to lead a business change.

Still, best practices are needed to explain to a company what could be done in that context, without starting from a blank sheet.

Therefore I have been sharing this Best Practice (for free)


Some weeks ago, a vivid discussion around the need for SLM (Service Lifecycle Management) besides PLM started in a PLM LinkedIn group. Of course, the discussion was already simmering in the background in other LinkedIn groups and forums, triggered by PTC's announcement to focus on SLM and their “observation” that they were probably the only PLM vendor to observe that need. The Internet of Things is, in one pen stroke, connected with SLM. (Anyone still using a pen?)

Of course it is not that simple, and I will try to bring some logic to the thought process, the potential hype and the various approaches you could take related to SLM.

SLM

First, SLM as a TLA (Three-Letter Acronym). If you Google the meaning of SLM, the most common meaning is a greeting often used on IRC, short for “salaam”, or hello.

In the context of PLM, it is a relatively new acronym, and the discussion on LinkedIn was also about whether we needed a new TLA at all. In general, what we try to achieve with SLM is the ability to trace and follow existing products at customers and to provide advanced or integrated services for them. In a basic manner, this could be providing documentation and service information (spare parts information). In an advanced manner, this could be thinking about the Internet of Things: products that connect to the home base and provide information for preventive maintenance, performance monitoring, enhancements, etc.

The topic is not new for companies around the world that have a “what can we do beyond PDM” vision; already in 2001 I was involved in discussions with a large Swiss company providing solutions for the food processing industry. They wanted to leverage their internal customer-centric delivery process and extend it to their customer support, using a web interface for relevant content: spare parts lists and documentation.

I am sure one or two readers of this blog post will remember “the spindle case” (the only part in the demo concept that had real data behind it at that time)

For many industries and businesses, customer services (and the margin on spare parts) are the main areas where they make a sustainable profit to secure the company's future. Most of the time, the initial sale and/or delivery of their products is done with a relatively low margin, due to the competitive sales situation they are in while selling. And of course, the sale itself is surrounded by uncertainty, which vendors have to accept.

If they were to ask for more certainty, it would require more detailed research, which is costly for them or considered a disadvantage by their potential customer. As other competing vendors do not insist on further research, your company might be seen as not “skilled” enough to estimate a product properly.

The above paragraph implicitly clarifies that we are mainly talking about companies whose primary process is Engineering to Order or Build to Order. For companies where the product is delivered through a Configure to Order or an off-the-shelf approach, there is no need to work in a similar manner. Buying a computer or a car involves no sales engineering anymore. There is a clear understanding of the target price, and of course resellers will still focus on differentiating themselves by providing adjacent services.

So for simplicity I will focus on companies with a BTO or ETO primary business process

SLM and ETO

In a real Engineering to Order process, traditionally the company that delivers the solution to the client is not really involved in the follow-up of the lifecycle of the delivered products. The product (small machinery, large machinery or even an installation or plant) is delivered to the customer, and with the commissioning and handover a lot of information is transferred to the customer, based on the customer's requirements.

Usually during this handover, a lot of the intelligence of the information is lost, as the customer does not have the same engineering environment and therefore requires information in “neutral” formats: paper (less and less), PDFs (the majority) and (stripped) CAD data combined with Excel files.

The information battle here between the ETO delivery company and the customer is that the ETO delivery company does not want to provide so much information that the customer becomes fully independent, as the service and spare parts business is the area where it can make its margin. The customer, however, often wants ownership of the majority of the data, but is also aware that if they ask for too much, they will pay for it (as the engineering company will consider this extra work). So finding the right balance is the point.

However, the balance is changing, and this is where SLM comes in.

More and more, we see that companies which in the past purchased an Engineering to Order product (or even a plant) are changing their business model towards using the product or running the plant, and ask the Engineering to Order company to provide the solution as a service. A kind of operational lease, including resources. This means solutions are no longer sold as a collection of products, but as an operational model (40,000 chickens/day, 1 million liters/day, 100,000 tons/year, etc.).

The owner of the equipment is no longer the owner, but pays for the service to run the business. Very similar to SaaS (Software as a Service) solutions: you do not own the software anymore; you pay for using it, no matter what kind of hardware/software architecture is behind the offering.

In that case, the Engineering to Order company can provide much more advanced services when it extends its delivery process with capabilities for the operational phase of the product, as a more integrated approach eliminates the need for the disruptive handover process. Data does not need to be made “stupid” again; it becomes a continuous flow of information.

How this can be done, I will describe in an upcoming, more technical, blog post. This approach brings value to both the Engineering to Order company and the owner/operator of the product / plant.
As it is a continuous flow of information, I would like to conclude this topic by stating that, for Engineering to Order companies, there is no need to think about an extra SLM solution. You could label the last part of the PLM process the SLM domain.

As the customer data is already unique, it is just a normal continuation of the PLM process.

Two closing notes here:

  • I have already seen Engineering to Order companies that provide the whole maintenance and service of the delivered product/plant to their customer, integrated in their data environment (so it is happening!).
  • Engineering to Order companies are still discovering the advantages of PLM to get a cross-project, cross-discipline understanding and working methodology for their delivery process. Historically, they thought in isolated projects, where the brains of experienced engineers were the connection between different projects. Now PLM practices are becoming the foundation for sharing and capitalizing on knowledge.

And with the last remark on capitalizing on knowledge, we move from the Engineering to Order industry to Build to Order.

SLM and BTO

In the Build to Order industry, the company that delivers a solution to its customer has tried, in a way, to standardize certain parts of its total solution. These parts can be standardized/configurable machinery or equipment, or, one level higher, standardized systems and subsystems.

More configurable/modular standardization is what most companies are aiming for, as the more you modularize your solution components, the clearer it becomes that there are two different main processes inside the same organization:

  • One process, the main process for the company, fulfilling the customer need. In this process it is about combining existing solution components and engineering them together in a customer specific solution. This could be a PLM delivery model like ETO.
  • One process to enhance, maintain and develop new solution components, which is a typical R&D process. Here I would state that PLM is indisputably needed, to bring new technology and solutions to the main business process.

So within a company, there might be a need for two different PLM solution processes. From my observations over the past 10 years, companies invest in PDM for their R&D process and try to do a little PLM on top of this PDM implementation for their delivery process. This basic PLM process usually focuses again on the core of the engineering part of the delivery, starting somewhere from the specifications and running until the delivery of the solution.


So “full” PLM is very rare to find. The front end of the delivery process, systems engineering, is often considered complex, and often the customer does not want to engage fully in the front-end definition of the solution.

“You are the experts, you know best what we want” is often heard.

Ironically, in an analogous situation, this is often the case for PLM implementations at risk. Here the company expects the PLM implementer to know what they want, without being explicit about, or understanding, what is needed.

To extend the discussion on PLM and SLM, I would like to take the question to a different dimension first:

Do we need two PLM implementations within one company ?

One for R&D and one for the delivery process ?

Reasons to say No are:

  • Simplicity – it is easier to have one system instead of two systems
  • The amount of R&D activity is so low compared to the delivery process that the main PLM system can support it.

Reasons to say Yes are:

  • The R&D process is extremely important as is the delivery process
  • The R&D process is extremely important and we have a large customer base to serve

Reading these two sets of reasons brings some clarity.

If the R&D process is a significant differentiator and you are aiming to serve many customers, it makes sense to have two PLM implementations.

Still, two PLM implementations could be based on the same PLM infrastructure, and I would challenge readers of this post to explain why it should be a single instance of a PLM infrastructure.

Why two PLM systems

  • I believe that, based on the potentially huge amount of data, a single instance would create a data monster, whereas we can see that connected systems (using big data) are the future.
  • In other concepts, there is an enterprise PLM and local PDMs, exactly because there is no single system that can do it all in an efficient manner.

Still, I haven't talked about SLM, which could be part of the delivery process, where you manage customer-specific data. For that – more detail in my next blog post – there are some data model constraints for the PLM system.

clip_image015I would state you can only use a separate SLM system if you are not interested in data from the early phases of the delivery process. In the early phase, you use conceptual structures to define the product/installation/plant. These conceptual structures are, in my opinion, the connection between the concept phase and the service phase. Usually tag numbers are used to describe the functional usage of a product or system, and they are the ones referenced by service engineers to start a service operation.

clip_image017Only when this view or need does not exist can I imagine a separate SLM system, where, potentially based on serial numbers, services are tracked and monitored and the results are fed back to the R&D environment. The R&D environment would then publish product data into the SLM system.
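To make the difference between functional tags and serialized items a bit more tangible, here is a minimal, hypothetical sketch of such a data model. The class and field names are my own invention, purely for illustration; a real installation model is of course far richer.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model: a functional tag describes a position/role in the
# plant or installation, while a serialized physical item currently fulfills that role.

@dataclass
class PhysicalItem:
    serial_number: str   # the individual delivered unit
    part_number: str     # link back to the product definition in PDM/PLM
    revision: str

@dataclass
class FunctionalTag:
    tag_number: str      # e.g. "P-101-A", the functional position
    description: str
    installed_item: Optional[PhysicalItem] = None  # unit currently installed, if any

# A service engineer typically starts from the tag (the function that needs service) ...
pump_a = FunctionalTag("P-101-A", "Cooling water pump A",
                       PhysicalItem("SN-998877", "PMP-4711", "C"))

# ... and from there reaches the serialized item and its product definition.
if pump_a.installed_item:
    item = pump_a.installed_item
    print(f"{pump_a.tag_number}: service {item.serial_number} "
          f"(part {item.part_number} rev {item.revision})")
```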

You might be confused at this point, as I did not bring the various information structures into this post to clarify the data flow for the delivery process. This I will do in my upcoming post.

Why not CTO and SLM?

I haven't discussed Configure to Order (CTO) here, as I consider CTO a logistical process, which logically is addressed by the ERP system. The definitions of the configurations and their related content will probably be delivered through a PDM/PLM system, so the R&D type of PLM system will exist in the company.

imageSLM in this situation would most logically be performed by the ERP system, as there is no PLM delivery layer. Having said this, a new religious discussion might come up: is SLM a separate discipline or is it part of the ERP system?

For the big ERP vendors this topic is not even a discussion – they do it all :-) – but it is up to your company to decide whether a Swiss Army knife is the right tool to work with in your organization.

Conclusion

For the moment I would like to conclude:

  • PLM and SLM –> No (only Yes in isolated cases)
  • PLM and PLM –> Yes (as SLM requires the front end of PLM too)

Do we need SLM? Perhaps yes, as a way to describe a functional domain; no, when we are talking about yet another silo system. I believe the future is in connectivity of data, and in the long term PLM, ERP and SLM will be functional domains describing how connected data serves particular needs.

Looking forward to your thoughts

picongressThe Product Innovation conference in February has become one of my favorite events, mainly for the networking. PLM vendors may try to give you the impression that we are in a fast-moving world; in reality, most companies are moving at a much slower pace than these vendors dream of. To an outsider, last year might have looked similar to this year. In this post, I will describe the subtle differences that I noticed.

The event

The event was in the same location as last year, with approximately 400 participants including 60 speakers. The conference had three main streams: keynotes, PLM and design. The PLM and design sessions ran mostly in parallel – great if you are interested in one domain only, a little more challenging for people who enjoy both domains. The good news is that all participants will have access to the recorded sessions in a week or two. From last year's experience I can say the recordings are good, so I am looking forward to an additional, virtual conference two weeks from now.

The sessions

Some remarks about the sessions that I was able to attend:

Going to Mars?

SNAGHTML10b65892Bas Lansdorp told us about the Mars One mission and the drive and challenges behind establishing a permanent human settlement on Mars. It was an inspiring opening session that makes you think out of the box. Several interesting topics came up.

1. First of all, most of the mission's materials need to be basic, proven technology instead of modern, innovative concepts. As maintenance and the risk of issues need to be minimized, it is better to stick with proven technology.

2. The crew selection is a long process – the first crew will fly 10 years from now. Who are the individuals willing to take up the challenge of staying forever with 3 others, with a few more people arriving every few years? There is no escape and no way back. Amazing!

3. Part of the funding can come from media rights. Bas explained that the revenues related to, for example, the Olympic Games are already stunning. Imagine having “Live on Mars” as a reality soap available all around the world. Programs like Big Brother demonstrate that it is in our nature to watch ordinary people and see how they behave. Will they fight? Will they have sex? Public voyeurism and eternal fame.

Although the keynote had no relation to PLM, I felt energized by Bas's entrepreneurial thinking, following his passion and wanting to realize it. As Mars will not need entrepreneurs in its first centuries, it was clear Bas is not part of the first crew.

Managing complexity and volume

imageNext Peter Smith from VF International presented the huge challenge his group of companies has in managing the complexity of its various products and their seasonal deliveries, up to 12 collection models per year. The group, with famous brands like The North Face, Lee, Wrangler, JanSport, Kipling and Timberland, has to deliver 500 million units per year, which means roughly 16 units per second! For sure an execution engine. So where does PLM fit?

For Peter, PLM is part of the infrastructure, a glue for the innovation process, but not the driver of the innovation process. They try to standardize on a single PLM system, but some of the brands have such specific characteristics and history that this was not possible to realize. As the business must go on, a new PLM should not disrupt the business.

The two main challenges Peter sees for current PLM are:

  • The software models available to them as consumers; changes here go too slowly
  • Organizational change implications – how to change when change is hard?

It was clear from Peter’s experience that many of his points were from the IT-perspective. During the networking break when I spoke with others, some of them mentioned that the business value for PLM was missing in Peter’s analysis – too much tool/infrastructure.

The digital value chain

imageAn interesting session from Michael Bitzer (Accenture) and Sebastien Handschuh (Daimler). After an introduction to the German initiative Industry 4.0, the remaining part of the session was about Daimler's approach of using JT as a neutral, application-independent format for their 3D data. At this time, Daimler already has over 6 million JT files, and the format has proven to fulfill their process needs.

Where possible, Daimler aims to collaborate with suppliers in JT format for 3D. In this manner, their suppliers are not forced to use exclusively CATIA or NX. In answering a question from the audience about whether Daimler was supporting the Siemens-flavored JT or the truly neutral JT format, it was clear that Daimler is aiming for the neutral format. I believe this is an interesting move towards a more generic data approach, in this case for 3D CAD data instead of original file formats. Hopefully more standardization will follow.

PLM selection: Do's and Don'ts

questionaireI was moderating a discussion session for companies that were in the process of selecting a PLM system or that wanted to share their experience. Unfortunately the session was overpopulated, with a lot of people not all necessarily involved in a selection process. Due to the large audience there was not really an opportunity to have an in-depth discussion. Still, it was amazing to see that there are still companies where the value of PLM is not clear at the management level and therefore the focus is on quick ROI.

In a one-to-one discussion afterwards I learned about a company where the shareholders/investors forced the PLM project to fail by pushing unrealistic deadlines and not understanding the human and business change required. Unrealistic ROI expectations were combined with a lack of understanding of where PLM really brings a competitive advantage. In the worst case, due to this short-term focus, the company will slowly go out of business as competitiveness and margins decline. For this type of situation, there is the excellent Dilbert cartoon below.

image

Source: Dilbert.com

Secure data sharing in the extended enterprise

SNAGHTML10cde882An interesting session was organized by Häkan Kårdén (Eurostep) and Kristofer Thoresson (Siemens Industrial Turbomachinery). Siemens had chosen to use the Eurostep Share-A-space environment as a layer between their internal data (their PDM system and other data sources) and the external data from suppliers, customers and field services. A pragmatic concept, and interesting to see Share-A-space find its place. PLM vendors would probably claim that their system could organize this secure and remote access without the need for a system in between. But the fact that a Siemens company decided to use Share-A-space demonstrates there is still a gap between a potentially safe, single PLM-based implementation and a pragmatic separation approach.

PLM is changing

imageIn my session that afternoon I focused on the visible change in PLM: from an IT infrastructure for file collaboration towards a more data-centric, business-driven approach. From there, looking into the future, I anticipate that moving towards a data-centric approach is crucial to be ready for advanced computing power and brain-matching algorithms. These, I believe, will be the game changers of the upcoming decade, in line with the Industry 4.0 ideas. My past two posts have been pointing in this direction.

A Circular economy

imagePeter Bilello from CIMdata gave a good presentation on the change in business we see and must make. No longer can we afford an economy in which we waste raw materials. The circular economy is about supporting the product lifecycle from cradle-to-cradle instead of the classical cradle-to-grave. This matches the trend that companies will more and more deliver services to their customers instead of selling them products. Instead of buying a fridge, you pay for cooling capacity, and your supplier replaces the current model with a new one after three years. The service or experience economy fits very nicely with the new generations, who seem to prefer living and sharing in the moment over owning property.

Your digital shadow

imageThe closing keynote from Stephanie Hankey was like the opening keynote: no relation to PLM, but interesting in the context of the effects of digitalization and mobility. She provided some insights into the data that is already collected about each individual (or device) and how it all can be combined into profiles – your digital shadow. And of course your shadow might give the wrong impression. You can imagine that with the growing trend of smart devices and the Internet of Things it will be hard to stay out of it. Companies will sell and buy data sets about their potential customers (victims). Scary, as it all happens in the background and you are not fully aware of it.

(At the point I was writing this paragraph, my computer crashed with a blue screen – coincidence?)

Cultured beef?

imageAfter a good burger and discussion in the evening, the opening keynote on day two came from Mark Post with the title Cultured Beef – changing the way we eat and think about food forever. Another interesting keynote, in which Mark explained how we can feed the growing world population in a more sustainable way by creating animal products through cell culture and bio-fabrication instead of farming. The process is still in the early days of discovery, but by using cell culture you can ensure you get the right meat, even without fat, and it is real meat. It is currently still expensive: Mark estimates that with current technology and upscaling of the process a price of $65 per kilo can be reached. Too expensive for consumers at this time, but a promising number for the future. Another (Dutch) keynote speaker who made us think differently for the rest of the day.

The engine

SNAGHTML10e5e94dNext, Bjarne Nørgaard from MAN Diesel & Turbo gave the audience a good lecture on what it takes to design and build a ship: you build the engine and wrap the ship around it. The challenge for MAN is to follow, service and maintain the engine through its 30-year lifecycle, and possibly longer. Bjarne then went into the details of their information architecture, and it was surprising to learn that their PDM system comes from Siemens and that they use Aras on top of that for connecting data to the rest of the enterprise and the lifecycle of the engine. You would assume two PLM systems in-house for one company is overkill. Bjarne explained that they initially tried to achieve these goals with Teamcenter but failed due to lack of flexibility. Great marketing for Aras, bad for Siemens – although I am sure the cultural aspect played a role. No one likes their first PLM or ERP system, as the first implementation in this domain is the moment you have the biggest internal culture shock.

Using search and semantic technology

imageThe presentation from Moises Martines-Ablanado (Configuration Management, Airbus Group) and Thomas Kamps (Conweaver) was interesting, as they demonstrated one of the upcoming concepts that I foresee will have a great future. Conweaver connects to existing enterprise systems (PLM, ERP, CRM and legacy) and creates a semantic mapping and linking of the data indexed from these systems. Through this network of data it provides apps with a particular purpose, for example directly identifying changes between the current EBOM and MBOM and potentially updating the MBOM based on EBOM changes. It is a concept I have seen with Exalead too, illustrating that once you are in a data-centric environment, combining data sources for particular purposes can be achieved fast. There is no need for the classical approach of a single database that stores it all.
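To make the EBOM/MBOM example a bit more tangible, here is a minimal sketch of the underlying idea in Python. It is purely my illustration – the data, keys and revision scheme are invented, and a real semantic linking layer would of course work on data indexed from the actual systems rather than on hard-coded lists.

```python
# Sketch: index part records from two source systems by a shared key and flag
# parts whose EBOM revision is newer than the revision referenced in the MBOM.

ebom_records = [  # e.g. indexed from the PLM system
    {"part": "P-1001", "revision": "C"},
    {"part": "P-1002", "revision": "A"},
]

mbom_records = [  # e.g. indexed from the ERP system
    {"part": "P-1001", "revision": "B"},
    {"part": "P-1002", "revision": "A"},
]

def out_of_sync(ebom, mbom):
    """Return (part, mbom_rev, ebom_rev) where the MBOM lags behind the EBOM."""
    mbom_by_part = {r["part"]: r["revision"] for r in mbom}
    return [
        (r["part"], mbom_by_part[r["part"]], r["revision"])
        for r in ebom
        if r["part"] in mbom_by_part and mbom_by_part[r["part"]] < r["revision"]
    ]

for part, mbom_rev, ebom_rev in out_of_sync(ebom_records, mbom_records):
    print(f"{part}: MBOM still at revision {mbom_rev}, EBOM already at {ebom_rev}")
```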

A new TLA? CLM

imageJoy Batchelor gave a clear presentation on why, besides PLM and ERP, Jaguar Land Rover (JLR) needs a third system supporting the connection between product configurations and sales configurations. They are able to manage 58,000,000,000 combinations for 170 different markets, which means every person on this planet could have his or her own unique Jaguar Land Rover. Joy introduced CLM (Configuration Lifecycle Management) as the third domain needed to support these configurations. The system they are using is ConfigIT, and I assume all automotive vendors have their own toolsets to manage product and marketing configurations. I hope to learn more in that area. Will CLM be a separate domain, or will it be absorbed by PLM or ERP vendors in the future? Time will tell.
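To give a feeling for how such numbers arise, here is a purely illustrative back-of-the-envelope calculation in Python – the option count is invented and has nothing to do with JLR's actual configuration model.

```python
# Illustration only: it does not take many independent options before the theoretical
# number of combinations reaches tens of billions.
binary_options = 36                # hypothetical yes/no features (engine, trim, packs, ...)
print(f"{2 ** binary_options:,}")  # 68,719,476,736 – already more than 58 billion
# In practice, constraint rules (market, legal, technical) reduce this to the valid
# subset, which is exactly what a CLM system has to define and keep consistent over time.
```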

A game changer?

SNAGHTML10f16946Henk Jan Pels from the Eindhoven University of Technology took us back in time and explained how ERP became visible on the CFO's agenda, eliminating the discussion on ROI. Where ERP handles material flows, developing and delivering products also requires knowledge flows between the requirements, the functional definition and the physical definition of a product. Expanding these flows into a framework that covers the technology, the building blocks, the product families and the individual products would be the ideal interaction Henk Jan is proposing, and a PLM system would be the environment in which to implement this concept. Henk Jan announced this as a game changer, and I agree: if company management spends time understanding the benefits, it will be one. Somehow it remained an academic concept, and I believe we are all eager to learn whether companies will adopt this idea, knowing that changing to something that is not common or traditional is a cultural risk.

The German future?

imageThe final presentation I could attend was from Martin Eigner, who first explained in some detail what the Industry 4.0 approach is about. From there he took us into the world of model-based systems engineering – you could say an integration of PLM with more virtual system modeling and analysis as the front end of the development process. Somewhat similar to last year's presentation, but understandable, as the world of PLM does not evolve that fast.

Conclusion

This is somehow also my conclusion from this year's event. I was hoping to see some new sparks. For sure the keynotes were inspiring, although less related to PLM. The case from Airbus and Conweaver was inspiring, as I believe search- and semantics-based applications are a logical extension for the challenges companies want to address with PLM. JLR's presentation explaining the need for Configuration Lifecycle Management strengthened my thought that in the future PLM and ERP, as we know them, will disappear. It is about a business platform with combined services, which might fall into one of the classical categories. I believe the German Industry 4.0 initiative should be studied and replicated by many, as it acknowledges exactly the future trend needed to remain competitive.

It was a pity for the public that Siemens PLM, Dassault Systèmes and Autodesk were not there. As the two largest PLM vendors and one of the largest PLM challengers, you would expect them to be there and allow prospects and PLM consultants to compare where each of the PLM companies differs. Still, it was a good conference: well organized and, as mentioned in the introduction, all presentations were recorded, giving everyone the opportunity to digest and review the content again.

I am looking forward to the next Product Innovation conference, with perhaps some more PLM-related keynotes and big data practices.

imageI will be attending the annual Product Innovation conference again in Berlin next week. I am looking forward to this event, as it is one of the places where you have the chance to network and listen to presentations from people who are PLM-minded. A kind of relaxation because, strangely enough, most of the companies I am visiting still consider PLM as something difficult, something related to engineering, not so much connected to the future of their business.

I believe one of the reasons is that people have based their opinion on the past: an expensive implementation horror story, an engineering-focused implementation, or other stories that have framed PLM in a certain manner.

However, PLM has changed and its significance has grown!

During the Product Innovation conference, I will present this topic of the change of PLM in more depth, with more examples and a surprising projection into the future. Later, when time permits, I will share the more in-depth observations in my blog, hopefully extended based on discussions during the conference. And if you attend the conference, don't miss my session.

 

clip_image001Fifteen years ago,

the term PLM (Product Lifecycle Management) was introduced as a logical extension of cPDM (collaborative Product Data Management). Where the initial focus was on global file sharing of mechanical CAD data, PLM extended the scope with multidisciplinary support, connecting manufacturing preparation and providing an infrastructure for change management.

In the nineties product data management was in transition.

In the early 90s, UNIX dominated, and installing a PDM system was the work of IT experts. Large enterprises, already operating globally, were pushing for standardization and control of data to connect their engineers in a more efficient manner. Connectivity was achieved through expensive leased lines; people like me had to connect to the internet through dial-up modems, and its usage was limited, providing static web pages with minimal graphics.

It was obvious that cPDM and the first PLM projects were extremely expensive. There was no experience; it was learning on the job. The costs were high and visible at the management level, giving management the impression that PLM was potentially the same challenge as ERP, but with a less clear scope. And the projects were executed by IT experts; end users were not really in the game.

At the end of the 90s, a small revolution started to take place. The power of the PC combined with Microsoft technology provided a much cheaper and more flexible alternative to a complex UNIX-based implementation. SNAGHTMLc988d04

Affordable 3D CAD emerged in the mid-market, leading to the need for Windows-based PDM systems, and with Windows came Excel, the PDM/PLM killer application.

A person with some rudimentary Visual Basic skills could do magic with Excel and, although not an IT expert, would become the champion of the engineering department.

At that time, PLM conferences provided a platform on which industry could discuss and share tips and tricks on how best to implement a system. The focus was mainly on the IT side and on large enterprises. The scope was engineering-centric: connecting the various disciplines, including mechanical, electrical and simulation, in a database and connecting files and versions.

 

clip_image002Ten years ago,

most large enterprises had already started to implement a PLM system. The term PLM became an accepted acronym associated with something that is needed for big companies and is complex and expensive, a logical statement based on the experiences of early adopters.

PLM was the infrastructure that could connect product information between disciplines and departments working from different locations. The NPI (New Product Introduction) process became a topic pushed by all enterprise PLM vendors and was a practice that demonstrated the value of providing visibility of information across a large, dispersed company, leading to better decision-making.

As this process was data-centric rather than CAD-centric, these capabilities promoted the recognition and introduction of PLM in non-traditional manufacturing industries like Consumer Packaged Goods, Pharmaceuticals and Apparel, where the planning and coordination of information lead the process, rather than a Bill of Material.

In large enterprises, PLM still lay with the IT-architects as they were the ones deciding the standards and software to be used. PLM and ERP connectivity was an expensive topic.

PLM_profFor the mid-market, many PLM vendors were working on offers to standardize a PLM implementation; this usually involved a stripped-down or limited version of the full PLM system, a preconfigured system with templates, or something connected to SharePoint. Connectivity was much easier than 15 years earlier, thanks to a better internet infrastructure and the deployment of VPNs.

For me, at that time, selling PLM to the mid-market was challenging: how do you explain the value and minimize the risk while the current business is still running well? What was so wrong with the existing practices based on Excel? In summary, with good margins and a growing business, wasn't everything under control without the need for PLM? This was the time I started to share my experiences in my blog: A Virtual Dutchman's introduction

Mid-market PLM projects focused on departmental needs, with IT providing implementation support and guidance. As IT staff in these companies is usually limited, and often organized around ERP and what was learned from its implementation, it was hard to find business experts for PLM in the implementation teams.

 

clip_image003Five years ago,

the financial crisis had started, and globalization had become real through worldwide connectivity – better infrastructure and Web 2.0. The world became an open space for consumers and competitors; the traditional offshore countries became consumers themselves and began to invest in developing products and services for their domestic markets, but also targeted the rest of the world. Large enterprises were still expanding their huge PLM implementations, though some were challenged because of a change of ownership. Capital investors no longer came only from the US or Europe but also from the BRIC (Brazil, Russia, India, China) countries, forcing some established companies to restructure and refocus.

jugleIn response to the crisis, mid-market companies started to reduce costs and focus on efficiency. Lots of discussions related to PLM began as it appeared to be THE strategy needed to survive, though a significant proportion of the investment in PLM was cancelled or postponed by management due to uncertainty and impact on the organization.

PLM conferences showed that almost all of the big enterprises and mid-market companies were still using PLM for connecting departments without fundamentally integrating them in one complete PLM concept. It is easier to streamline the sequential process (lean thinking) than to make it a concurrent process focused on market needs. PLM conferences were being attended by a greater mix of IT and business representatives from different businesses, learning from each other.

 

clip_image004Today,

everyone in the world is connected and, consequently, the amount of data is piling up. It is now more about data than about managing documents. The introduction of smart devices has had an impact on how people want to work; instead of sharing files and documents, we have started sharing and producing huge amounts of data. In addition, the upcoming “Internet of Things” demonstrates that we are moving to a world where connectivity through data becomes crucial.

Sharing data is the ideal strategy for modern PLM. PLM vendors and other leading companies in enterprise software are discovering that the classical method of storing all information into one database does not work anymore and will not work in the future.

SNAGHTMLca3d692In the future, a new generation of PLM systems will come, either as an evolution of existing systems or as a disruption of the current market. No longer will the target be to store all information in one system; the goal will be to connect and interpret data and make the right decisions based on that. This is similar to what the new generation of workers is used to, and they will replace the (my) older generation in the upcoming decade.

Combined with more and more cloud-based solutions and platforms, the role of IT will diminish, and the importance of business people driving PLM will become ever more crucial.

PLM has become a business-driven strategy and requires people who are strong enough to develop, justify and implement this approach in their companies. New champions are needed!

The value of communities, blogs and conferences

lies in bringing together the global brainpower in social environments. Complemented with presentations, opinions and discussions from all the different industries and domains, it is the ideal environment to grow new ideas. Here you can associate the information, question its relevancy for your business and network with others – the perfect base for innovating and securing your future business.

Therefore, do not use communities or conferences to stick to your opinion but be open and learn.

One of my favorite quotes

Everyone wants to be a game changer, yet in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.

Why? Read the following analogy.

1974

With my Dutch roots and passion for soccer, I saw the first example of game changing happening in 1974 with soccer. The game where 22 players kick a ball from side to side, and the Germans win in the last minute.

clip_image002My passion and trauma started that year, when the Dutch national team changed soccer tactics by introducing totaalvoetbal.

The Dutch team, at that time coached by Rinus Michels and with star player Johan Cruyff, played this to perfection.

Defenders could play as forwards and the other way around. Combined with the offside trap, this brought the Dutch team to the finals of the soccer world championship in both 1974 and 1978 – of course losing the final in both cases to the home team (Germany in '74, Argentina in '78, with some help from the referee, we believe).

This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept that fast.

image

At the same time, a game changer for business was emerging in 1974: the PC.

In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not common yet, and the first mouse devices and the Intel 8008 processor were coming to the market.

This was disruptive innovation at that time, as we would realize 20 years later. The PC was a game changer for business.

2006

Johan Cruyff remained a game changer, and when he started to coach and influence the Barcelona team, it was his playing concept, tiki-taka, that brought the Spanish national team and the Barcelona team to the highest, almost unbeatable level in the world for the past 8 years.

clip_image002[6]Instead of having strong and tall players to force your way to the goal, it was all about possession and control of the ball. As long as you have the ball, the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to pass when trying to recapture the ball.

This was a game changer, hard to copy overnight – until the past two years. Now other national teams and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.

Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.

Also, PLM was supposed to be a game changer in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game?

imagePLM at that time was connecting departments and disciplines with each other in a digital manner, no matter where they were around the globe. And since the information was stored in centralized places, databases and file-sharing vaults, it created the illusion that everyone was working with the same sets of data.

The major successes of PLM in this approach come from efficiency: the digitization of data exchange between departments and the digitization of processes. Already a significant step forward, bringing enough benefits to justify a PLM implementation.

Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, the game change is all about different behavior to reach the goal; it is not about better tools (or shoes).

The PLM picture shows the ideal 2006 situation, where each department forwards information to the next department. But where was PLM supporting after sales/services in 2006? The connection between after sales/services and concept is, in most companies, not formalized or even existing. And exactly that connection should provide the feedback from the market and the field to deliver better products.

The real game changer starts when people learn and understand how to share data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so other people can find and use it.

imagePeople are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making it shareable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.

But our job is no longer safe as we see in the declining economies in Europe and the US. And the reason for that:

Data is changing the game

In recent years the discussion about BI (Business Intelligence) and Big Data emerged. There is more and more digital information available, and it has become impossible for companies to own all the data, or even to think about storing the data themselves and sharing it across their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can (theoretically) be shared no matter where you are and no matter which device you are using, there is a huge potential to change the game.

It is a game changer as it is not about just installing the new tools and new software. There are two major mind shifts to make.

  • It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100 % digital, your customer or supplier may still require a printed and wet-signed document or drawing as legal confirmation of the transaction. Documents are comfortable containers to share, but they are killing for fast and accurate processing of the data inside them.
  • It is about sharing and combining data. It does not make sense to dump data again into huge databases; the value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, others by purchasing and some directly by the supplier – see the sketch below. Do not fall into the ERP trap of believing that everything needs to be in one system and controlled by one organization.
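
As an illustration of that second point, here is a minimal sketch of how one part definition could be assembled from attribute sets owned by different parties instead of being mastered in a single system. The part number, attributes and values are all invented for illustration.

```python
# Hypothetical example: the same part number carries attributes owned by different
# disciplines, and the combined view is assembled from the separate sources.

engineering = {"P-1001": {"material": "AlMg3", "mass_kg": 1.2}}
purchasing  = {"P-1001": {"preferred_supplier": "ACME", "unit_cost_eur": 14.50}}
supplier    = {"P-1001": {"lead_time_days": 21, "rohs_compliant": True}}

def combined_part_view(part_number, *sources):
    """Merge attributes per part, remembering which source owns which attribute."""
    view = {}
    for owner, data in sources:
        for name, value in data.get(part_number, {}).items():
            view[name] = {"value": value, "owner": owner}
    return view

print(combined_part_view("P-1001",
                         ("engineering", engineering),
                         ("purchasing", purchasing),
                         ("supplier", supplier)))
```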

imageBecause of the availability of data, the world has become global and more transparent for companies. And what you see here is that the traditional companies in Europe and the US struggle with that. Their current practices are not tuned towards a digital world, more towards the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.

The upcoming economies have two major benefits:

  • Not so much legacy; therefore, building a digital enterprise is easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
  • The average cost of labor is lower than in Europe and the US; therefore, even if they do not get it right the first time, there is enough margin to spend more resources to meet the objectives.

imageThe diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it is no longer a 100 % classical PLM scope. It is also about social interaction, supplier execution and logistics. These areas are not classical PLM domains, and therefore I have mentioned in the past that the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different?

The big consultancy firms are all addressing this topic – not necessarily at the PLM level:

2012  Cap Gemini – The Digital advantage: …..

2013  Accenture – Dealing with digital technology’s disruptive impact on the workforce

2014  McKinsey – Why every leader should care about digitization and disruptive innovation

For CEOs it is important to understand that the new, upcoming generations (generation Y and beyond) are already thinking in data. By nature, they are used to sharing data instead of owning it in many respects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce

This cannot be learned from an ivory tower. The easiest way is to not worry about this trend and continue working as before, slowly losing business and margin year by year.

In many businesses people are fired for making big mistakes; doing nothing, unfortunately, is most of the time not considered a big mistake, although it is the biggest one.

picongressDuring the upcoming PI Conference in Berlin I will talk about this topic in more detail and look forward to meeting and discussing this trend with those of you who can participate.

The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the number of players and balls involved can be unlimited.

clip_image002[8]A final observation:
In my younger days, I celebrated many soccer championships, yet I am not famous as a soccer player.

Why?

Because the leagues I was playing in were always limited in scope: by age, locally, regionally, etc. Therefore it was easy to win within a certain scope, and there are millions of soccer champions besides me. In business, however, there are almost no borders.

Global competition will require real champions to make it work!!!
