
What I want to discuss this time is the challenging transformation related to product data that needs to take place.

The top image of this post illustrates the current PLM world on the left and, on the right, the potential future positioning of PLM in a digital enterprise. How the right side will behave is still vague – it can be a collection of platforms or a vast collection of small services, all contributing to the performance of the company. Some vendors might dream that all these capabilities are defined in one system of systems, like the human body, where all functions are available and connected.

Coordinated or connected?

This is THE big question for a future digital enterprise. In the current PLM approach, there are governance structures that allow people to share data along the product lifecycle in a structured way.

These governance structures can be project breakdown structures, where a phase-gate approach guides the full delivery. Deliverables related to tasks and gates make sure information is stored and available for every stakeholder. For example, a well-known process in the automotive industry, Advanced Product Quality Planning (the APQP process), is a standardized approach to make sure parts or products are introduced with the right quality for the customer.

Deliverables at any stage in the process can be reviewed or consumed by another stakeholder. The result is most of the time a collection of approved documents (Office-type, Design & Test files) stored centrally. This is what I would call a coordinated data approach.

In complex environments, besides the project governance, there will be product structures and Bills of Materials, where each object in such a structure is the placeholder for related information. In the case of a product structure, that can be the specifications per component; in the case of a Bill of Materials, it can be the design specification (usually CAD models) or, for an MBOM, the manufacturing specifications.
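To make the idea of a placeholder tangible, here is a minimal sketch in Python (the class and field names are purely illustrative, not any vendor's data model) of how a coordinated structure merely points to the documents that carry the actual content:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Document:
    """An approved deliverable (Office file, CAD model, test report) stored centrally."""
    filename: str
    status: str = "Draft"          # e.g. Draft / In Review / Approved

@dataclass
class StructureNode:
    """A placeholder in a product structure or BOM, coordinating related information."""
    item_id: str
    description: str
    documents: List[Document] = field(default_factory=list)
    children: List["StructureNode"] = field(default_factory=list)

# A component carries its design and manufacturing specifications as attached files;
# the structure coordinates WHERE the information lives, not WHAT is inside the files.
pump = StructureNode("P-100", "Cooling pump")
pump.documents.append(Document("P-100_design.CATPart", "Approved"))
pump.documents.append(Document("P-100_mfg_spec.docx", "Approved"))
```

The structure coordinates where the information lives; the content inside the attached files remains disconnected from the outside world.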

An example of structures used in Enovia

Although these structures themselves describe the product composition, it is the related information that makes the content understandable and realizable.

Again it is a coordinated approach, and most PLM systems and implementations are focusing on providing these structures.

Sometimes this is done with their own system only – you need to follow the vendor portfolio to get the full benefit – and sometimes the system is positioned as an overlay to the existing systems in the company, and therefore less invasive.

Presentation from Martin Eigner – explaining the overlay concept based on Aras

Providing the single version of the truth is often associated with this approach. The question is: Is the green bin on the left the single version of the truth?

The Coordinated – Single Version of the Truth – problem

The challenge of a coordinated approach is that there is no thorough consistency checking of whether the delivered data represents the real truth. Through serious review procedures, we do our best to make sure every deliverable has the required content and quality. As the information inside these deliverables is not connected to the outside world, there will be discrepancies between reality and what has been stored. Still, we feel comfortable enough as an organization to pretend we know where the risks are. Until the costly impossible happens!

The connected enterprise

The ultimate dream of a digital enterprise is that everything relevant is connected in context. This means no more documents or files but a very granular information model for linking data and keeping it in context. We can apply algorithms and automation to connected data and use Artificial Intelligence to make sense of massive amounts of data.

Connected data allows us to share combined sets of information that are relevant to a particular role. Real-time dashboarding is one of the benefits of such an infrastructure. There are still a lot of challenges with this approach. How do we know which information is valid in the context of other information? What are the rules that describe a valid product or project baseline at a particular time?

Although all data is stored as unique information objects in a network of information, we cannot apply the old mechanisms of a coordinated approach all the time. Generated reports from a connected environment can still serve as baselines or records related to a specific state – for example, when the design is approved for manufacturing, we can generate approved product baseline structures or Bill of Materials structures.
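As a simple illustration of how such a record can be derived from connected data instead of being authored as a document, consider this sketch (Python, with hypothetical element names and states; the rules defining a valid baseline are exactly the open question raised above):

```python
from datetime import date

# A toy "connected" dataset: granular elements linked in context, each with its own state.
elements = {
    "REQ-001":  {"type": "Requirement", "state": "Approved", "links": ["FUNC-010"]},
    "FUNC-010": {"type": "Function",    "state": "Approved", "links": ["PART-100"]},
    "PART-100": {"type": "Part",        "state": "Approved", "links": []},
    "PART-200": {"type": "Part",        "state": "In Work",  "links": []},
}

def baseline(data: dict, milestone: str) -> dict:
    """Generate a record of the approved elements at a given moment,
    for example the product baseline when the design is released for manufacturing."""
    return {
        "milestone": milestone,
        "date": date.today().isoformat(),
        "elements": sorted(k for k, v in data.items() if v["state"] == "Approved"),
    }

print(baseline(elements, "Released for manufacturing"))
# -> {'milestone': 'Released for manufacturing', 'date': '...', 'elements': ['FUNC-010', 'PART-100', 'REQ-001']}
```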

However, this linearity in the lifecycle for passing information through an enterprise will no longer exist. There might be various design alternatives, and the delivery process is already part of the design phase. Through integrated virtual simulation and testing, we reach a state where the product satisfies the market at that moment and the delivery process is known at the same time.

Almost immediately, and based on the first experiences from the field, new features can be added, virtually tested and validated for the next stage. We need to design new PLM infrastructures that can support this granularity and therefore this complexity.

The connected – Single Version of the Truth – problem

The concepts I described related to the connected enterprise made me realize that this is analogous to how the brain works. Our brain is a giant network of connected information, dynamically maintaining associations, having different abstraction levels and always pretending there is one truth.

If you want to understand a potential model of the brain, please read On Intelligence by Jeff Hawkins. With the possible arrival of the quantum computer, we might be able to create performant brain models.

In my earlier post, Are we blocking our future, I referred to the book The Idiot Brain: What Your Head Is Really Up To by Dean Burnett, in which Dean states that, due to the complexity of stored information, our brain continuously adapts “non-compliant” information to make sure the owner of the brain feels comfortable.

What we think is the truth might just be a creation of the brain, combining the positive parts into a compelling story and suppressing or deleting information that does not fit. Although it sounds absurd, I believe that if we are able to create a connected digital enterprise, we will face the same symptoms. Due to the complexity of connected information, we look for the most suitable version, and as everything has become so complex, ordinary human beings will no longer be able to distinguish this.

 

Conclusion:

As part of the preparation for the upcoming PDT Europe 2018, I was investigating the topics of the coordinated and the connected enterprise to discover potential transformation steps. We all need to explore the future with an open mind, and the challenge is: WHERE and HOW FAST can we transform from coordinated to connected? I am curious if you have experiences or thoughts on this topic.

 

 


During my holiday I have read some interesting books. Some for the beauty of imagination and some to enrich my understanding of the human brain.

Why the human brain? It is the foundation and motto of my company: The Know-How to Know Now.
In 2012 I wrote a post, Our brain blocks PLM acceptance, followed in 2014 by PLM is doomed, unless ……, both based on observations and inspired by the following books (must-reads if you are interested in more than just PLM practices and technology):

In 2014, Digital Transformation was not so clear. We talked about disruptors, but disruption happened outside our PLM comfort zone.

Now, six years later, disruption – or significant change in the way we develop and deliver solutions to the market – has become visible in the majority of companies. To stay competitive or meaningful in a global market with changing customer demands, old ways of working no longer bring enough revenue to sustain the business. The impact of software as part of the solution has significantly changed the complexity and lifecycle(s) of solutions on the market.

Most of my earlier posts in the past two years are related to these challenges.

What is blocking Model-Based Definition?

This week I had a meeting in the Netherlands with three Dutch peers, all interested and involved in Model-Based Definition – either from the coaching point of view or the “victim” point of view. We compared MBD challenges with Joe Brouwer’s AID (Associated Information Documents) approach and found a lot of commonalities.

No matter which method you use, it is about specifying unambiguously how a product should be manufactured – this is a skill and craftsmanship, not a technology. We agreed that a model-based approach, where information (PMI) is stored as intelligent data elements in a Technical Data Package (TDP), will be crucial for multidisciplinary usage of a 3D model and its associated information.

If we were to store the information again as dumb text in a view, it would need human rework, leading to potentially parallel information getting out of sync and therefore creating communication and quality issues. Unfortunately it was a short meeting; the intention is to follow up this discussion in the Netherlands with a broader audience. I believe this is what everyone interested in learning and understanding the needs and benefits of a (unavoidable) model-based approach should do: get connected around the table and share/discuss.
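The difference between dumb text in a view and an intelligent data element can be illustrated with a small sketch (Python, with purely illustrative field names; real MBD implementations rely on standards such as AP242 for semantic PMI):

```python
from dataclasses import dataclass

@dataclass
class PMIAnnotation:
    """A piece of Product Manufacturing Information attached to model geometry,
    stored as data instead of as text in a drawing view."""
    feature_ref: str       # reference to the 3D model face/feature it annotates
    characteristic: str    # e.g. "diameter", "flatness"
    nominal: float
    tol_lower: float
    tol_upper: float
    unit: str = "mm"

hole = PMIAnnotation("face_27", "diameter", 10.0, -0.05, 0.05)

# A downstream consumer (CAM, CMM inspection planning, a supplier portal) can read the
# values directly - no human has to re-interpret what a text note in a view "really meant".
print(f"Check {hole.characteristic} on {hole.feature_ref}: "
      f"{hole.nominal} {hole.unit} ({hole.tol_lower:+}/{hole.tol_upper:+})")
```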

We realized that human beings indeed are often the blocking reason why new ways of working cannot be introduced. Twenty-five years ago we had the discussion about moving from 2D to 3D for design. Now, due to the maturity of the solutions and the education of new engineers, this is no longer an issue. Now we are in the next wave, using the 3D model as the base for manufacturing definition, and again a new mindset is needed.

There are a few challenges here:

  • MBD is still in progress – standards like AP242 still need enhancements
  • There is a lack of visibility on real reference stories to motivate others.
    (Vendor-driven stories often are too good to be true or too narrow in scope)
  • There is no education for (modern) business processes related to product development and manufacturing. Engineers with new skills are dropped in organizations with traditional processes and silo thinking.

Educate, or our brain will block the future!

The above points need to be addressed, and here the human brain comes into the picture again. Our unconscious, reptilian brain is continuously busy spending the least amount of energy, as described in Thinking, Fast and Slow. Currently, I am reading The Idiot Brain: What Your Head Is Really Up To by Dean Burnett, another book confirming that our brain is not a logical engine making wise decisions.

And then there is the Dunning-Kruger effect, explaining that people with the lowest skills often have the most outspoken opinions and are not even aware of this flaw. We see this phenomenon now in particular on social media, where people push their opinions as if they were facts.

So how can we learn the new model-based approaches? Here I mean all the model-based aspects I have discussed recently, i.e., Model-Based Systems Engineering, Model-Based Definition / Model-Based Enterprise and the Digital Twin. We cannot learn them from a book, as we are entering a new era.

First, you might want to understand there is a need for new ways of working related to complex products. If you have time, listen to Xin Guo Zhang’s opening keynote with the title: Co-Evolution of Complex Aeronautical Systems & Complex SE. It takes 30 minutes, so force yourself to think slowly and comprehend the message related to the needed paradigm shift from systems engineering towards model-based systems engineering.

Also, we have to believe that model-based is the future. If not, we will find for every issue on our path a reason not to work toward the ultimate goal.

You can see this in the comments of my earlier post on LinkedIn, where Sami Grönstrand writes:

I warmly welcome the initiative to “clean up” these concepts (It is time to clean up our model-based problem) and, above all, await to see live examples of transformations — even partial — coupled with reasonable business value identification.

There are two kinds of amazing places: those you have first to see before you can believe they exist.
And then those kinds that you have to believe in first before you can see them…

And here I think we need to simplify and enhance the Model-Based myth because, according to Yuval Harari in his book Sapiens, the power of the human race came from creating myths to align people and have long-term, forward-looking changes accepted by our reptilian brain. We are designed to believe in myths – therefore the need for a Model-Based myth. In my post PLM as a myth? from 2017, I discussed this topic in more detail.

Conclusion

There are so many proof points that our human brain is not as reliable as we think it is. Not knowing about these effects makes it even harder to make progress towards a digital future. This post, with all its embedded links, can keep your brain active for a few hours. Try it: avoid thinking fast and avoid assuming you know it all. Your thoughts?

 

Learning & Discussing more?
Still time to register for CIMdata PLM Roadmap and PDT Europe

 

 

 


At this moment we are in the middle of the year – usually a quiet time for me and a good time to reflect on what has happened so far and to look forward.

Three themes triggered me to write this half-year:

  • The changing roles of (PLM) consultancy
  • The disruptive effect of digital transformation on legacy PLM
  • The Model-driven approaches

A short summary per theme here with links to the original posts for those who haven’t followed the sequence.

The changing roles of (PLM) consultancy

Triggered by Oleg Shilovitsky’s post Why traditional PLM ranking is dead. PLM ranking 2.0, a discussion started related to the changing way PLM is selected and the changing role of a consultant. Oleg and I agreed that using the word dead in a post title is a way to catch extra attention. And as many people do not read more than the introduction, this is a way to frame ideas (not invented by us – look at your newspaper and social media posts). Please take your time and read this post till the end.

Oleg and I concluded that the traditional PLM status reports provided by consultancy firms are no longer relevant. They focus on the big vendors in a status quo, and most of them are 80% the same in their core PLM capabilities. The challenge comes in how to select a PLM approach for your company.

Here Oleg and I differ in opinion. I am more looking at PLM from a business transformation point of view, how to improve your business with new ways of working. The role of a consultant is crucial here as the consultant can help to formalize the company’s vision and areas to focus on for PLM. The value of the PLM consultant is to bring experience from other companies instead of inventing new strategies per company. And yes, a consultant should get paid for this added value.

Oleg believes more in the bottom-up approach where new technology will enable users to work differently and empower themselves to improve their business (without calling it PLM). More or less concluding there is no need for a PLM consultant as the users will decide themselves about the value of the selected technology. In the context of Oleg’s position as CEO/Co-founder of OpenBOM, it is a logical statement, fighting for the same budget.

The discussion ended during the PLMx conference in Hamburg, where Oleg and I met with an audience recorded by MarketKey. You can find the recording Panel Discussion: Digital Transformation and the Future of PLM Consulting here.
Unfortunately, like many discussions, there was no conclusion. My conclusion remains the same – companies need PLM coaching!

The related posts to this topic are:

 

The disruptive effect of digital transformation on legacy PLM

A topic that I have discussed over the past two years is that current PLM is not compatible with modern, data-driven PLM. Note: data-driven PLM is still “under development”. Where in most companies the definition of the products is stored in documents/files, I believe that in order to manage the complexity of products, hardware and software in the future, there is a need to organize data around models, not files. See also: From Item-centric to model-centric?

For a company it is extremely difficult to have two approaches in parallel as the first reaction is: “let’s convert the old data to the new environment”.

This statement has proven impossible in most of the engagements I am involved in, and here I introduced the bimodal approach as a way to keep the legacy going (mode 1) and scale up the new environment (mode 2).

A bimodal approach is sometimes acceptable when the PLM software comes from two different vendors. Sometimes this is also called the overlay approach – the old system remains in place, and a new overlay is created to connect the legacy PLM system and potentially other systems like ALM or MBSE environments. For example, some of the success stories for Aras complementing Siemens PLM.

Like the bimodal approach, the overlay approach creates the illusion that in the near future the old legacy PLM will disappear. I partly share that illusion if you consider the near future a period of 5–10+ years, depending on the company’s active products. Faster is not realistic.

And related to bimodal, I now prefer to use the terminology used by McKinsey: our insights/toward an integrated technology operating model in the context of PLM.

The challenge is that PLM vendors are reluctant to support a bimodal approach for their own legacy PLM, as then suddenly the vendor becomes responsible for all connectivity between mode 1 and mode 2 data – every vendor wants to sell only the latest.

I will elaborate on this topic during the PDT Europe conference in Stuttgart – Oct 25th. No posts on this topic this year (yet), as I am discussing, learning and collecting examples from the field. What kept me relatively busy was the next topic:

The Model-driven approaches

Most of my blogging time I spent on explaining the meaning behind a modern model-driven approach and its three main aspects: Model-Based Systems Engineering, Model-Based Definition and Digital Twins. As some of these aspects are still in the hype phase, it was interesting to see two different opinions popping up. On one side, people claim the world is still flat (2D), considering model-based approaches just another hype caused by the vendors; there is apparently no need for digital continuity. If you look into the reactions from certain people, you might conclude it is impossible to have a dialogue – throwing opinions at each other is not a discussion.

One of the reasons might be that the people reacting strongly have never experienced model-based efforts in their lives and just chime in, or they might have a business reason not to agree with model-based approaches as they do not align with their business. It is like the people benefiting from the climate change theory – will they vote against it when the facts are known? Just my thoughts.

There is also another group, to which I am connected, that is quite active in learning and formalizing model-based approaches, in order to move forward towards a digital enterprise where information is connected and flows between various models (behavior models, simulation models, software models, 3D models, operational models, etc.). This group of people is discussing standards and how to use and enhance them. They discuss and analyze with arguments and share lessons learned. One of the best upcoming events in that context is the joint CIMdata PLM Road Map EMEA and PDT Europe 2018 – look at the agenda following the image link, and you should get involved too – if you really care.

 

And if you are looking in your agenda for a wider, less geeky type of conference, consider the PI PLMx CHICAGO 2018 conference on Nov 5 and 6. The agenda provides a wider range of sessions; however, I am sure you can find people interested in discussing model-based learnings there too, in particular in Stream 2: Supporting the Digital Value Chain.

My related posts to model-based this year were:

Conclusion

I spent a lot of time demystifying some PLM-related themes. The challenge remains, as in the non-PLM world, that it is hard to get educated by blog posts, as you might get over-informed by (vendor-related) posts all surfing somewhere on the hype curve. Do not look at the catchy title – investigate and take time to understand HOW things will work for you or your company. There are enough people explaining WHAT they do, but HOW it fits in a current organization needs to be solved first. Hence the above three themes.

I was planning to complete the model-based series with a post related to the digital twin. However, I did not find the time to organize my thoughts into a structured story. Therefore, this time I will share some topics I am working on.

Executive days at CADCAM Group

Last week I supported the executive days organized by the CADCAM Group in Ljubljana and Zagreb. The CADCAM Group is a large PLM solution and services provider (60+ employees) in South-East Europe, with offices in Croatia, Slovenia, Serbia, and Bosnia and Herzegovina. They operate in a challenging region: four relatively young countries with historically more of an inward focus than a global focus. Many of CADCAM Group’s customers are in the automotive supply chain, and to stay significant in the future, they need to understand and develop a strategy that will help them move forward.

My presentation was related to the learning path each company has to go through to understand the power of digital, combined with the observation that current and future ways of working are not compatible, therefore requiring a scaled and bimodal approach (see also PDT Europe further down this post).

This presentation matched nicely with Oscar Torres’s presentation related to strategy. You need to decide on the new things you are going to do, what to keep and what to stop. Sounds easy, and of course the challenge is to define what to start, stop and keep. For that you need good insights into your current and future business.

Pierre Aumont completed the inspiring session by explaining how the automotive industry is being disrupted, and it is not only Tesla; many other companies are challenging the status quo of the big automotive OEMs. Croatia has its own innovator for electric vehicles too, i.e., Rimac. Have a look here.

The presentations were followed by a (long) panel discussion. The common theme in both discussions was that companies need to educate and organize themselves for the future. New technologies and new ways of working need time and resources, which small and medium enterprises often do not have. Therefore, universities, governments and interest groups are crucial.

A real challenge for countries that do not have an industrial innovation culture (yet).

CADCAM Group, as a catalyst in these countries, understands this need and organizes these executive days to address it. Now the challenge, after these inspiring days, is to find the people and energy to follow up.

Note: CADCAM Group graciously covered my expenses associated with my participation in these events but did not in any way influence the content of this paragraph.

 

The MBD/MBE discussion

In my earlier post, Model-Based: Connecting Engineering and Manufacturing, I went deeper into the MBD/MBE topic and its potential benefits, closing with the request to readers to add their experiences and/or comments on MBD/MBE. Luckily there was one comment, from Paul van der Ree, who had challenging experiences with MBD in the Netherlands. Together with Paul and an MBD advocate (to be named), I will try to have a discussion analyzing the pros and cons from all viewpoints and hopefully come to a common conclusion.

This is to avoid proponents and opponents of MBD just repeating their viewpoints without trying to converge. Joe Brouwer is famous for his opposition to MBD. Whether he is right or wrong I cannot say, as there has never been a real discussion. Click on the above image to see Joe’s latest post yourself. I plan to come back with a blog post related to the pros and cons.

 

The Death of PLM Consultancy

Early this year Oleg Shilovitsky and I had a blog debate related to the “Death of PLM Consultancy”. The discussion started here: The Death of PLM Consultancy? – and a follow-up post was PLM Consultants are still alive and have an exit strategy. It could have been an ongoing blog discussion for months, where the value would be to get responses from the readers of our blogs.

Therefore I was very happy that MarketKey, the organizers behind the PLMx conferences in Europe and the US, agreed on a recorded discussion session during PLMx 2018 in Hamburg. Paul Empringham moderated the discussion, with approx. 10–12 participants in the room joining in. You can view the discussion through this link: PLMx Hamburg debate.

I want to thank MarketKey for their support and look forward to participating in their upcoming PLMx European event. And if you cannot wait till next year, there is the upcoming PLMx conference in North America on November 5th and 6th – click on the image on the left to see the details.

 

 

PDT Europe call for papers

As you might have noticed I am a big supporter of the joint CIMdata/PDT Europe conference. This year the conference will be in Stuttgart on October 24th (PLM Roadmap) and October 25th (PDT).

I believe that this conference has a more “geeky” audience and goes into topics of PLM that require a good base understanding of what’s happening in the field. Not a conference for a newcomer to the world of PLM, but more for an experienced PLM person (inside a company or from the outside) who has experience with challenging topics, like changing business processes, deciding on new standards, and how to move to a modern digital business platform.

It was at these events that concepts such as Model-Based were discussed in depth, along with the need for Master Data Management, industry standards for data exchange and, two years ago, the bimodal approach, also valid for PLM.

I hope to elaborate on experiences related to this bimodal or phased approach during the conference. If you or your company wants to contribute to this conference, please let the program committee know. There is already a good set of content planned. However, one or two inspiring presentations from the field are always welcome.
Click on this link to apply for your contribution

Conclusion

There is a lot ongoing related to PLM, as you can see. As I mentioned in the first topic, it is about education and engagement. Be engaged – I am looking forward to your response and contribution on one or more of the topics discussed.

This post is my two-hundredth blog post, and this week it is exactly ten years ago that I started blogging about PLM.

The world was quite different at that time. Global connectivity was starting to become visible; digital transformation and the digital twin were not yet hypes. I remember 2008 as the year when I was advocating for PLM practices to be adopted by small and medium enterprises (the initial goal of setting up this blog) and later explaining PLM practices to people in industries that were not even thinking about these terms (Engineering, Procurement & Construction companies, the construction industry in general, and owners/operators of process plants (nuclear, energy, chemical)).

Over the past 5 years you will recognize a shift more towards the people side of PLM (what does PLM mean for, and how does it impact, my daily life and my organization), and what makes sense or nonsense of the new hypes, mainly about the potential and risks related to becoming a digital enterprise. I learned and discussed these themes mostly with larger enterprises, as usually they cannot change that fast; therefore they have to be on the lookout for threats and trends earlier.

Ten years ago I did not expect to blog for such a long time, and I do not expect to keep on blogging for another 10 years. However, as the future cannot be predicted, for the moment I will continue, based on observations and experiences from being in the field.

Conclusion

Below you will find my first blog post from ten years ago. As you might discover after reading it, the world of PLM is not changing fast – or is it? What is your opinion?

Next post I will continue my series related to the term model-based.


A Virtual Dutchman’s introduction
(May 22nd 2008)

Virtual Dutchman

Why Virtual? This is my first post, and in the future I will update you about my experiences in the world of PLM. For those of you not familiar with PLM, I suggest searching for the definition on the web; you will find many almost similar definitions – a neutral one can be found on Wikipedia. The main goal behind PLM is that by managing all steps of the product lifecycle, from concept through development and even until destruction, a company will be able to optimize and integrate all steps and information. This, combined with best practices on how to develop, release and benefit from customer feedback, will lead to higher revenues and a more competitive position for such a company.

Most of the PLM software companies provide their solutions around a 3D CAD system, as the 3D CAD model is the understandable representation of a product. Here we see the virtual products, and with analysis and simulation software we can test these products even before they are produced. Mobile phones undergo virtual crash tests; cars crash virtually and as I learned, even diapers are tested virtually.

Some PLM companies, like Dassault Systèmes and Siemens UGS, go even beyond the 3D CAD and integrate the whole manufacturing process, initially through software, to provide a virtual production process. This allows companies to fix (virtual) errors in the production process and the prototype even before a single product is manufactured in the real world. The time and cost savings of this virtualization allow companies to respond faster and better than their competitors. This change to define a complete virtual product and production process is costly and only affordable for big enterprises, but for sure this trend will continue.

With the introduction of PLM 2.0, Dassault Systèmes even introduced another extension to PLM: the involvement of the customer, experiencing the virtual product before it even exists. The 2.0 version is a reference to Web 2.0, where web content is influenced by the consumer. In the same analogy, PLM 2.0 brings the world of product design under the direct influence of the customer, where in the past customers could only review and select from existing products.

Look at the See What You Mean movie.

A virtual world seems to be a future trend, with possibly virtual consumers. Currently, the trend to virtualization can be compared with teenage sex; everyone talks about it but …….

As a Dutchman working in the real world, I am aiming to become a virtual Dutchman. This would allow me to experience things I have never done or dared before. But before reaching this goal, I will entertain you with my observations around PLM and look forward to real discussions.


 

PDT Europe is over, and this year it was a surprisingly aligned conference, showing that ideas and concepts align more and more for modern PLM. Håkan Kårdén opened the conference, mentioning the event was fully booked, with about 160 attendees from over 19 countries. With a typical attendance of approx. 120 participants, this showed that the theme of the conference – Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise – was very attractive and real. You can find a history of tweets by following the hashtag #pdte17.

Setting the scene

Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people’s skills are more critical factors for a successful adoption of modern PLM – something that would repeatedly be confirmed by other speakers during the conference.

Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through the various digital threads in the different stages of the product lifecycle. Peter concluded with the message that we are still in a learning process, redefining optimal processes for PLM using Model-Based approaches and Digital Threads, and that thanks to (or due to) digitalization these changes will be rapid. He ended with an overall conclusion that we should keep in mind:


It isn’t about what we call digitalization; It is about delivering value to customers and all other stakeholders of the enterprise

Next, Marc Halpern busted the Myth of Digital Twins (according to his session title) and looked into realistically planning for them. I am not sure if Marc smashed some of the myths, although it is certain the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances, depending on its usage. For sure it is not just a 3D virtual model.

There are still many areas to consider when implementing a digital twin for your products. Depending on what you connect between the virtual and the physical model, and how, you have to consider where your vendor really is in maturity and avoid lock-in on their approach. In particular in these early stages, you are not sure which technology will last, and data ownership and confidentiality will play an important role. And rather than going for quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.

Industry sessions

Next, we had industry-focused sessions related to a lifecycle Model-Based enterprise and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service opportunities.

The first presentation, from Väino Tarandi, professor in IT in Construction at KTH Sweden, presented his findings related to BIM and GIS in the context of the lifecycle – a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which were already an operational challenge for stakeholders in the construction industry, now extended with the concept of the lifecycle. So far, these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.

Concepts for the industrial approach could be learned from Outotec, as you will understand later in this post. Of course, the difference is that Outotec is aiming for data ownership along the lifecycle, whereas in the construction industry each silo is often handled by a different contractor.

Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport – see the image on the left. I have worked in this domain in the Netherlands, where asset management for road infrastructure and asset management for rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Again, Outotec’s example is also about having relevant information to increase service capabilities, and the Swedish Transport Administration is aiming to have the right data for its services. When you look at the challenges reported by Fredrik, I assume he can find the answers in other industries’ concepts.

Outotec’s presentation related to managing the installed base and unlocking service opportunities, given by Sami Grönstrand and Helena Gutierrez, was, besides entertaining, easy-to-digest and well-paced. Without being academic, they explained the challenges of a company with existing systems in place moving towards digital twin concepts and the related data management and quality issues. Their practical example illustrated that if you have a clear target – better understanding a customer-specific environment in order to sell better services – it can be achieved by rational thinking and doing, a typical Finnish approach. This all included the “bimodal approach” and people change management.

Future Automotive

Ivar Hammarstadt, Senior Analyst Technology Intelligence at Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.

 

Next, Jeanette Nilsson and Daniel Adin from Volvo Group Trucks shared their findings from an evaluation project of more than one year, in which they evaluated the major PLM vendors (Dassault Systèmes / PTC / Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.

They concluded that none of the vendors were able to support the full Volvo Trucks complexity in an OOTB manner. It was also a good awareness project for the Volvo Trucks organization to understand that a common system for 3D geometry reduces the need for data transfers and manual data validation. Cross-functional iterations can start earlier, and more iterations can be performed. This will support a shortening of lead time and improve product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.

Future Aerospace

Kenny Swope from Boeing talked us through the potential Boeing journey towards a Model-Based Enterprise. Boeing has always challenged itself and its partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.

To approach the future state, Boeing aims to align its business with a single architecture for all aspects of the company. Starting with collecting capabilities (over 400 in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process is to look at the components of a value stream and see whether they could be fulfilled by a service. In this way, you design your business for a service-oriented architecture, still independent of any system constraints. As Kenny states, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to hear from Kenny next year how much (mandatory) progress towards a model-based enterprise has been achieved and which values have been confirmed.

Gearing up for day 2

Martin Eigner took us in high-speed mode through his vision and experience of working in a bi-modular approach with Aras to support legacy environments, with a modern federated layer to support the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.

The last formal presentation of day one was from Nigel Shaw of Eurostep Ltd, who took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines everything, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel’s slide showing the multiple models different disciplines can have of an airplane (1948) – similar to the famous “swing” cartoon, used to illustrate that every single view can be entirely different from the purpose of the product.

The next question is whether these models are consistent and still describe the same initially specified system. On top of that, even the use of various modeling techniques and tools will lead to differences in the system. And the final challenge on top is managing the changes over the system’s lifecycle. From here, Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only between the physical and the virtual twin. When comparing the needs of a model-based enterprise through its lifecycle, Nigel concluded that using PLCS as a framework provides an excellent fit to manage such complexity.

Finally, after a panel discussion, which was more a collection of opinions as the target was not necessarily to align in such a short time, it was time for the PDT dinner – always an excellent way to share thoughts and verify them with your peers.

Conclusion

Day 1 was over before you knew it, without a moment of boredom – and so, I hope, is this post. Next week I will close my review of the PDT conference with some more details about my favorite topics.

 

At this moment there are two approaches to implementing PLM. The most common practice is item-centric; model-centric will potentially be the best practice for the future. Perhaps your company is still using a method from the previous century called drawing-centric. In that case, you should read this post with even more attention, as there are opportunities to improve.

 

The characteristics of item-centric

In an item-centric approach, the leading information carrier is an item, also known as a part. The term part is sometimes confusing in an organization, as it is associated with a 3D CAD part. In SAP terminology the item is called Material, which is sometimes confusing for engineering, as they consider Material to be the raw material. Item-centric is an approach where items are managed and handled through the whole lifecycle. In theory, an item can be a conceptual item (for early estimates), a design item (describing the engineering intent), a manufacturing item (defining how an item is consumed) and potentially a service item.

The picture below illustrates the various stages of an item-centric approach. Don’t focus on the structure, it’s an impression.

It is clear these three structures are different and can contain different item types. To read more about the details of an EBOM/MBOM approach, read these posts on my blog:

Back to item-centric. This approach means that the item is the leading authority for the product/part. The ID and revision describe the unique object in the database, and the status of the item tells you the current lifecycle stage of the item. In some cases, where your company makes configurable products, the relation between two items can also define effectivity characteristics, like date effectivity, serial number effectivity and more. From an item structure, you can find its related information in context. The item points to the correct CAD model, the assembly or related manufacturing drawings, and the specifications. In the case of an engineering item, it might point towards approved manufacturers or approved manufacturing items.

Releasing an item or a BOM means the related information in context needs to be validated and frozen too. In case your company works with drawings for manufacturing, these drawings need to be created, corrected and released, which can sometimes be an issue due to last-minute changes. The above figure just gives an impression of the potential data related to an item. It is important to mention that reports, which are also considered documents, do not need approval, as they are more a snapshot of the characteristics at the moment of generation.
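To summarize the mechanics described above, here is a minimal sketch (Python, with illustrative names only, not the schema of any particular PLM system) of an item as the leading authority, where release is blocked until its related information is approved:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RelatedFile:
    name: str                      # e.g. a CAD model, drawing or specification
    status: str = "In Work"

@dataclass
class Item:
    """The leading information carrier in an item-centric approach."""
    item_id: str
    revision: str                  # item_id + revision identify the unique object
    status: str = "In Work"        # lifecycle stage: In Work / Released / Obsolete
    effective_from: Optional[str] = None   # e.g. date effectivity for configurable products
    related: List[RelatedFile] = field(default_factory=list)

    def release(self) -> None:
        """Releasing the item freezes the related information in context too."""
        not_ready = [f.name for f in self.related if f.status != "Approved"]
        if not_ready:
            raise ValueError(f"Cannot release {self.item_id}: not approved: {not_ready}")
        self.status = "Released"

item = Item("100-200", "B", related=[RelatedFile("100-200.prt", "Approved"),
                                     RelatedFile("100-200_drawing.pdf", "Approved")])
item.release()
```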

The advantages of an item-centric approach are:

  • End-to-end traceability of information
  • Can be implemented in an evolutionary approach after PDM-ERP without organizational changes
  • It enables companies to support sharing of information
  • Sharing of information forces companies to think about data governance
    (not sure if a company wants to invest in that topic)

The main disadvantages of an item-centric approach are:

  • Related information on the item is not in context and therefore requires its own management and governance to ensure consistency
  • Related information is contained in documents, where availability and access is not always guaranteed

Still, the item-centric approach brings big benefits to a company that was working in a classical drawing-driven PDM-ERP approach. An additional remark: not every company will benefit from an item-centric approach, as typically Engineering-to-Order companies might find this method creates too much overhead.

The characteristics of Model-Centric

A model-centric approach is considered the future approach for modern enterprises, as it brings efficiency, speed, multidisciplinary collaboration and support for incremental innovation in an agile way. When talking about a model-centric approach, I do not mean a 3D CAD model-centric approach. Yes, once the product is mature, there will be a 3D model serving as a base for the physical realization of the product.

However, in the beginning, the model can still be a functional or logical model. In particular for complex products, model-based systems engineering might be the base for defining the solution. Actually, when we talk about products that interact with the outside world through software, we tend to call them systems. This explains why model-based systems engineering is more and more a recommended approach to make sure the product works as expected, fulfills all the needs, and creates a foundation for incremental innovation without starting from scratch.

Where the model-based architecture provides a framework for all stakeholders, the 3D CAD model will be the base for a digital thread towards manufacturing. By linking parameters from the logical and functional model to the physical model, a connection is created without the need to create documents or input files for other disciplines. Adding 3D annotations to the 3D CAD model and relating manufacturing process steps to the model provides a direct connection to the manufacturing process.

The primary challenge of this future approach is to have all these data elements (requirements, functions, components, 3D design instances, manufacturing processes & resources) connected in a federated environment (the product innovation platform). Connecting, versioning and baselining are crucial for a model-centric approach. This is what initiatives like Industry 4.0 are now exploring through demonstrators and prototypes to get a coherent collection of managed data.
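A tiny sketch of what these connected data elements can mean in practice: a requirement linked through a function, a part and a design instance down to a manufacturing operation, so that change impact becomes visible by traversing links instead of studying documents (Python, with hypothetical element names; a real product innovation platform would add versioning and baselining on top of every element and link):

```python
# A toy federated network: data elements from different models linked in context.
links = {
    "REQ-21 (max. weight 1.2 kg)":   ["FUNC-7 (provide housing)"],
    "FUNC-7 (provide housing)":      ["PART-88 (housing, aluminium)"],
    "PART-88 (housing, aluminium)":  ["CAD-88-A (3D design instance)"],
    "CAD-88-A (3D design instance)": ["OP-30 (milling operation)"],
}

def impact(element: str, graph: dict) -> list:
    """Walk the links downstream to show change impact without opening a single document."""
    result = []
    for child in graph.get(element, []):
        result.append(child)
        result.extend(impact(child, graph))
    return result

print(impact("REQ-21 (max. weight 1.2 kg)", links))
# -> the function, part, design instance and manufacturing operation affected by the change
```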

Once we are able to control this collection of managed data, concepts like the digital twin or even the virtual twin can be exploited, linking data to a single instance in the field.

Also, the model can serve as the foundation for introducing incremental innovation, bringing in new features. As the model-based architecture provides direct visibility of change impact (there are no documents to study), it will be extremely lean and cost-efficient to innovate on an existing product.

Advantages of model-centric

  • End-to-end traceability of all data related to a product
  • Extremely efficient in data-handling – no overhead on data-conversions
  • Providing high-quality understanding of the product with reduced effort compared to drawing-centric or item-centric approaches
  • It is scalable to include external stakeholders directly (suppliers/customers) leading to potential different, more beneficial business models
  • Foundation for Artificial Intelligence at any lifecycle step.

Disadvantages of model-centric

  • It requires a fundamentally different way of working compared to the past. Legacy departments, legacy people, and legacy data do not fit directly into the model-centric approach. A business transformation is required, not an evolution.
  • It is all about sharing data, which requires an architecture built to share information across disciplines – not through a service bus but as a (federated) platform of information.
    A platform requires strong data governance, both for the data dictionary and for the authorizations defining which discipline is leading or following.
  • There is no qualified industrial solution from any vendor at this time. There is advanced technology, there are demos, but to my knowledge there is no 100% model-centric enterprise yet. We are all learning, trying to distinguish reality from the hype.

 

Conclusions

The item-centric approach is the current best practice for most PLM implementations. However, it has the disadvantage that it is not designed for a data-driven approach, the foundation of a digital enterprise. The model-centric approach is new; some facets already exist. However, for the total solution, companies, vendors, consultants and implementers are all learning step by step how it all connects. The future of model-centric is promising and crucial for survival.

Do you want to learn where we are now related to a model-centric approach?
Come to PDT2017 in Gothenburg on 18-19th October and find out more from the experts and your peers.

During my summer holidays, I read some fantastic books to relax the brain. Confessions by Jaume Cabré was an impressive novel, and I finished Yuval Noah Harari’s book Sapiens.

However, to get my PLM-twisted brain back on track, I also decided to read the book “The Death of Expertise” by Tom Nichols, with the thought-provoking subtitle: “The Campaign Against Established Knowledge and Why it Matters.”

I wanted to read it and understand if and how this would apply to PLM.

Tom Nichols is an American, so you understand he has many examples from his own experience to support his statement, like the anti-vaccination “experts”, the climate change “hoax” and an “expert” tweeting president in his country who knows everything. Besides these obvious examples, Tom explains in a structured way how, due to broader general education and the internet, the distance between an expert and an average person has disappeared, and facts and opinions seem to have become interchangeable. I talked about this phenomenon during the Product Innovation conference in Munich in 2016: The PLM identity crisis.

Further along in the book, Tom becomes a little grumpy and starts to complain about the internet, Google and even Wikipedia. These information resources so often provide fake or skin-deep information, not scientifically proven by experts. It reminded me of a conference that I attended in the early nineties of the previous century. An engineering society had organized this conference to discuss the issue that finite element analysis was becoming more and more available to laymen. Affordable simulation software would be used by untrained engineers, and they would make the wrong decisions: constructions would fall down, machines would fail. Looking back now, we can see that the liberation of finite element analysis has led to more usage of simulation technology, providing better products, and when really needed, experts are still involved.

I have the same opinion about the internet, Google and Wikipedia. They provide information rapidly. Still, you need to do fact-checking and look at multiple sources, even if you have already found the answer you liked. Usually, when I do my “research” using the internet, I try to find different sources with different opinions and, if possible, also from various countries. What you will discover is that when using the internet, there is often detailed information, but not in the headlines of these pages. To get down to the details, we will need experts for certain cases, but we cannot turn the clock back to the previous century.

What about PLM Expertise?

In the case of PLM, it is hard to find real expertise. Although PLM is recognized as a business strategy / a domain / an infrastructure, PLM has so many faces depending on the industry and its application. It is hard to find an expert who understands it all, and I assume headhunters can confirm this. A search for “PLM Consultant” on LinkedIn gives me almost 4,000 hits, and when searching for “PLM Expert,” this number is reduced to fewer than 200. With only one source of information (LinkedIn), these figures do not really give an in-depth result (as expected!).

However, what is a PLM expert? Recently I wrote a post sharing the observation that a lot of PLM product- or IT-focused discussions miss the point of education (see PLM for Small and Medium Enterprises – It is not the software). In this post, I referred to an initiative from John Stark striving for the recognition of a PLM professional. You can read John’s follow-up on this activity here: How strong is the support for Professional PLM? Would a PLM Professional bring expertise?

I believe that when a company understands the need for PLM, it has to build this knowledge internally. Building knowledge is a challenge for small and medium enterprises. It is a long-term investment contributing to the viability of the company. Support from a PLM professional can help. However, like the job of a teacher, it is about the skill set (subjects, experience) and the motivational power of such a person. A certificate alone won’t help to select a qualified person.

Conclusion

We still need PLM expertise, and it takes time to build it. Expertise is something different from an (internet) opinion. When gaining PLM expertise, use the internet and other resources wisely. Do not go for the headlines of an internet page. Go deeper than the marketing pages of PLM-related companies (vendors/implementers). Take time and hire experts to help you, not to relieve you of your responsibility to collect the expertise.

 

Note: If you want to meet PLM experts and get a vendor-independent taste of PLM, join me at PDT Europe 2017 on 18-19 October in Gothenburg. The theme of the conference: Continuous transformation of PLM to support the Lifecycle Model-Based Enterprise. The conference is preceded on 17th October by CIMdata’s PLM Roadmap Europe 2017. Looking forward to meeting you there!

 

 

Recently I connected with a fellow countryman, Flip, through LinkedIn, and we had a small dialogue related to PLM. Flip describes himself as a millennial thinking out loud about PLM and shared some of his thoughts trying to define “the job of PLM.” Instead of keeping it a Dutch dialogue, I would like to open the dialogue to all (millennials), as we need a new generation of PLM consultants.

Point 1

(Flip) You cannot automate design activities easily, but the rest you can. Isn’t PLM an evolution of 3D design tooling (and with that the next step in design theory)?

You are right. Historically, PLM originated from managing 3D design in a collaborative manner, although at that time we would call it cPDM (Collaborative Product Data Management). PDM was very design-focused. However, PDM also supported the connection to an Engineering Bill of Materials (EBOM) and connected engineering change processes (Engineering Change Request / Engineering Change Order – read more: ECR/ECO for Dummies).

PTC’s Windchill was the first modern cPDM software that still exists. At the same time, Dassault Systèmes and Siemens extended the support for design towards manufacturing planning and execution, introducing the term PLM (Product Lifecycle Management). In the following years, PLM systems started to support the full go-to-market lifecycle, as the figure below shows.

lifecycle

This linear go-to-market process is currently rapidly changing because PLM is changing.

The P, standing for Product, now represents a System (hardware & software interacting with the environment). The L, standing for Lifecycle, is also under change.

Support for the lifecycle of a “product” has changed in two ways. First, the lifecycle is no longer a linear process, but more iterative and incremental for the same “product.” Secondly, the lifecycle is stretched to support the “products” in the field thanks to feedback from sensors (IoT – Internet of Things). That’s why PTC now claims IoT is PLM. Read more: Best Practices or Next Practices.

Finally, the M for Management is under change, as thanks to a data-driven approach we should be able to (semi-)automate processes using algorithms. Favorite buzzwords here are machine learning, cobots (collaborative robots) and preventive actions thanks to data analysis & trends.

Point 2

(Flip) Storing data in a structured manner creates more complexity (you need to choose what to store). With simulation, complexity could be reduced to make meaningful (design) decisions – so is PLM about clever data hoarding?

I believe there is always a challenge with managing structured data, for two reasons. People often create only the data they require themselves. Adding more data or a richer context is considered “extra work” for which the department is not rewarded, or it simply does not happen because these persons do not know the future use of their information. Addressing this is a typical exercise for companies now engaging in a digital transformation (read more: The importance of accurate data).

When you talk about simulation, I immediately think of the current trend to work towards a model-based enterprise, where the model is the center of all information. And by the model we do not mean only the 3D model, but also the functional and logical models, which we can simulate. (Read more: Digital PLM requires a Model-Based Enterprise)

Point 3

(Flip) Automation of manufacturing, with more and more resources, requires new ways to drive manufacturing – so can a team of 8 people do the work of 80 people through a PLM system?

Here you are addressing exactly the point that initiatives like Industry 4.0, or Smart Industry in the Netherlands, are targeting. Instead of a linear, document-driven process, where at each step new versions of information need to be created, the dream is to work around a model (the model-based enterprise).

The idea is that data flows through the organization – digital continuity / the digital thread – without conversion, and that by using algorithms and machine learning, data is consumed and created during the manufacturing process in an automated manner. This indeed reduces the number of people involved drastically.

I am not sure if we would still call this PLM; it is more a digital enterprise, where digital platforms interact. PLM could be considered the source for the Product Innovation Platform, but there will also be execution platforms (with ERP and MES as the main sources) and customer-related platforms (with CRM as a source). As vendors of all these platforms will provide overlapping functionality, it will be hard to draw exact lines. The main goal for a company will be that the data flows and is not locked into a proprietary format or system. And here we still have a lot of work to do.
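As an illustration of what “data that flows” could mean in practice, here is a small, hedged sketch – the identifiers and fields are purely hypothetical, not any vendor’s data model – of a product item and its EBOM lines expressed in a neutral, granular format that any platform (innovation, execution or customer-related) could consume without a document conversion in between.

```python
import json

# Hypothetical, vendor-neutral record of an item and its EBOM lines.
# The point is not the format itself, but that the data stays granular
# and readable for every platform instead of being locked in a file.
item = {
    "item_id": "PRT-001234",
    "revision": "B",
    "description": "Housing, aluminium",
    "lifecycle_state": "Released",
    "ebom_lines": [
        {"child_id": "PRT-000789", "quantity": 4, "find_number": 10},
        {"child_id": "STD-000012", "quantity": 8, "find_number": 20},
    ],
}

print(json.dumps(item, indent=2))  # consumable by ERP, MES or CRM alike
```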

Conclusion

No conclusion this time as it is an on-going dialogue. Feel free to comment or send your questions, and we can all learn from the dialogue (always better than a monologue).

Your thoughts?

My last blog post was about the reasons why PLM is not simple. PLM supporting a well-planned business transformation requires business change and new ways of working. PLM is going through different stages: we are moving from drawing-centric (previous century), through BOM-centric (currently), towards model-centric (current and future). You can read the post here: PLM is not simple!

I was happy to see my blog buddy Oleg Shilovitsky chime in on this theme with his post: Who needs Simple PLM? Oleg reviewed the stakeholders around a PLM implementation – an analytical approach which could be correct if predictable human beings were involved. Since human beings are not predictable, and my focus is on the combination of PLM and human beings, here are some follow-up comments on the points Oleg made:

 

Customers (Industrial companies)

Oleg wrote:

A typical PLM customer isn’t a single user. A typical PLM buyer is engineering IT organization purchasing software to solve business problem. His interest to solve business problem, but not really to make it simple. Complex software requires more people, an increased budget and can become an additional reason to highlight IT department skills and experience. End-users hate complex software these days, therefore, usability is desired, but not top priority for enterprise PLM.

My comments on this part: PLM is becoming more and more an infrastructure for product information along the whole lifecycle. PLM is no longer an engineering tool provided by IT.

There are now many other stakeholders that need product data, in particular when we are moving to a digital enterprise. A model-based approach connects Manufacturing and Service/Operations through a digital thread. It is the business demanding PLM to manage its complexity. IT will benefit from a reduction in silo applications.

 

PLM Vendors

Oleg wrote:

…most PLM vendors are far away from a desired level of simplicity. Marketing will like “simple” messages, but if you know how to sell complex software, you won’t be much interested to see “simple package” everyone can sell. However, for the last decade, PLM vendors were criticized a lot for complexity of their solutions, so they are pretty much interested how to simplify things and present it as a competitive differentiation.

 

Here we are aligned. All PLM vendors are dreaming of simplifying their software. Imagine: if you had a simple product everyone could use, you would be the market leader and profitable like crazy without a big effort, as the product is simple. Of course, this only works assuming the dream can be realized.

Some vendors believe that easy customization or configuration of the system means simplification. Others believe a simple user interface is the key differentiator. Compared to mass-consumer software products in the market, a PLM system is still a niche product, with a limited number of users working with the exact same version of the software. Combined with the particular needs (customizations) every company has (“we are different”), there will never be a simple PLM solution. Coming back to the business transformation theme: human beings are the weakest link.

 

Implementation and Service Providers

Oleg wrote:

Complex software, customization, configuration, know-hows, best practices, installation… you name it. More of these things can only lead to more services which is core business of PLM service providers. PLM industry is very much competitive, but simplicity is not a desired characteristic for PLM when it comes to service business. Guess what… customer can figure it out how to make it and stop paying for services.

Here we are totally aligned. In the past, I have been involved in potential alliances where certain service providers evaluated SmarTeam as a potential tool for their business. In particular, the major PLM service providers did not see enough value in an easy-to-configure and relatively cheap product. Cheap means no budget for a large amount of services.

Still, the biggest problem SmarTeam had after ten years was the fact that every implementation became a unique deployment – hard to maintain and to guarantee for the future, in particular when new functionality was introduced that potentially already existed as a customization. Implementation and service providers will never say NO to a customer when it comes to further customization of the system. Therefore, the customer should be in charge and own the implementation. For making strategic decisions, support can come from a PLM consultant or coach.

 

PLM Consultants

Here Oleg wrote:

Complex software can lead to good consulting revenues. It was true many years for enterprise software. Although, most of PLM consultants are trying to distant from PLM software and sell their experience “to implement the future”, simplicity is not a favorite word in consulting language. Customer will hire consulting people to figure out the future and how to transform business, but what if software is simple enough to make it happen without consultant? Good question to ask, but most of them will tell you it is not a realistic scenario. Which is most probably true today. But here is the hint – remember the time PC technicians knew how to configured jumpers on PC cards to make printer actually print something?

Here we are not aligned. Business transformations will never happen because of simple tools. People are measured and pushed to optimize their silos in the organization. A digital transformation, which creates a horizontal flow and transparency of information, will never happen through a tool. The organization needs to change, and this is always driven by a top-down strategy. PLM consultants are valuable for explaining the potential future and for coaching all levels of the organization. In theory, a PLM consultant’s job is tool-independent. However, being completely disconnected from the existing tools might allow for dreams that can never be realized. In reality, most PLM consultants are experienced in one or more specific tools they have been implementing. The customer should be aware of that and make sure they own the PLM roadmap.

My conclusion:

Don’t confuse PLM with a tool, simple or complex. All PLM tools have a common base, and depending on your industry and your company’s vision, there will be a shortlist. However, before you touch the tools, understand your business and the transformation path you want to take. And that is not simple!

 

Your opinion?

Oleg and I can continue this debate for a long time. We would be interested in learning your view on PLM and simplicity – please tune in through the comments section below.
