

In my previous post, I shared my thoughts on Why PLM is the forgotten domain in digital transformation. Legacy data, (legacy) people, and slow organizations are the main inhibitors to moving forward. Moreover, all this legacy makes it hard to jump on the digital wagon.

When you talk with vendors and implementers of PLM solutions, they will all claim that with their solution and support, PLM is simple. It is simple because:

  • We have the largest market share in your industry segment
  • We have the superior technology
  • We are cloud-based
  • We are insanely customizable
  • Gartner is talking about us
  • We have implemented at 100+ similar companies

For my customers, implementing PLM was never simple, as every PLM implementation was driving a business change. In the early days of SmarTeam, we had the theme “We work the way you work”, which is in hindsight a very bad statement. You do not want to automate the way a company currently works. You want to use a PLM implementation to support a business change.

Never implement the past, implement the future

And there are changes ……

When I was discussing PLM with potential customers ten years ago, the world was different. PLM was in transition from being a PDM tool for engineering into an extended PDM tool centered around product development. A major theme for this kind of implementation was to move from a document-driven environment towards an item-centric environment. Instead of managing documents (CAD files and other files like Excel), the implementation was based on providing data continuity, where the item (the physical part, or in SAP terms the material) would be the main information placeholder. The continuity is implemented around EBOMs and MBOMs, and thanks to automation, the MBOM can be connected to the ERP system in a continuous flow.
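As a thought experiment, the item-centric idea can be sketched in a few lines of Python (all names are hypothetical, not taken from any PLM system): the item, not the document, carries identity, and EBOM/MBOM structures reference items, so the MBOM can flow to ERP without re-keying data.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """The item (SAP: material) is the information carrier; documents attach to it."""
    number: str
    description: str
    documents: list = field(default_factory=list)  # CAD files, Excel sheets, etc.

@dataclass
class BomLine:
    item: Item
    quantity: float

class BOM:
    def __init__(self, parent: Item):
        self.parent = parent
        self.lines = []

    def add(self, item: Item, quantity: float = 1.0):
        self.lines.append(BomLine(item, quantity))

def ebom_to_mbom(ebom: BOM) -> BOM:
    """Naive EBOM -> MBOM derivation: same items, restructured for manufacturing.
    Real transformations split/merge assemblies, add phantoms, packaging, etc."""
    mbom = BOM(ebom.parent)
    for line in ebom.lines:
        mbom.add(line.item, line.quantity)
    return mbom

def publish_to_erp(mbom: BOM) -> list:
    """Hand the MBOM to ERP as material records - the 'continuous flow'."""
    return [{"material": l.item.number, "qty": l.quantity} for l in mbom.lines]
```

Because every structure references the same item objects, a change to an item is visible in the EBOM, the MBOM, and the ERP feed at once; there is no document to regenerate.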

Just search for item-centric or BOM-centric, and you will find many references from vendors and consultants for this approach. Implementing PLM in an item-centric manner is already a big step forward in efficiency and quality for companies. However,…

Never implement the past, implement the future

And there will be changes …..


Digital Transformation & PLM on YouTube

Digital transformation is changing the way we do business and the way companies should organize their data. A BOM-centric approach is no longer the ultimate implementation concept. To support a digital enterprise, the next step is a model-based enterprise. The model (not necessarily the 3D model), with its maturity and configurations, is intended to be the reference for an organization. The model and its representations can connect hardware and software in a data-driven environment through the whole lifecycle. A model is needed to support smart manufacturing and the digital twin concept.

There are many impressive marketing movies on YouTube explaining how companies/vendors implement digital continuity. Unfortunately, the gap between marketing and reality is big at this time, because moving to a model-based enterprise is not an easy step. Coming back to the LEGACY statement at the beginning of this post: it is not simple.

We all have to learn

Digital transformation is just starting in the domain of PLM. Sharing and collecting knowledge is crucial, independent of particular solutions. For me, the upcoming PDT conference in October is going to be a reference point for where we are on this journey. In case your company has experience to share related to this topic, please respond via this link: http://pdteurope.com/call-for-abstract-now-open/

In case you want to learn and believe it is not simple, wait until the program is announced. The PDT conference has always been a conference where details are discussed. I look forward to discussing them with you.

Conclusion

Implementing and continuing with PLM is not simple for a company due to changes in paradigms. Digital transformation forces companies to investigate the details of how to make it happen. Implementing PLM in the scope of a digital transformation requires learning and time, not products first.

A month ago I attended PI Berlin 2017 and discussed how digital transformation should affect PLM. You can find the presentation here on Slideshare. One of the conclusions of my presentation was that PLM is the forgotten domain in digital transformation, which led to the tweet below from Nick Leeder from SKF.

[Tweet from Nick Leeder]

I am from the generation that believes answering complex issues through tweets is not a best practice. Therefore, I dedicate this post to answering Nick’s question.

Digital Transformation

A digital enterprise is the next ultimate dream after the paperless office. The paperless office focused on transforming paper-based information into electronic information, but it did not bring a mind-shift in the way people worked. Of course, when information became available in an electronic format, you could easily centralize it and store it in places accessible to many others. Centralizing and controlling electronic information is what we did in the previous century with document management, PDM, and classical PLM. An example: your airline ticket is now provided as a PDF file – electronic, not digital.

This process is not a digital transformation

Digital Transformation means that information is broken down into granular information objects that can be stored in a database in the context of other information objects. As these objects have a status and/or relations to other information objects, in a certain combination they bring relevant information to a user in real time. The big difference with electronic information is that the content does not need a person to format, translate or pre-process the data. An example: your boarding app, showing the flight, the departure time and the gate, all in real time. If there is a change, you are immediately updated.
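A toy sketch of the same idea (the flight data is invented for illustration): granular, related objects are assembled into a view on demand, so a change to one object propagates to every view immediately, with no document to regenerate.

```python
# Minimal sketch: information stored as granular, related objects
# instead of a pre-formatted document (the PDF ticket).
flights = {
    "KL1234": {"departure": "10:05", "gate": "D7", "status": "on time"},
}
bookings = {
    "booking-42": {"passenger": "J. Traveler", "flight": "KL1234", "seat": "12A"},
}

def boarding_view(booking_id: str) -> dict:
    """Assemble the boarding-pass view in real time from related objects."""
    booking = bookings[booking_id]
    flight = flights[booking["flight"]]  # follow the relation to the flight object
    return {"passenger": booking["passenger"], "seat": booking["seat"], **flight}

# A gate change updates one object; every derived view sees it instantly:
flights["KL1234"]["gate"] = "B2"
```

Contrast this with the PDF ticket: there, the gate value was frozen at the moment the document was generated, and someone (or some process) had to produce a new document to communicate the change.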

 

Digital Transformation for an enterprise

In a digital enterprise, information needs to be available as granular information objects, related to each other, providing end-to-end continuity of data. End-to-end continuity does not mean that all data is stored in a single environment. The solution can be based on digital platforms working together, potentially enriched by “micro-services” to cover specific gaps the digital platforms do not fill.

ERP systems by nature have been designed to be digital. Logistical information, financial information, part information for scheduling, etc., is all managed in database tables, allowing algorithms and calculations to take place in real time. Documents are generated to store snapshots of information (a schedule / a report), or there are pointers to documents that contain digital, unmanaged information, like contracts, drawings, and models. Therefore, the digital transformation does not impact ERP so much.
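The contrast between live table data and document snapshots can be illustrated with a toy example (the tables and numbers are invented for illustration, not an ERP API):

```python
from datetime import date

# Live "tables": ERP computes directly on structured rows.
stock = {"M-100": 40, "M-200": 5}
demand = [{"material": "M-100", "qty": 25}, {"material": "M-200", "qty": 10}]

def shortages() -> dict:
    """Real-time calculation on table data - no document parsing involved."""
    need = {}
    for line in demand:
        need[line["material"]] = need.get(line["material"], 0) + line["qty"]
    return {m: q - stock.get(m, 0) for m, q in need.items() if q > stock.get(m, 0)}

def snapshot_report() -> str:
    """A document is only a frozen snapshot of the live data at one moment."""
    return f"Shortage report {date.today().isoformat()}: {shortages()}"
```

The report string is the “document”: the moment stock or demand changes, the live calculation is already up to date, while any generated report is stale.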

Customer-connected platforms are a typical new domain for manufacturers, as this is where the digital transformation takes place in business. Connecting to your products in the field or to your consumers in the market has become the typical business change almost every manufacturer is implementing, thanks to IoT and global connectivity. As this part of the business is new for a company, there is no legacy to deal with, and it is therefore exciting to present to the outside world and to management.

The problem of legacy

And here lies the reason why companies tend to neglect their PLM environments. There is so much legacy data, stored in documents (electronic formats), that cannot be used in a digital PLM environment. Old PLM quality processes were about validating documents, the containers of information, not the individual information objects inside the documents. And when information changes, there is no guarantee the document will be updated, due to economic reasons (time & resources).

To give an example: a year ago I wrote a post, The Impact of Non-Intelligent Part Numbers, where I explained that in a digitally connected enterprise, part numbers no longer need to have a meaning. As long as they are unique throughout the enterprise, automation will take care that PLM and ERP are connected. In one of the comments to this post, a reader mentioned that they were now implementing non-intelligent numbers in their company and that the ERP consultant recommended renumbering all the old part numbers to have a clean start. From the ERP point of view, no issue. The consultant probably never learned that part numbers are used in drawings, instructions and spare part manuals, which are all documents in the engineering domain. Renumbering them would be a waste of resources and money, just to have a “pure” part number. In the world of PLM, you have to deal with legacy.
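A non-intelligent numbering scheme is trivial to sketch; the point is that uniqueness, not meaning, is the requirement. The helper below is illustrative (not from any PLM product), and the legacy mapping shows why renumbering old parts is unnecessary:

```python
import uuid

def new_part_number() -> str:
    """Non-intelligent identifier: unique across the enterprise, no encoded meaning.
    Classification (type, project, supplier) lives in attributes, not in the number."""
    return uuid.uuid4().hex[:12].upper()

# Legacy 'intelligent' numbers stay untouched; a mapping table connects the two
# worlds, so drawings and manuals referencing old numbers need no renumbering.
legacy_map = {"M10-ST-0042": new_part_number()}
```

Automation resolves either identifier to the same part record, which is what keeps PLM and ERP connected without a costly clean-up project.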

The need for business transformation

Companies currently do not fully recognize that the old way of working in PLM, based on a document-driven approach, is not compatible with a modern data-driven approach. The old approach makes documents the formal decision carriers for product information. Documents are reviewed, approved and, once approved, stored. When information changes, documents are most of the time not updated, due to the cost of maintaining all these versions of documents in the context of the related products. Documents lock information and do not guarantee that the information inside them remains actual.

In a data-driven environment, we work in a much more granular manner, directly with the data. Working data-driven reduces the need for people in the organization to collect and transform information into documents for further communication.


As both approaches do not match in a single business process or a single PLM system, the challenge for companies is to decide how to keep the old environment available while introducing the new data-driven approach for PLM. Building this on top of your old PLM environment through customization would be a problem for the future, as customizations are hard to maintain, in particular if these are the customizations that need to support the future.

Building everything in a new environment, designed for a data-driven approach, will also be a guarantee for failure. The old data, stored in documents, does not have the granular quality a data-driven environment needs.

Combined with the fact that different people will be needed to support old or new businesses, the topic of solving PLM for the future is not an easy one.

And when things are not easy, it is hard to find the right support for changes. Management usually does not spend enough time to understand the big picture; politics come into play.

Unfortunately, it’s usually safer and better for one’s career to cut costs a little further than to try to hit the rare innovation homerun

Quote from Political Realities of PLM-Implementation Projects in Engineering.com

Conclusion

Why PLM is the forgotten domain in digital transformation is quite understandable, although it requires more than a tweet to picture the full story. Understanding the reasons is the first step; making PLM part of the digital transformation is the main challenge – who has the energy and power to lead?

Last week I shared my observations from day 1 of the PI Berlin 2017 conference. If you have not read this review, look here: The weekend after PI Berlin 2017.

Day 1 was the most significant day for me. I used the second day more for networking and some selected sessions that I wanted to attend. The advantage for the reader: this post is not as long as the previous one. Here are some final observations from day 2.

PLM: The Foundation for Enterprise Digitalization

Peter Bilello from CIMdata gave an educational talk about digitalization and its impact on current businesses. Peter considers digitalization a logical next step in the PLM evolution process. See the picture below.

[Figure: the PLM evolution process]

Although it is an evolution process, the implementation of this next step requires a revolution. Digitalization will create a disruption in companies, as the digital approach will reshape business models, internal business processes, and roles and responsibilities. Peter further elaborated on the product innovation platform and its required characteristics. Similar to what I presented on the first day, Peter concluded that we are in a learning stage of how to build a new methodology/infrastructure for PLM. For example, the concept of creating and maintaining a digital twin needs a solid foundation.

His conclusion: Digitalization requires PLM.

Boosting the value of PLM through
Advanced Analytics Assessment

Paul Haesman from Autoliv introduced the challenges they face as a typical automotive company. Digitalization is reshaping the competitive landscape and increasing the demand for technology, while the highest safety levels of their products still have to be guaranteed. In that context, they invited Tata Technologies to analyze their current PLM implementation and from there provide feedback about their as-is readiness for the future.

Chris Hind from Tata Technologies presented their methodology, which provides benchmark information, a health check, the impact, and a potential roadmap for PLM. It is a method that provides great insights for both parties, and I encourage companies that have not done such an assessment to investigate such an activity. The major value of a PLM assessment is that it provides an agreed baseline for the company that allows management to connect the Why to the What and How. Often PLM implementations focus on What and How without real alignment to the Why, which results in unrealistic expectations or budgets due to the perceived value.

[Figure: PLM assessment highlights]

An interesting point addressed by Chris (see picture above) is that Document Management is considered a trending priority!

It illustrates that digitalization in PLM has not taken off yet and that companies are still focusing on previous-century capabilities 😦

The second highlight, rating Manufacturing Process Management as the most immature PLM pillar, can be considered in the same context. PLM systems are still considered engineering systems, and manufacturing process management falls in the gray area between PLM systems and ERP systems.

The last two bullets are clear. The roots of PLM are in managing quality and compliance and improving time to market.

Overcoming integration challenges –
Outotec´s Digital Journey

Helena Gutiérrez and Sami Grönstand gave an entertaining introduction to Outotec (a provider of technologies and services for the metal and mineral processing industries) and its digital journey. Outotec has been working for several years on simplifying its IT landscape while standardizing the flow of information in a modern, data-driven manner.

Sami explained in great detail how the plant process definition is managed in PLM. The process definition is driven by the customer’s needs and largely defines the cost of building a plant. This is crucial for the quotation phase, but also important if you want to create digital continuity. Next, the process definition is further refined in detailed steps, defining the key parameters and characteristics of the main equipment.


And then the challenge starts. In the context of the plant structure, the right equipment needs to be selected. This is where plant meets product, or, as the Outotec team said, where the elephant and the ants do the tango.

In the end, standardized products need to match the customer-specific solution as much as possible. The dream of most of these companies is combining Engineering To Order and Configure To Order, and remember, this is in the context of digital continuity.

So far, this is a typical EPC (Engineering, Procurement, Construction) project; however, Outotec wants to extend the digital continuity to also support their customers’ installed plants. I remembered one of their quotes from the past: “Buy one (plant) and get two (a real one and a virtual one).” This concept, managed in digital continuity, is something that will come up in many other industries: the digital twin.


Where companies like Outotec are learning to connect all data from the initiation of their customer-specific solution through delivery and services, other product manufacturing companies are researching the same digital continuity for their product offerings in the field and to consumers. Thanks to digitization, these concepts become more and more similar. I wrote about this topic recently in my post PLM for Owner/Operators.

Final conclusion from PI Berlin 2017

It is evident that participants and speakers are talking about the strategic value and role PLM can have in an organization.

With digitalization, new possibilities arise where the need and value for end-to-end connectivity pop up in every industry.

We, the PLM community, are all learning and building new concepts. Keep sharing and meeting each other in blogs, forums, and conferences.

Last week I got the following question:

Many companies face challenges relevant to cooperation and joint ventures and need to integrate their portfolios in a smart way to offer integrated solutions. In the world of sharing and collaboration, this may be a good argument to dig into. Is PLM software ready for this challenge with best-practice solutions, or is this a matter of specific development case by case? Any guidelines?

Some history

When PLM solutions were developed, their core focus was on bringing hardware products to the market in a traditional manner, as shown in the figure below.

[Figure: the traditional product push to market]

Products were pushed to the market based on marketing research and closed innovation. Closed innovation meant companies depended on their internal R&D to provide innovative products. And this is the way most PLM systems are implemented: supporting internal development. Thanks to global connectivity, the internal development teams can collaborate, connected to a single PLM backbone/infrastructure.

Third Party Products (TPP) at that time were sometimes embedded in the EBOM, and during the development phase there would be an exchange of information between the OEM and the TPP provider. Third Party Products were treated in a similar manner as purchased items. And as the manufacturing of the product was often defined in the ERP system, the contractual and financial interactions with the TPP provider were handled there, creating a discontinuity between what had been defined for the product and what was shipped. The disconnect between the engineering intent and the actual delivery to the customer was often managed in Excel spreadsheets or proprietary databases developed to soften the pain.

What is happening now?

In the past 10 to 15 years we have seen the growing importance of, first, electronic components and their embedded software, now followed by new go-to-market approaches, where the customer proposition changes from just a product towards a combined offering of hardware, software, and services. Let’s have a look at how this could be done in a PLM environment.

From Products to Solutions

The first step is to manage the customer proposition in a logical manner instead of managing everything in a BOM definition. In traditional businesses, most companies still work around multiple Bills of Materials. For example, read this LinkedIn post: The BOM is King. This approach works when your company only delivers hardware.

Not every PLM system supports a logical structure out of the box. I have seen implementations where this logical structure was stored in an external database (not preferred) or as a customized structure in the PLM system. Even in SmarTeam, this methodology was used to support Asset Lifecycle Management. I wrote about this concept in early 2014, in the context of Service Lifecycle Management (SLM), in two posts: PLM and/or SLM? and PLM and/or SLM (continued). It is no coincidence that the concepts used for connecting SLM to PLM are similar to defining customer propositions.

In the figure to the left, you can see the basic structure to manage a customer proposition and how it connects to the aspects of hardware, software, and services. In an advanced manner, the same structure could be used with configuration rules to define and create a portfolio of propositions. More about the potential of this topic in a future blog post.
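Such a logical proposition structure can be sketched as a simple tree, with typed children pointing to the hardware, software, and service worlds. The names and structure below are illustrative assumptions, not the data model of any particular PLM system:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One element of the logical structure above the BOM."""
    name: str
    kind: str  # "proposition" | "hardware" | "software" | "service"
    children: list = field(default_factory=list)

def proposition(name, hardware, software, services) -> Node:
    """Build a customer proposition linking the three aspects.
    In a real system, hardware nodes would reference EBOM/MBOM items,
    software nodes would reference ALM releases, services a contract system."""
    root = Node(name, "proposition")
    root.children = (
        [Node(n, "hardware") for n in hardware]
        + [Node(n, "software") for n in software]
        + [Node(n, "service") for n in services]
    )
    return root
```

With configuration rules attached to such nodes, the same tree could generate a portfolio of propositions instead of a single one.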

For hardware, most PLM systems have their best practices based on the BOM, as discussed before. When combining the hardware with embedded software, we enter the world of systems. The proposition is no longer a product; it becomes a system or even an experience.

For managing systems, I see two main additions to the classical PLM approach:

  1. The need for connected systems engineering. As the behavior of a system is much more complicated than that of a pure hardware product, companies discover the need to spend more time on understanding all the requirements for the system and its potential use cases in operation, the only way to define the full experience. Systems Engineering practices coming from Automotive & Aerospace are now entering the world of high-tech, industrial equipment, and even consumer goods.
  2. The need to connect software deliverables. Software introduces a new challenge for companies, no matter if the software is developed internally or embedded through TPPs. In both situations, there is the need to manage change in a fast and iterative manner. Classical ECR/ECO processes no longer work here. Working agile and managing a backlog becomes the mode of operation. Application Lifecycle Management connected to PLM becomes a need.

In both domains, systems engineering and ALM, PLM vendors have their offerings, and on the marketing side they might all look the same to you. However, there is a fundamental need that is not always visible on the marketing slides: the need for complete openness.

Openness

To manage a portfolio based on systems, a company can no longer afford to manually check, across multiple management systems, all the dependencies between the product and its components, combined with the software deliverables and TPPs. Automation, traceability of changes, and notifications are needed in a modern, digital environment, which you might call a product innovation platform. My high-speed blog buddy Oleg Shilovitsky just dedicated a post to “The Best PLM for Product Innovation Platform”, sharing several quotes from CIMdata’s talk about the characteristics of a Product Innovation Platform and stressing the need for openness.

It is true: if you can only manage your hardware (mechanics & electronics) and software in dedicated systems, your infrastructure will be limited and rigid, while the outside world is changing constantly and fast. No single solution or product does it all, or will do it all in the future. Therefore openness is crucial.

Services

In several companies, originally in the Engineering, Procurement & Construction industry, I have seen the need to manage services in the context of the customer delivery too. Highly customized and/or disconnected systems were used here. I believe the domain of managing a proposition, a combination of hardware, software, AND services in a connected environment, is still in its early days. Hence the question marks in the diagram.

Conclusion

How Third Party Product management is supported by PLM depends very much on the openness of the PLM system: how it connects to ALM and how it is able to manage a proposition. If your PLM system has been implemented as a supporting infrastructure for engineering only, you are probably not ready for the modern digital enterprise.

Other thoughts ???

Two terms cross my path every day: Digital Transformation, which appears in every business discussion, and IP Security, a topic also discussed in all parts of society. We realize it is easy to steal electronic data without being detected (immediately).

What is Digital Transformation?

Digital Transformation is reshaping business processes to enable new business models, create a closer relation with the market, and react faster while reducing the inefficiencies of collecting, converting and processing analog or disconnected information.

Digital Transformation became possible thanks to the lower costs of technology and global connectivity, allowing companies, devices, and customers to interact in almost real-time when they are connected to the internet.

IoT (Internet of Things) and IIoT (Industrial Internet of Things) are terms closely related to Digital Transformation. Their focus is on creating connectivity with products (systems) in the field, providing a tighter relation with the customer and enabling new (upgrade) services to gain better performance. Every manufacturing company should be exploring IoT and IIoT possibilities now.

Digital Transformation is also happening in the back office of companies. The target is to create a digital data flow inside the company and with the outside stakeholders, e.g., customers, suppliers, authorities. The benefits are mainly improved efficiency, faster response and higher quality interaction with the outside world.

The part of Digital Transformation that concerns me the most is the domain of PLM. As I have stated in earlier posts (Best Practices or Next Practices? / What is Digital PLM?), the need is to replace the classical document-driven product-to-market approach with a modern data-driven interaction of products and services.

I am continually surprised that companies with an excellent Digital Transformation profile on their websites have no clue about Digital Transformation in their product innovation domain. Marketing is faster than reality.

I am happy to discuss this topic with many of my peers in the product innovation world at PI Berlin 2017, three weeks from now. I am eager to learn how and why companies do not embrace Digital Transformation sooner and faster. The theme of the conference, “Digital Transformation: From Hype to Value”, says it all. You can find the program here, and I will report on this conference the weekend after.

IP Security

The topic of IP protection has always been high on the agenda of manufacturing companies. Digital Transformation brings new challenges. Digital information will be stored somewhere on a server and probably connected, through firewalls, to the internet. Some industries have high-security policies, with separate networks for their operational environments. Still, many large enterprises currently struggle with IP security policies, as sharing data while protecting IP between various systems creates a lot of administration per system.

Cloud solutions for sharing data are still a huge security risk. Where is the data stored, and who else has access to it? Dropbox came in the news recently as “deleted” data came back after five years, “due to a bug.” Cloud data sharing cannot be trusted for really sensitive information.

Cloud providers always claim that their solutions are safer, thanks to their strict safety procedures, compared to the improvident behavior of employees. And this is true. For example, a company I worked with had implemented Digital Rights Management (DRM) for internal sharing of their IP, making sure that users could only read information on the screen and not store it locally. When they had an issue with the server, “No problem”, one of the employees said, “I have a copy of the documents here on my USB drive.”

Cloud-based PLM systems are supposed to be safer. However, it still matters where the data is stored; security and hacking policies vary per country. Assume your company’s IP is safe from hacking. Then the next question is: “How about ownership of your data?”

Vendor lock-in and ownership of data are topics that always come back at the PDT conferences (see my post on PDT2016). When a PLM cloud provider stores your product data in a proprietary data format, you will be forced into a costly data migration project the moment you decide to leave that provider.

Why not use standards for data storage? Hakan Kårdén triggered me on this topic again with his recent post: Data Is The New Oil So Make Sure You Ask For The Right Quality.

 

Conclusion

Digital Transformation is happening everywhere, but not always with the same pace and focus. New PLM practices still need to be implemented on a larger scale to become best practices. Digital information in the context of Intellectual Property creates extra challenges to be solved. Cloud providers do not yet offer solutions that are safe and avoid vendor lock-in.

Be aware. To be continued…

Many thanks (again) to Dick Bourke for his editing suggestions

First, Happy New Year!! I wish all my readers a healthy, happy and successful 2017. Increasing your understanding of modern PLM based on field experiences is my pledge to you this year. PLM as part of a business strategy is mentioned more and more at management level in companies. However, the meaning and impact of PLM can be diffuse, therefore requiring more clarification for management. To save your time, I’m pleased to share some images/slides I have used to explain fundamental PLM concepts. Use them in your PLM meetings.

People, Processes, and Tools

PeopleProcessTools

A company should not implement a PLM system just because people say they need a PLM system. Most likely, PLM supports a business transformation, enabling new ways of working and new business processes.

PeopleProcessToolsTweet

Read more related to People, Processes, and Tools:

Old and New PLM

OldNewPLM

When your company wants to implement PLM today, it is important to realize that all businesses are transitioning from old linear processes, pushing products to the market, towards incremental, customer-oriented processes. With a change from a document-driven approach towards a data-driven approach, implementing PLM requires a new approach.

Read more about how PLM is changing:

PLM Selection

PLM selection

Selecting the right PLM system is just the tip of the iceberg. Most PLM systems have lots of functionalities in common. Therefore, when selecting a PLM system, take into account the topics below the waterline. The deeper you get, the more important they are for a successful PLM implementation.

Read more about PLM selection:

The Maturity of an Organization

Gartner maturity

Can you run before you can walk? Is PLM only valid for large companies? I do not think so. Large companies usually have a higher need to make their products less dependent on specific individual skills. Therefore, they will focus more on repeatable processes and, as next steps, on integrating internally and externally. This slide was presented by Marc Halpern at PDT2015 and illustrates the maturity journey a company can grow through, and how this journey affects the focus for PDM, PLM, and future integration.

Read more about the PLM journey and Maturity:

Don’t Choose the Easiest Path

gartner benefits

Another “classical” Gartner slide explaining what everybody knows, yet what most companies fail to do. There are two important messages in this slide:

  • Every change in technology will cause a dip in the company’s performance. Give your people the time to adapt by changing performance KPIs for that period
  • Introducing new technology combined with introducing new processes and a change in culture will bring the highest value

Read more about Cultural Change:

Digital Transformation is Coming

The world is becoming rapidly digital. Digitization is destroying jobs that can be automated. A great article about the onrushing wave can be found in the Economist, describing which jobs are likely to stay and which are likely to disappear. And, disappearing jobs will not come back again as some populists might promise. The good news, however, is that new business models and processes require many new jobs for which we are not educated (yet). Self-learning becomes crucial.

Read more about Digital Transformation:

Evolution, Disruption or Bimodal?


Companies that have implemented their classical PLM environment struggle to move to a PLM infrastructure supporting modern, customer-driven delivery of products and services. However, the evolutionary approach takes too long; the alternative is to disrupt your business. Or try a bimodal PLM approach, inspired by Gartner’s bimodal IT approach.

Read more about Disruption or Bimodal:

See You Soon?

2017 is going to be an interesting and challenging year for all of us. What will be the further impact of digitization on your business? Will we tweet our PLM strategy in the future? I hope to discuss these developments with you on my blog and during the upcoming PI Berlin.


Let’s communicate!

PLM can be swinging and inspiring although there will be times of frustration and stress when implementing. These seven musical views will help you to make it through the project.

 

One Vision

Every business change should start with a vision and a strategy. Defining the vision and keeping the vision alive is the responsibility of senior management. When it comes to PLM, the vision is crucial.

 

No more heroes

Of course, when implementing PLM, the target is to streamline the organization’s processes, eliminate bottlenecks and reduce dependencies on individuals. There is no more need for firefighters or other heroes, because the issues they fix only appear due to a lack of processes and clarity.

 

Let’s do it together

PLM implementations are not IT projects, where you install, configure and roll out an infrastructure based on one or more systems. Like a music band, a PLM implementation should be a well-orchestrated project between business experts and IT. Here’s a song to make your project swing.

 

Say NO at the right time

When implementing PLM, the software geeks can do everything for you: customize the system, create a completely new environment that looks like the old environment, and more. Of course, you will pay for it. Not only for the extra services, but also in the long term to support all these customizations. Always try to find a balance between the standard functionality and infrastructure of the PLM system and the company’s vision. This means there are times you must say NO to your users. Maybe not always as funny as these guys say it.

 

Eight days a week

During the PLM implementation, and for sure after one of the several rollouts, changes may appear. And normal work still needs to be done, sometimes in a different way. There will never be enough time to do everything perfectly and fast, and it feels like you need more days in the week. When you are stressed, swing with these guys.

 

We are the champions

Then, when the PLM project has been implemented successfully, there is a feeling of relief. It has been a tough time for the company and the PLM team. This should be the moment for management to get everyone together in the stadium, as an important change for the company’s future has been realized. Sing all together.

 

… But the times they are a-changing

Although a moment of relief is deserved, PLM implementations never end. The current infrastructure can be improved continuously thanks to better business understanding. Moreover, globalization and digitalization will create new business challenges and opportunities at an extraordinarily fast pace. So be aware, and sing along with Bob.

 

BONUS

Time to close the 2016 book and look forward to next year’s activities. I wish all my readers happy holidays and a healthy, successful new year with a lot of dialogue, and no more one-liners.

 

See you in 2017!

Recently, I have written about classical PLM (document-driven and sequential) and modern PLM (data-driven and iterative) as part of the upcoming digital transformation that companies will have to go through to be fit for the future. Some strategic consultancy companies, like Accenture, talk about Digital PLM when referring to a PLM environment supporting the digital enterprise.

 

From classical PLM to Digital PLM?

The challenge for all companies is to transform their businesses to become customer-centric and to find a transformation path from the old legacy PLM environment towards the new digital environment. Companies want to do this in an evolutionary mode. However, my current observation is that the pace of an evolutionary approach is too slow relative to what is happening in their market. This time the change is happening faster than before.

A Big Bang approach towards the new environment seems to be a big risk. History has taught us that this is very painful and costly, so it should be avoided too. What remains is a kind of bimodal approach, which I introduced in my recent blog post (Best Practices or Next Practices). Although, as one of my respected readers and commenters, Ed Lopategui, mentioned in his comment (here), bimodal is another word for coexistence. He is not optimistic about this approach either.

So, what remains is disruption?

And disruption is a popular word. My blog buddy Oleg Shilovitsky recently dived into that topic again with his post How to displace CAD and PLM industry incumbents – an interesting post about disruption and disruption patterns. My attention was caught by the words: digital infrastructure.
I quote:

How it might happen? Here is one potential answer – digital infrastructure. Existing software is limited to CAD files stored on a desktop and collaboration technologies developed 15-20 years using relational database and client-server architecture.

Digital Infrastructure

As I mentioned, the words digital infrastructure triggered me to write this post. At this moment, I see companies marketing their digital transformation story in a slick way, supported by all the modern buzzwords: customer-centric, virtual twin and data-driven. As a PLM geek, you would imagine that they have already made the jump from the old document-driven PLM towards modern digital PLM. So what does a modern digital PLM environment look like?

The reality behind this slick marketing curtain, however, is that the old legacy processes are still there, with engineers producing drawings as output for manufacturing, because drawings are still the legal and controlled information carriers. There is no digital infrastructure behind the scenes. So, what would you expect behind the scenes?

Model-Based Definition as part of the digital infrastructure

Crucial to being ready for a digital infrastructure is transforming your company’s product development process from a file-based process, where drawings are leading, towards a model-based enterprise. The model needs to be the leading authority (single source of truth) for PMI (Product and Manufacturing Information) and potentially for all upfront engineering activities. In that case, we talk about Model-Based Systems Engineering, sometimes called RFLP (Requirements-Functional-Logical-Physical), where the product can even be analyzed and simulated directly based on the model.

A file-based process is not part of a digital infrastructure or a model-based enterprise architecture. File-based processes force the company to maintain multiple instances and representations of the same data in different formats, creating an overhead of work to keep data quality and correctness up, and this is never 100% secure. A digital infrastructure works with connected data in context.

Therefore, if your company is still relying on drawings and you want to be ready for the future, a first step towards a digital infrastructure would be fixing your current processes to become model-based. Some good introductions can be found at ENGINEERING.com – search for MBD and you will find:

Moving to Model-Based is already a challenging transformation inside your company, before even touching the challenge of moving towards a full digital enterprise through an evolutionary, disruptive or bimodal approach – let the leading companies show the way.

Conclusion

Companies should consider and investigate how to use a Model-Based Engineering approach as a first step to becoming lean and fit for a digital future. The challenge will be different depending on the type of industry and product.
I am curious to learn from my readers where they are on the path to a digital enterprise.

In my earlier post, The weekend after PDT Europe, I wrote about the first day of this interesting conference. We ended that day with some food for thought related to a bimodal PLM approach. Now I will take you through the highlights of day 2.

Interoperability and openness in the air (aerospace)

I believe Airbus and Boeing are among the most challenged companies when it comes to PLM. They have to cope with many stakeholders and a massive number of suppliers, constrained by a strong focus on safety and quality. And as airplanes have a long lifetime, the need to keep data accessible and available for over 75 years is a massive challenge. The morning was opened by presentations from Anders Romare (Airbus) and Brian Chiesi (Boeing), who confirmed they could switch the presenter’s role between them, as the situations at Airbus and Boeing are so alike.

Anders Romare started with a presentation called Digital Transformation through an e2e PLM backbone, where he explained the concept of extracting data from the various silo systems in the company (CRM, PLM, MES, ERP) to make data available across the enterprise. In particular, in their business transformation towards digital capabilities, Airbus needed and created a new architecture on top of the existing business systems, focusing on data (“Data is the new oil”).

To achieve a data-driven environment, Airbus extracts and normalizes data from their business systems and provides a data lake with integrated data, on top of which various apps can run to offer digital services to existing and new stakeholders on any type of device. The data-driven environment allows people to have information in context, available almost in real time, to make the right decisions.
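The extract-and-normalize pattern Anders described can be imagined as a thin layer mapping each silo system’s vocabulary onto one common schema. The sketch below is purely illustrative – the system names, field names and functions are my own assumptions, not Airbus’ actual implementation:

```python
# Illustrative sketch: extract records from silo systems and normalize them
# into one schema for a data lake, so apps on top can show data in context.
# All system names and field names here are hypothetical.

def normalize_plm(record):
    # PLM talks about parts and titles ...
    return {"item_id": record["part_number"], "description": record["title"], "source": "PLM"}

def normalize_erp(record):
    # ... while ERP talks about materials (SAP terminology)
    return {"item_id": record["material"], "description": record["material_text"], "source": "ERP"}

NORMALIZERS = {"PLM": normalize_plm, "ERP": normalize_erp}

def load_into_lake(system, records):
    """Normalize the records of one silo system for storage in the data lake."""
    return [NORMALIZERS[system](r) for r in records]

lake = (load_into_lake("PLM", [{"part_number": "P-100", "title": "Bracket"}])
        + load_into_lake("ERP", [{"material": "P-100", "material_text": "Bracket, steel"}]))

# An app on top of the lake can now present one item in context across systems:
in_context = [r for r in lake if r["item_id"] == "P-100"]
print(in_context)
```

The point of the sketch is the design choice: the business systems stay untouched, and the normalization layer is where the "new oil" gets refined into a common format.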


Now imagine that information captured by these apps could be stored or fed back into the original architecture supporting the standard processes. This would be a real example of the bimodal approach discussed on day 1. As a closing remark, Anders also stated that three years ago digital transformation was not really visible at Airbus; now it is a must.

Next, Brian Chiesi from Boeing talked about Data Standards: A strategic lever for Boeing Commercial Airplanes. Brian described the complex landscape at Boeing: 2,500 applications / 5,000 servers / 900 changes annually (3 per day), impacting 40,000 users. There is a lot of data replication because many systems need their own proprietary format. Brian estimated that where 12 copies exist now, in the ideal world 2 or 3 would do. Brian presented a similar future concept as Airbus, where the traditional business systems (Systems Engineering, PLM, MRP, ERP, MES) are all connected through a service backbone. This new architecture is needed to address modern technology capabilities (social / mobile / analytics / cloud / IoT / automation / …).


An interesting part of this architecture is that Boeing aims to exchange data with the outside world (customers / regulatory / supply chain / analytics / manufacturing) through industry-standard interfaces to have an optimal flow of information. Standardization would lead to a reduction of customized applications, minimize the costs of integration and migration, break the obsolescence cycle and enable future technologies. Brian’s point: if companies pull for standards, vendors will deliver. Boeing will be pushing for standards in their contracts and will actively work together with five major Aerospace & Defense companies to define required PLM capabilities and have a unified voice towards PLM solution providers.

My conclusion on these two aerospace giants: they express the need to adapt and move to modern digital businesses, no longer following the linear approach of the classic airplane programs. Incremental innovation in various domains is the future. The existing systems need to be there to support their current fleet for many, many years to come. The new data-driven layer needs to be connected through normalization and standardization of data. For the future, a focus on standards is a must.

Simon Floyd from Microsoft talked about The Impact of Digital Transformation in the Manufacturing Enterprise, taking us through digital transformation, IoT, and analytics in the product lifecycle, clarified by examples from the Rolls-Royce turbine engine. A good and compelling story, which could be used by any vendor explaining digital transformation and its relation to IoT. Next, Simon walked through the Microsoft portfolio and the solution components that support a modern digital enterprise based on various platform services. At the end, Simon articulated how, for example, ShareAspace, based on Microsoft infrastructure and technology, can be an interface between various PLM environments through the product lifecycle.

Simon’s presentation was followed by a panel discussion with the theme: when are history and legacy an asset and a barrier to entry, and when do they become a burden and an invitation to future competitors?
Marc Halpern (Gartner) mentioned bimodal thinking here again. Aras is bimodal. The classical PLM vendors running in mode 1 will not change radically, and the new mode 2 vendors will need time to create credibility. Other companies mentioned here, such as Propel PLM (PLM on the Salesforce platform) or Onshape, will battle the next five years to become significant and might disrupt.

Simon Floyd (Microsoft) mentioned that in order to keep innovation within Microsoft, they allow for startups within the company, initially with no constraints from Microsoft. This keeps disruption inside your company instead of being disrupted from outside. Another point mentioned was that Tesla did not want to wait until COTS software became available for their product development and support platform. Therefore, they develop parts themselves. Are we going back to the early days of IT?

An interesting trend, I believe, provided the building blocks for such a solution architecture are based on open (standardized?) services.

Data Quality

After lunch, the conference split into three streams; I participated in the stream “Creating and managing information quality.” As I discussed in my presentation on day 1, there is a need for accurate data, starting as soon as possible, as the future of our businesses will run on data, as we learned from all speakers (and this is not a secret – still, many companies do not act).

In the context of data quality, Jean Brange from Boost presented the ISO 8000 framework for data and information quality management. This standard is under development and will help companies address their digital needs. The challenge of data quality is that we need to store data with the right syntax and semantics to be usable, and in addition, it needs to be pragmatic: what are we going to store that will have value? And then there is the challenge of evaluating the content. Empty fields can be discovered; however, how do you qualify the quality of a field with a value? The ISO 8000 framework is a framework, like ISO 9000 (product quality), that allows companies to work in a methodological way towards acceptable and needed data quality.
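The two levels mentioned here – detecting empty fields versus qualifying a field that does have a value – can be sketched as simple rule checks. The field names, patterns and plausibility ranges below are my own illustrative assumptions, not part of ISO 8000 itself:

```python
import re

# Hypothetical data-quality check illustrating the distinction above:
# completeness (empty fields are easy to find) vs. syntax and semantic
# checks (qualifying a value that is present). Rules are illustrative.

RULES = {
    "item_id": re.compile(r"^P-\d{3,}$"),  # syntax: "P-" followed by at least 3 digits
    "weight_kg": lambda v: isinstance(v, (int, float)) and 0 < v < 1000,  # semantic: plausible range
}

def quality_report(record):
    issues = []
    for field, rule in RULES.items():
        value = record.get(field)
        if value in (None, ""):
            issues.append(f"{field}: empty")          # completeness: easy to discover
        elif hasattr(rule, "match"):
            if not rule.match(str(value)):
                issues.append(f"{field}: bad syntax")  # syntax check
        elif not rule(value):
            issues.append(f"{field}: implausible value")  # semantic check
    return issues

print(quality_report({"item_id": "P-100", "weight_kg": -5}))
# ['weight_kg: implausible value']
```

The hard part, as the talk noted, is the semantic rules: an empty field is self-evident, but deciding that a filled-in value is wrong requires business knowledge encoded somewhere.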


Magnus Färneland from Eurostep addressed the topic of data quality and the foundation for automation, based on the latest developments done by Eurostep on top of their already rich PLCS data model. The PLCS data model is impressive, as it already supports all facets of the product lifecycle, from design through development and operations. By introducing soft typing, Eurostep allows a more detailed tuning of the data model to ensure configuration management: at which stage of the lifecycle is certain information required (and becomes mandatory)? Consistent data quality, enforced through business process logic.
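The idea that attributes become mandatory only at certain lifecycle stages can be sketched as a per-stage validation rule. The stage names and attributes below are my own illustrative assumptions, not the actual PLCS model or Eurostep’s implementation:

```python
# Hypothetical sketch of "soft typing": the same item type has different
# mandatory attributes depending on its lifecycle stage. Stage names and
# attribute sets here are illustrative assumptions.

MANDATORY_BY_STAGE = {
    "design":      {"item_id", "description"},
    "development": {"item_id", "description", "material"},
    "operations":  {"item_id", "description", "material", "serial_number"},
}

def missing_for_stage(item, stage):
    """Return the attributes that must be filled before the item enters this stage."""
    required = MANDATORY_BY_STAGE[stage]
    return sorted(f for f in required if not item.get(f))

item = {"item_id": "P-100", "description": "Bracket"}
print(missing_for_stage(item, "design"))       # []
print(missing_for_stage(item, "operations"))   # ['material', 'serial_number']
```

The data model itself stays one model; only the business-process logic decides when a gap becomes an error, which is what makes the approach "soft".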

The conference ended with Marc Halpern making a plea to Take Control of Your Product Data or Lose Control of Your Revenue. Marc painted the future (horror) scenario that, due to digital transformation, the real “big fish” will be the digital business ecosystem owners, and that once you are locked in with a vendor, these vendors can raise their prices to save their own business without any respect for your company’s business model. Marc gave examples where vendors raised prices with the subscription model by up to 40%. Therefore, even when you are just closing a new agreement with a vendor, you should negotiate a price guarantee and a certain bandwidth for increases. On top of that, you should prepare an exit strategy: prepare data for migration and have backups using standards. Marc also gave examples of billions in extra costs related to data quality issues and data loss. It can hurt! Finally, Marc ended with recommendations for master data management and data quality as a necessary company strategy.


Gerard Litjens from CIMdata, as closing speaker, gave a very comprehensive overview of The Internet of Things – what does it mean for PLM?, based on CIMdata’s vision. As all vendors in this space explain the relation between IoT and PLM differently, it was a good presentation to use as a base for the discussion: how does IoT influence our PLM landscape? Because of the length of this blog post, I will not go further into the details – it is worth obtaining this overview.

Concluding: PDT2016 is a crucial PLM conference for people who are interested in the details of PLM. Other conferences might address high-level customer stories; at PDT2016 it is about the details and sharing the advantages of using standards. Standards are crucial for a data-driven environment where business platforms, with all their constraints, will be the future. And I saw that more and more companies are working with standards in a pragmatic manner, observing the benefits and pushing for more data standards – it is not just theory.

See you next year?

I am just back from the annual PDT conference (12th edition), this year hosted in Paris on 9 and 10 November, co-located with CIMdata’s PLM Road Map 2016 for Aerospace & Defense. The PDT conference, organized by Eurostep and CIMdata, is a relatively small conference with a little over a hundred attendees. The attraction of this conference is that the group of dedicated participants is very interactive, sharing honest opinions and situations, sometimes going very deep into the details needed to get the full picture. The theme of the conference was: “Investing for the future while managing product data legacy and obsolescence.” Here are some impressions from these days, giving you food for thought to join next year.

Setting the scene

Almost traditionally, Peter Bilello (CIMdata) started the conference, followed by Marc Halpern (Gartner). Their two presentations formed an excellent storyline together.

Peter Bilello started and discussed Issues and Remedies for PLM Obsolescence. It was not the first time Peter addressed PLM obsolescence. It is a topic many early PLM adopters are facing, and in a way the imminent obsolescence of their current environments blocks them from taking advantage of the new technologies and capabilities current PLM vendors offer. Having learned from the past, CIMdata provides a PLM Obsolescence Management model, which should be on every company’s agenda, in the same way as data quality (which I will address later). Being proactive about obsolescence can prevent critical situations and high costs. From the obsolescence theme, Peter looked forward to the future and the value product innovation platforms can offer, given the requirement that data should be able to flow through the organization, connecting to other platforms and applications, increasing the demand to adhere to and push for standards.

Marc Halpern followed with his presentation, titled More custom products demand new IT strategies and new PLM applications, where he focused on the new processes and methodology needed for future businesses, with a high focus on customer-specific deliveries, speed, and automation. Automation is always crucial to reducing production costs. In this delivery process, 3D printing could bring benefits, and Marc shared the pluses and minuses of 3D printing. Finally, when automation of a customer-specific order becomes possible, it requires a different IT architecture, depicted by Marc. After proposing a roadmap for customizable products, Marc shared some examples of ROI benefits reported by successful transformation projects. Impressive!


My summary of these two sessions: both CIMdata and Gartner confirm the challenges companies face in changing their old legacy processes and PLM environments, which support the past, while moving to more customer-driven processes and modern, data-driven PLM functionality. This is not just an IT or business change; it will also be a cultural change.

JT / STEP AP242 / PLCS

Next, we had three sessions related to standards. Alfred Katzenbach told the success story of JT: the investment done to get this standard approved and performing, based on an active community getting the most out of JT beyond its initial purpose of viewing and exchanging data in a neutral format. Jean-Yves Delanaunay explained how in Airbus Operations the STEP AP242 definition is used as the core standard for 3D Model-Based Definition (MBD) exchange, part of the STEP standards suite, and as the cornerstone for Long Term Archiving and Retrieval of Aerospace & Defense 3D MBD.

There seems to be some rivalry between JT and STEP AP242 viewing capabilities, which goes beyond my understanding, as I am not an expert in this field. Nigel Shaw ended the morning session positioning PLCS as a standard for interoperability of information along the whole lifecycle of a product. A standardized data model, as Nigel showed, would be good common ground for PLM vendors to converge towards a more interoperable standard.


My summary on standards: a lot of thinking, evaluation, and testing has been done by an extensive community of committed people. It will be hard for a company to define a better foundation for a standard in their business domain. Vendors focus on performance inside their own technology offering and therefore will never push for standards (unless you use their products as a standard). The force for adhering to standards should come from the user community.

Using standards

After lunch, we had three end-user stories:

  • Eric Delaporte (Renault Group) talked about their NewPDM project and the usage of standards, mainly for exchanges. Two interesting observations: first, Eric talks about New PDM – the usage of the word New (when does New become regular?) and PDM (not PLM?); secondly, as a user of standards he does not care about the JT/AP242 debate and uses both standards where applicable and performing.
  • Sebastien Olivier (France Ministry of Defense) gave a bi-annual update on their PLCS journey, used in two projects, Pencil (a standardized exchange platform and centralized source of logistical information) and MAPS (managing procurement contracts for buying in-service support services), and on the status of their S3000L implementation (the international procedure for Logistic Support Analysis). A presentation for the real in-crowd of this domain.
  • Juha Rautjarvi discussed how efficient use of knowledge for safety and security could be maintained and enhanced through collaboration. Juha talks about the Body of Knowledge, which should be available to all stakeholders in the context of security and safety. And like a physical product, this Body of Knowledge goes through a lifecycle, continuously adapting to what arises from the outside world.

My conclusion on this part: if you are not working with these standards on a day-to-day basis (and I am not), it is hard to pick up the details. Still, the higher-level thought processes behind these standards allow you to see the benefits and impact of using them, which is not the same as selecting a tool. It is a strategic choice.

Modular / Bimodal / not sexy?

Jakob Asell from Modular Management gave an overview of how modularity can connect the worlds of sales, engineering, and manufacturing by adding a modular structure as a governing structure over the various structures used by each discipline. This product architecture can be used for product planning and provides end-to-end connectivity of information. Modular Management is assisting companies in moving towards this approach.

Next, my presentation, titled The importance of accurate data – Act now!, addressed the switch from classical, linear, document-driven PLM towards a modern, more incremental and data-driven PLM approach. Here I explained the disadvantages of the old evolutionary approach (impossible – too slow / too costly) and an alternative method, inspired by Gartner’s bimodal IT approach (read my blog post Best Practices or Next Practices). No matter which option you choose, correct, high-quality data is the oil for the future, so companies should treat the flow of data as a health issue for the future.

The day was closed with a large panel, where the panelists first jumped on the topic of bimodal (bipolar?? / multimodal??), talking about mode 1 (the strategic approach) and mode 2 (the tactical and fast approach, based on Gartner’s definition). It was clear that the majority of the panel was in mode 1. This fluently led to a discussion of why standards (and PLM) are not attractive to the younger generations (not sexy) – besides the conclusion that it takes time to understand the whole picture and see the valuable benefits a standard can bring, and then join this enthusiasm.


Conclusion

I realize this post is already too long according to blogging guidelines. Therefore, I will tell more about day 2 of the conference next week, with Airbus going bimodal and more.

Stay tuned for next week!