
Last week I shared my observations from day 1 of the PI Berlin 2017 conference. If you have not read this review yet, look here: The weekend after PI Berlin 2017.

Day 1 was the most significant day for me. I used the second day more for networking and some selected sessions I wanted to attend. The advantage for the reader: this post is not as long as the previous one. Here are some final observations from day 2.

PLM: The Foundation for Enterprise Digitalization

Peter Bilello from CIMdata gave an educational speech about digitalization and its impact on current businesses. Peter considers digitalization a logical next step in the PLM evolution process. See the picture below.

clip_image002

Although it is an evolution process, the implementation of this next step requires a revolution. Digitalization will create a disruption in companies, as the digital approach will reshape business models, internal business processes, and roles and responsibilities. Peter further elaborated on the product innovation platform and its required characteristics. Similar to what I presented on the first day, Peter concluded that we are in a learning stage of how to build a new methodology/infrastructure for PLM. For example, the concept of creating and maintaining a digital twin needs a solid foundation.
His conclusion: Digitalization requires PLM.

Boosting the value of PLM through Advanced Analytics Assessment

Paul Haesman from Autoliv introduced the challenges they have as a typical automotive company. Digitalization is reshaping the competitive landscape and increasing the demand for more technology, while still guaranteeing the highest safety levels of their products. In that context, they invited Tata Technologies to analyze their current PLM implementation and, from there, to provide feedback about their as-is readiness for the future.

Chris Hind from Tata Technologies presented their methodology, in which they provide benchmark information, a health check, the impact and a potential roadmap for PLM. A method that provides great insights for both parties, and I encourage companies that have not done such an assessment yet to invest in such an activity. The major value of a PLM assessment is that it provides an agreed baseline for the company, allowing management to connect the Why to the What and How. Often PLM implementations focus on the What and How without a real alignment to the Why, which results in unrealistic expectations or budgets based on the perceived value.

clip_image004

An interesting point addressed by Chris (see picture above) is that Document Management is considered a trending priority !!!

It illustrates that digitalization in PLM has not taken off yet and companies are still focusing on previous-century capabilities 😦

The second highlight, rating Manufacturing Process Management as the most immature PLM pillar, can be considered in the same context. PLM systems are still considered engineering systems, and manufacturing process management is in the gray area between PLM systems and ERP systems.

The last two bullets are clear. The roots of PLM are in managing quality and compliance and improving time to market.

Overcoming integration challenges – Outotec´s Digital Journey

Helena Gutiérrez and Sami Grönstand explained in an entertaining manner what Outotec does (providing technologies and services for the metal and mineral processing industries) and described their digital journey. Outotec has already been working for several years on simplifying their IT landscape, meanwhile trying to standardize the flow of information in a modern, data-driven manner.

Sami explained in great detail how the plant process definition is managed in PLM. The process definition is driven by the customer´s needs and largely defines the costs of the plant to build. This is crucial for the quotation phase, but also important if you want to create digital continuity. Next, the process definition is further detailed in steps, defining the key parameters and characteristics of the main equipment.

ElephantAndAnts

And then the challenge starts. In the context of the plant structure, the right equipment needs to be selected. Here is where plant meets product, or as the Outotec team said, where the elephant and the ants dance the tango.

In the end, as many standardized products as possible need to match the customer-specific solution. The dream of most of these companies: combining Engineering To Order and Configure To Order – and remember, this is in the context of digital continuity.

So far, this is a typical EPC (Engineering, Procurement, Construction) project; however, Outotec wants to extend the digital continuity to also support their customers´ installed plants. I remember one of their quotes from the past: “Buy one (plant) and get two (a real one and a virtual one).” This concept, managed in digital continuity, is something that will come up in many other industries – the digital twin.

clip_image008

Where companies like Outotec are learning to connect all data from the initiation of their customer-specific solution through delivery and services, other product manufacturing companies are researching the same digital continuity for their product offerings in the field or at the consumer. Thanks to digitization, these concepts become more and more similar. I wrote about this topic recently in my post PLM for Owner/Operators.

Final conclusion from PI Berlin 2017

It is evident that participants and speakers are talking about the strategic value and role PLM can have in an organization.

With digitalization, new possibilities arise where the need and value for end-to-end connectivity pop up in every industry.

We, the PLM community, are all learning and building new concepts. Keep sharing and meeting each other in blogs, forums, and conferences.

Last week I got the following question:

Many companies face challenges relevant to cooperation and joint ventures and need to integrate their portfolios in a smart way to offer integrated solutions. In the world of sharing and collaboration, this may be a good argument to dig into. Is PLM software ready for this challenge with best practice solutions, or is this a matter of specific development case by case? Any guidelines?

Some history

When PLM solutions were developed, their core focus was on bringing hardware products to the market in a traditional manner, as shown in the figure below.

clip_image001

Products were pushed to the market based on marketing research and closed innovation. Closed innovation meant companies were dependent on their internal R&D to provide innovative products. And this is the way most PLM systems are implemented: supporting internal development. Thanks to global connectivity, the internal development teams can collaborate, connected to a single PLM backbone/infrastructure.

Third Party Products (TPP) at that time were sometimes embedded in the EBOM, and during the development phase, there would be an exchange of information between the OEM and the TPP provider. Third Party Products were treated in a similar manner as purchased items. And as the manufacturing of the product was often defined in the ERP system, the contractual and financial interactions with the TPP provider were handled there, creating a discontinuity between what had been defined for the product and what was shipped. The disconnect between the engineering intent and the actual delivery to the customer was often managed in Excel spreadsheets or proprietary databases developed to soften the pain.

What is happening now?

In the past 10-15 years we have seen the growing importance of, first, electronic components and their embedded software, now followed by new go-to-market approaches, where the customer proposition changes from just a product towards a combined offering of hardware, software, and services. Let´s have a look at how this could be done in a PLM environment.

From Products to Solutions

The first step is to manage the customer proposition in a logical manner instead of managing everything in a BOM definition. In traditional businesses, most companies still work around multiple Bills of Materials. For example, read this LinkedIn post: The BOM is King. This approach works when your company only delivers hardware.

Not every PLM system supports a logical structure out-of-the-box. I have seen implementations where this logical structure was stored in an external database (not preferred) or as a customized structure in the PLM system. Even in SmarTeam, this methodology was used to support Asset Lifecycle Management. I wrote about this concept in early 2014 in the context of Service Lifecycle Management (SLM) in two posts: PLM and/or SLM ? and PLM and/or SLM (continued). It is no coincidence that the concepts used for connecting SLM to PLM are similar to those for defining customer propositions.

In the figure to the left, you can see the basic structure to manage a customer proposition and how it would connect to the aspects of hardware, software, and services. In an advanced manner, the same structure could be used with configuration rules to define and create a portfolio of propositions. More about the potential of this topic in a future blog post.
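
To make the idea of such a logical structure more tangible, here is a minimal sketch in Python, with hypothetical class, attribute and reference names. It is an illustration of the concept only, not any specific PLM vendor's data model: the proposition node governs the offering, while each element simply references its source of truth (an EBOM item, a software release, a service contract).

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PropositionElement:
    """One node in the logical structure: hardware, software or service."""
    name: str
    element_type: str   # "hardware" | "software" | "service"
    reference: str      # link to an EBOM item, software release or service contract
    children: List["PropositionElement"] = field(default_factory=list)

@dataclass
class CustomerProposition:
    """The customer-facing offering, governing the underlying deliverables."""
    name: str
    elements: List[PropositionElement] = field(default_factory=list)

# Hypothetical example: one proposition combining the three aspects
proposition = CustomerProposition(
    name="Smart Pump Offering",
    elements=[
        PropositionElement("Pump unit", "hardware", "EBOM-4711"),
        PropositionElement("Monitoring app", "software", "SW-REL-2.3"),
        PropositionElement("Remote diagnostics", "service", "SRV-CT-001"),
    ],
)

for e in proposition.elements:
    print(f"{proposition.name}: {e.element_type:<9} -> {e.name} ({e.reference})")
```

In a real implementation, configuration rules on top of such a structure would determine which combinations of elements form a valid proposition for a specific customer.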

For hardware, most PLM systems have their best practices based on the BOM, as discussed before. When combining the hardware with embedded software, we enter the world of systems. The proposition is no longer a product; it becomes a system or even an experience.

For managing systems, I see two main additions to the classical PLM approach:

  1. The need for connected systems engineering. As the behavior of the system is much more complicated than that of just a hardware product, companies discover the need to spend more time on understanding all the requirements for the system and its potential use cases in operation – the only way to define the full experience. Systems engineering practices originating from automotive & aerospace are now entering the world of high-tech, industrial equipment, and even consumer goods.
  2. The need to connect software deliverables. Software introduces a new challenge for companies, no matter if the software is developed internally or embedded through TPPs. In both situations, there is the need to manage change in a fast and iterative manner. Classical ECR/ECO processes do not work here anymore. Working agile and managing a backlog becomes the mode of operation. Application Lifecycle Management connected to PLM becomes a need (see the sketch after this list).
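
To illustrate the second point, here is a minimal sketch (Python, with hypothetical identifiers and field names) of what connecting a software deliverable from ALM to a product item in PLM could look like. The essential idea: the fast-cycling software release is linked as a reference to the slower-cycling hardware item, instead of being forced through the classical ECR/ECO cycle.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class SoftwareRelease:
    """A deliverable managed in ALM, iterating at its own (fast) pace."""
    alm_id: str      # e.g. a build or release identifier in the ALM system
    version: str
    released_on: date

@dataclass
class ProductItem:
    """An item managed in PLM, changed through the classical ECR/ECO cycle."""
    item_number: str
    revision: str
    embedded_software: List[SoftwareRelease]

controller = ProductItem(
    item_number="CTRL-100",
    revision="B",
    embedded_software=[SoftwareRelease("ALM-BUILD-1543", "2.3.1", date(2017, 3, 10))],
)

# A new software release can be connected without revising the hardware item
controller.embedded_software.append(
    SoftwareRelease("ALM-BUILD-1602", "2.4.0", date(2017, 4, 2))
)
print(f"{controller.item_number} rev {controller.revision} runs "
      f"software {controller.embedded_software[-1].version}")
```

The value is traceability: at any moment you can report which software versions are valid for which hardware revision, without forcing both domains into the same change process.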

In both domains, systems engineering and ALM, PLM vendors have their offerings, and on the marketing side they might all look the same to you. However, there is a fundamental need that is not always visible on the marketing slides: the need for complete openness.

Openness

To manage a portfolio based on systems, a company can no longer afford to manually check, in multiple management systems, all the dependencies between the product and its components combined with the software deliverables and TPPs. Automation, traceability of changes and notifications are needed in a modern, digital environment, which you might call a product innovation platform. My high-speed blog buddy Oleg Shilovitsky just dedicated a post to “The Best PLM for Product Innovation Platform”, sharing several quotes from CIMdata´s talk about the characteristics of a Product Innovation Platform and stressing the need for openness.

It is true: if you can only manage your hardware (mechanics & electronics) and software in dedicated systems, your infrastructure will be limited and rigid, as the outside world is changing constantly and fast. No ultimate solution or product does it all, or will do it all in the future. Therefore openness is crucial.

Services

In several companies, originally in the Engineering, Procurement & Construction industry, I have seen the need to manage services in the context of the customer delivery too. Highly customized and/or disconnected systems were used here. I believe the domain of managing a proposition, a combination of hardware, software, AND services in a connected environment, is still in its early days. Hence the question marks in the diagram.

Conclusion

How Third Party Products are supported by PLM depends very much on the openness of the PLM system, how it connects to ALM, and how the PLM system is able to manage a proposition. If your PLM system has been implemented as a supporting infrastructure for engineering only, you are probably not ready for the modern digital enterprise.

Other thoughts ???

Two terms pass by me every day: Digital Transformation, which appears in every business discussion, and IP Security, a topic also discussed in all parts of society. We realize it is easy to steal electronic data without being detected (immediately).

What is Digital Transformation?

Digital Transformation is reshaping business processes to enable new business models, create a closer relation with the market, and react faster while reducing the inefficiencies of collecting, converting and processing analog or disconnected information.

Digital Transformation became possible thanks to the lower costs of technology and global connectivity, allowing companies, devices, and customers to interact in almost real-time when they are connected to the internet.

IoT (Internet of Things) and IIoT (Industrial Internet of Things) are terms closely related to Digital Transformation. Their focus is on creating connectivity with products (systems) in the field, providing a tighter relationship with the customer and enabling new (upgrade) services to achieve better performance. Every manufacturing company should be exploring IoT and IIoT possibilities now.

Digital Transformation is also happening in the back office of companies. The target is to create a digital data flow inside the company and with the outside stakeholders, e.g., customers, suppliers, authorities. The benefits are mainly improved efficiency, faster response and higher quality interaction with the outside world.

The part of Digital Transformation that concerns me the most is the domain of PLM. As I have stated in earlier posts (Best Practices or Next Practices ? / What is Digital PLM ?), the need is to replace the classical document-driven product-to-market approach with a modern, data-driven interaction of products and services.

I am continually surprised that companies with an excellent Digital Transformation profile on their websites have no clue about Digital Transformation in their product innovation domain. Marketing is faster than reality.

I am happy to discuss this topic with many of my peers in the product innovation world @ PI Berlin 2017, three weeks from now. I am eager to learn how and why companies do not embrace Digital Transformation sooner and faster. The theme of the conference, “Digital Transformation: From Hype to Value”, says it all. You can find the program here, and I will report about this conference the weekend after.

IP Security

The topic of IP protection has always been high on the agenda of manufacturing companies. Digital Transformation brings new challenges. Digital information will be stored somewhere on a server and will probably be connected to the internet through firewalls. Some industries have high-security policies, with separate networks for their operational environments. Still, many large enterprises are currently struggling with IP security policies, as sharing data while protecting IP between various systems creates a lot of administration per system.

Cloud solutions for sharing data are still a huge security risk. Where is the data stored, and who else has access to it? Dropbox was in the news recently as “deleted” data came back after five years, “due to a bug.” Cloud data sharing cannot be trusted for really sensitive information.

Cloud providers always claim that their solutions are safer due to their strict safety procedures compared to the careless behavior of employees. And this is true. For example, a company I worked with had implemented Digital Rights Management (DRM) for internal sharing of their IP, making sure that users could only read information on the screen and not store it locally. When they had an issue with the server: “No problem”, one of the employees said, “I have a copy of the documents here on my USB-drive.”

Cloud-based PLM systems are supposed to be safer. However, it still matters where the data is stored; security and hacking policies vary per country. Assume your company´s IP is safe from hacking. Then the next question is: “How about ownership of your data?”

Vendor lock-in and ownership of data are topics that always come back at the PDT conferences (see my post on PDT2016). When a PLM cloud provider stores your product data in a proprietary data format, you will always be forced into a costly data migration project when you decide to move away from that provider.

Why not use standards for data storage? Hakan Kårdén triggered me on this topic again with his recent post: Data Is The New Oil So Make Sure You Ask For The Right Quality.

 

Conclusion:

Digital Transformation is happening everywhere, but not always with the same pace and focus. New PLM practices still need to be implemented on a larger scale to become best practices. Digital information in the context of Intellectual Property creates extra challenges to be solved. Cloud providers do not yet offer solutions that are safe and avoid vendor lock-in.

Be aware. To be continued…

Many thanks (again) to Dick Bourke for his editing suggestions

First, Happy New Year!! I wish all my readers a healthy, happy and successful 2017. Increasing your understanding of modern PLM based on field experiences is my pledge to you this year. PLM as part of a business strategy is mentioned more and more at management level in companies. However, the meaning and impact of PLM can be diffuse, therefore requiring more clarification for management. To save you time, I'm pleased to share some images/slides I have used to explain fundamental PLM concepts. Use them in your PLM meetings.

People, Processes, and Tools

PeopleProcessTools

A company should not implement a PLM system just because people say they need a PLM system. Most likely, PLM supports a business transformation, enabling new ways of working and new business processes.

PeopleProcessToolsTweet

Read more related to People, Processes, and Tools:

Old and New PLM

OldNewPLM

When your company wants to implement PLM today, it is important to realize that all businesses are transitioning from old linear processes, pushing products to the market, towards incremental, customer-oriented processes. With a change from a document-driven approach towards a data-driven approach, implementing PLM requires a new approach:

Read more about how PLM is changing:

PLM Selection

PLM selection

Selecting the right PLM system is just the tip of the iceberg. Most PLM systems have a lot of functionality in common. Therefore, when selecting a PLM system, take into account the topics below the waterline. The deeper you go, the more important they are for a successful PLM implementation.

Read more about PLM selection:

The Maturity of an Organization

Gartner maturity

Can you run before you can walk? Is PLM only valid for large companies? I do not think so. Large companies usually have a higher need to make their products less dependent on specific individual skills. Therefore, they will focus more on repeatable processes and, as next steps, on integrating internally and externally. This slide was presented by Marc Halpern at PDT2015 and illustrates the maturity journey a company can grow through, and how this journey affects the focus for PDM, PLM, and future integration.

Read more about the PLM journey and Maturity:

Don’t Choose the Easiest Path

gartner benefits

Another “classical” Gartner slide explaining what everybody knows, yet what most companies fail to do. There are two important messages in this slide:

  • Every change in technology will cause a dip in the company’s performance. Give your people the time to adapt by changing performance KPIs for that period
  • Introducing new technology combined with introducing new processes and a change in culture will bring the highest value

Read more about Cultural Change:

Digital Transformation is Coming

The world is becoming rapidly digital. Digitization is destroying jobs that can be automated. A great article about the onrushing wave can be found in the Economist, describing which jobs are likely to stay and which are likely to disappear. And, disappearing jobs will not come back again as some populists might promise. The good news, however, is that new business models and processes require many new jobs for which we are not educated (yet). Self-learning becomes crucial.

Read more about Digital Transformation:

Evolution, Disruption or Bimodal?

bimodal

Companies that have implemented their classical PLM environment often fail to move to a PLM infrastructure supporting modern, customer-driven delivery of products and services. The evolutionary approach takes too long; the alternative is to disrupt your business. Or try a bimodal PLM approach. The bimodal PLM approach is inspired by Gartner's bimodal IT approach.

Read more about Disruption or Bimodal:

See You Soon?

2017 is going to be an interesting and challenging year for all of us. What will be the further impact of digitization on your business? Will we tweet our PLM strategy in the future? I hope to discuss these developments with you on my blog and during the upcoming PI Berlin.

PIBerlin2017

Let’s communicate !

PLM can be swinging and inspiring, although there will be times of frustration and stress when implementing. These seven musical views will help you make it through the project.

 

One Vision

Every business change should start with a vision and a strategy. Defining the vision and keeping the vision alive is the responsibility of senior management. When it comes to PLM, the vision is crucial.

 

No more heroes

Of course, when implementing PLM, the target is to streamline the organization’s processes, eliminate bottlenecks and reduce dependencies on individuals. No more need for firefighters or other heroes because they fix or solve issues that appear due to the lack of processes and clarity.

 

Let´s do it together

PLM implementations are not IT-projects, where you install, configure and roll out an infrastructure based on one or more systems. Like a music band, it should be a well-orchestrated project between business experts and IT. Here´s a song to make your project swing.

 

Say NO at the right time

When implementing PLM, the software geeks can do everything for you: customize the system, create a completely new environment looking like the old environment, and more. Of course, you will pay for it. Not only for the extra services, but also in the long term to support all these customizations. Always try to find a balance between the standard functionality and infrastructure of the PLM system and the company´s vision. This means there are times you must say NO to your users. Maybe not always as funny as these guys say it.

 

Eight days a week

During the PLM implementation, and for sure after one of the several rollouts, changes may appear. And normal work still needs to be done, sometimes in a different way. There will never be enough time to do everything perfectly and fast, and it feels like you need more days in the week. When you are stressed, swing with these guys.

 

We are the champions

Then when the PLM project has been implemented successfully, there is a feeling of relief. It has been a tough time for the company and the PLM team. This should be the moment for the management to get everyone together in the stadium as an important change for the company´s future has been realized. Sing all together.

 

… But the times they are a-changing

Although a moment of relief is deserved, PLM implementations never end. The current infrastructure could be improved continuously due to better business understanding. However, globalization and digitalization will create new business challenges and opportunities at an extraordinarily fast pace. So, be aware and sing along with Bob.

 

BONUS

Time to close the 2016 book and look forward to next year’s activities. I wish all my readers happy holidays and a healthy, successful new year with a lot of dialogue, and no more one-liners.

 

See you in 2017 !!!!

Recently, I have written about classical PLM (document-driven and sequential) and modern PLM (data-driven and iterative) as part of the upcoming digital transformation that companies will have to go through to be fit for the future. Some strategic consultancy companies, like Accenture, talk about Digital PLM when referring to a PLM environment supporting the digital enterprise.

 

From classical PLM to Digital PLM?

The challenge for all companies is to transform their businesses to become customer-centric and to find a transformation path from the old legacy PLM environment towards the new digital environment. Companies want to do this in an evolutionary mode. However, my current observation is that the pace of an evolutionary approach is too slow compared to what happens in their market. This time the change is happening faster than before.

A Big Bang approach towards the new environment seems to be a big risk. History has taught us that this is very painful and costly, so it is to be avoided too. What remains is a kind of bimodal approach, which I introduced in my recent blog posts (Best Practices or Next Practices). Although one of my respected readers and commenters, Ed Lopategui, mentioned in his comment (here) that bimodal is another word for coexistence. He is not optimistic about this approach either.

So, what remains is disruption?

Disruption is a popular word, and my blog buddy Oleg Shilovitsky recently dived into that topic again with his post: How to displace CAD and PLM industry incumbents – an interesting post about disruption and disruption patterns. My attention was caught by the words: digital infrastructure.
I quote:

How it might happen? Here is one potential answer – digital infrastructure. Existing software is limited to CAD files stored on a desktop and collaboration technologies developed 15-20 years using relational database and client-server architecture.

Digital Infrastructure

As I mentioned, the words Digital Infrastructure triggered me to write this post. At this moment, I see companies marketing their Digital Transformation story in a slick way, supported by all the modern buzzwords like customer-centric, virtual twin and data-driven. You would imagine as a PLM geek that they have already made the jump from the old document-driven PLM towards modern digital PLM. So what does a modern digital PLM environment look like?

The reality, however, behind this slick marketing curtain, is that there are still the old legacy processes, where engineers are producing drawings as output for manufacturing. Because drawings are still legal and controlled information carriers. There is no digital infrastructure behind the scenes. So, what would you expect behind the scenes?

Model-Based Definition as part of the digital infrastructure

Crucial to be ready for a digital infrastructure is to transform your company´s product development process from a file-based process, where drawings are leading, towards a model-based enterprise. The model needs to be the leading authority (single source of truth) for PMI (Product Manufacturing Information) and potentially for all upfront engineering activities. In that case, we call it Model-Based Systems Engineering, sometimes called RFLP (Requirements-Functional-Logical-Product), where even the product can be analyzed and simulated directly based on the model.

A file-based process is not part of a digital infrastructure or a model-based enterprise architecture. File-based processes force the company to have multiple instances and representations of the same data in different formats, creating an overhead of work to keep up the quality and correctness of data, which is never 100% secure. A digital infrastructure works with connected data in context.

Therefore, if your company is still relying on drawings and you want to be ready for the future, a first step towards a digital infrastructure would be fixing your current processes to become model-based. Some good introductions can be found at ENGINEERING.com – search for MBD and you will find:

Moving to Model-Based is already a challenging transformation inside your company, even before touching the challenge of moving towards a full digital enterprise through an evolution, disruption or bimodal approach – let the leading companies show the way.

Conclusion

Companies should consider and investigate how to use a Model-Based Engineering approach as a first step to becoming lean and fit for a digital future. The challenge will be different depending on the type of industry and product.
I am curious to learn from my readers where they are on the path to a digital enterprise.

In my earlier post The weekend after PDT Europe I wrote about the first day of this interesting conference. We ended that day with some food for thought related to a bimodal PLM approach. Now I will take you through the highlights of day 2.

Interoperability and openness in the air (aerospace)

I believe Airbus and Boeing are two of the most challenged companies when it comes to PLM. They have to cope with many stakeholders and a massive number of suppliers involved, constrained by a strong focus on safety and quality. And as airplanes have a long lifetime, the need to keep data accessible and available for over 75 years is a massive challenge. The morning was opened by presentations from Anders Romare (Airbus) and Brian Chiesi (Boeing), where they confirmed they could switch the presenter´s role between them as the situations at Airbus and Boeing are so alike.

Anders Romare started with a presentation called Digital Transformation through an e2e PLM backbone, where he explained the concept of extracting data from the various silo systems in the company (CRM, PLM, MES, ERP) to make data available across the enterprise. In particular, in their business transformation towards digital capabilities, Airbus needed and created a new architecture on top of the existing business systems, focusing on data (“Data is the new oil”).

In order to achieve a data-driven environment, Airbus extracts and normalizes data from their business systems and provides a data lake with integrated data, on top of which various apps can run to offer digital services to existing and new stakeholders on any type of device. The data-driven environment allows people to have information available in context and almost in real-time to make the right decisions. Currently, these apps run on top of this data layer.

AirbusPDT2016

Now imagine that information captured by these apps could be stored or directed back into the original architecture supporting the standard processes. This would be a real example of the bimodal approach as discussed on day 1. As a closing remark, Anders also stated that three years ago digital transformation was not really visible at Airbus; now it is a must.

Next, Brian Chiesi from Boeing talked about Data Standards: A strategic lever for Boeing Commercial Airplanes. Brian talked about the complex landscape at Boeing: 2500 applications / 5000 servers / 900 changes annually (3 per day) impacting 40,000 users. There is a lot of data replication because many systems need their own proprietary format. Brian estimated that where 12 copies exist now, in the ideal world 2 or 3 would do. Brian presented a similar future concept to Airbus, where the traditional business systems (Systems Engineering, PLM, MRP, ERP, MES) are all connected through a service backbone. This new architecture is needed to address modern technology capabilities (social / mobile / analytics / cloud / IoT / automation / …).

BoeingArchitecture

An interesting part of this architecture is that Boeing aims to exchange data with the outside world (customers / regulatory / supply chain / analytics / manufacturing) through industry-standard interfaces to have an optimal flow of information. Standardization would lead to a reduction of customized applications, minimize the costs of integration and migration, break the obsolescence cycle and enable future technologies. Brian knows that when companies pull for standards, vendors will deliver. Boeing will be pushing for standards in their contracts and will actively work together with five major Aerospace & Defense companies to define required PLM capabilities and have a unified voice towards PLM solution providers.

My conclusion on these two aerospace giants is that they both express the need to adapt and move to modern digital businesses, no longer following the linear approach of the classic airplane programs. Incremental innovation in various domains is the future. The existing systems need to be there to support their current fleet for many, many years to come. The new data-driven layer needs to be connected through normalization and standardization of data. For the future, a focus on standards is a must.

Simon Floyd from Microsoft talked about The Impact of Digital Transformation in the Manufacturing Enterprise, where he took us through Digital Transformation, IoT, and analytics in the product lifecycle, clarified by examples from the Rolls-Royce turbine engine. A good and compelling story, which could be used by any vendor explaining digital transformation and the relation to IoT. Next, Simon walked through the Microsoft portfolio and solution components to support a modern digital enterprise based on various platform services. At the end, Simon articulated how, for example, ShareAspace, based on Microsoft infrastructure and technology, can be an interface between various PLM environments throughout the product lifecycle.

Simon's presentation was followed by a panel discussion where the theme was: when are history and legacy an asset and a barrier to entry, and when do they become a burden and an invitation to future competitors?
Marc Halpern (Gartner) mentioned the bimodal thinking here again. Aras is bimodal. The classical PLM vendors, running in mode 1, will not change radically, and the new vendors, the mode 2 types, will need time to create credibility. Other companies mentioned here, like PropelPLM (PLM on the Salesforce platform) or OnShape, will battle the next five years to become significant and might disrupt.

Simon Floyd (Microsoft) mentioned that, in order to keep innovation within Microsoft, they allow for startups within the company, with no constraints from Microsoft in the beginning. This keeps disruption inside your company instead of being disrupted from the outside. Another point mentioned was that Tesla did not want to wait until COTS software would be available for their product development and support platform; therefore, they develop parts of it themselves. Are we going back to the early days of IT?

An interesting trend, I believe, too – provided the building blocks for such a solution architecture are based on open (standardized?) services.

Data Quality

After lunch, the conference was split into three streams, and I participated in the “Creating and managing information quality” stream. As I discussed in my presentation on day 1, there is a need for accurate data, starting a.s.a.p., as the future of our businesses will run on data, as we learned from all speakers (and this is not a secret – still many companies do not act).

In the context of data quality, Jean Brange from Boost presented the ISO 8000 framework for data and information quality management. This standard is now under development and will help companies to address their digital needs. The challenge of data quality is that we need to store data with the right syntax and semantics to be usable, and in addition it needs to be pragmatic: what are we going to store that will have value? And then there is the challenge of evaluating the content. Empty fields can be discovered; however, how do you qualify the quality of a field that has a value? The ISO 8000 framework is, like ISO 9000 (product quality), a framework that allows companies to work in a methodical way towards acceptable and needed data quality.
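
To give a feeling for what such checks mean in practice, here is a minimal, generic sketch in Python, with made-up rules and field names. It is my own illustration of syntax and semantic checks on field content, not the ISO 8000 specification itself.

```python
import re

# Hypothetical quality rules for a material master record:
# syntax    = does the value match the expected pattern?
# semantics = does the value make sense in its business context?
RULES = {
    "part_number": lambda v: bool(re.fullmatch(r"[A-Z]{2}-\d{6}", v or "")),
    "mass_kg":     lambda v: v is not None and 0 < v < 10_000,
    "material":    lambda v: v in {"steel", "aluminium", "polymer"},
}

def check_record(record: dict) -> list:
    """Return a list of quality issues for one record."""
    issues = []
    for field_name, rule in RULES.items():
        value = record.get(field_name)
        if value in (None, ""):
            issues.append(f"{field_name}: empty")
        elif not rule(value):
            issues.append(f"{field_name}: suspicious value {value!r}")
    return issues

print(check_record({"part_number": "AB-123456", "mass_kg": 12.5, "material": "steel"}))
print(check_record({"part_number": "abc", "mass_kg": -3, "material": "unobtanium"}))
```

The point is that beyond detecting empty fields, a rule still only qualifies a value as plausible; judging whether it is actually correct for the business remains the harder, methodological part that a framework like ISO 8000 tries to structure.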

iso8000

Magnus Färneland from Eurostep addressed the topic of data quality and the foundation for automation, based on the latest developments done by Eurostep on top of their already rich PLCS data model. The PLCS data model is an impressive model, as it already supports all facets of the product lifecycle, from design through development and operations. By introducing soft typing, Eurostep allows a more detailed tuning of the data model to ensure configuration management: at which stage of the lifecycle is certain information required (and becomes mandatory)? Consistent data quality is enforced through business process logic, as sketched below.
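
A simplified way to picture this idea of lifecycle-dependent data requirements (my own generic illustration, with hypothetical stages and attributes, not the PLCS or Eurostep implementation): attributes that are optional early in the lifecycle become mandatory before an object may be promoted to the next stage.

```python
# Hypothetical rule set: which attributes must be filled before an object
# may be promoted to a given lifecycle stage.
MANDATORY_BY_STAGE = {
    "concept":     {"name"},
    "design":      {"name", "owner", "requirement_refs"},
    "manufacture": {"name", "owner", "requirement_refs", "material", "supplier"},
}

def can_promote(obj: dict, target_stage: str) -> tuple:
    """Check whether all attributes required for the target stage are present."""
    required = MANDATORY_BY_STAGE[target_stage]
    missing = sorted(a for a in required if not obj.get(a))
    return (len(missing) == 0, missing)

item = {"name": "Valve housing", "owner": "J. Smith", "requirement_refs": ["REQ-12"]}
ok, missing = can_promote(item, "manufacture")
print(ok, missing)   # False ['material', 'supplier'] – data quality enforced by process logic
```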

The conference ended with Marc Halpern making a plea to Take Control of Your Product Data or Lose Control of Your Revenue, where Marc painted the future (horror) scenario that, due to digital transformation, the real “big fish” will be the digital business ecosystem owner, and that once you are locked in with a vendor, these vendors can uplift their prices to save their own business without any respect for your company's business model. Marc gave some examples where vendors raised prices with the subscription model by up to 40%. Therefore, even when you are just closing a new agreement with a vendor, you should negotiate a price guarantee and a certain bandwidth for increases. And on top of that, you should prepare an exit strategy – prepare data for migration and have backups using standards. Marc gave some examples of billions in extra costs related to data quality issues and data loss. It can hurt!! Finally, Marc ended with recommendations for master data management and quality as a needed company strategy.

GartnerSupscriptionModels

Gerard Litjens from CIMdata, as closing speaker, gave a very comprehensive overview of The Internet of Things – What does it mean for PLM? based on CIMdata's vision. As all vendors in this space explain the relation between IoT and PLM differently, it was a good presentation to be used as a base for the discussion: how does IoT influence our PLM landscape? Because of the length of this blog post, I will not go further into these details – it is worth obtaining this overview.

Concluding: PDT2016 is a crucial PLM conference for people who are interested in the details of PLM. Other conferences might address high-level customer stories; at PDT2016 it is about the details and sharing the advantages of using standards. Standards are crucial for a data-driven environment where business platforms, with all their constraints, will be the future. And I saw that more and more companies are working with standards in a pragmatic manner, observing the benefits and pushing for more data standards – it is not just theory.

See you next year ?

I am just back from the annual PDT conference (12th edition), this year hosted in Paris from 9 to 10 November, co-located with CIMdata's PLM Road Map 2016 for Aerospace & Defense. The PDT conference, organized by Eurostep and CIMdata, is a relatively small conference with a little over a hundred attendees. The attractiveness of this conference is that the group of dedicated participants is very interactive, sharing honest opinions and situations with each other, sometimes going very deep into the details needed to get the full picture. The theme of the conference was: “Investing for the future while managing product data legacy and obsolescence.” Here are some of the impressions from these days, giving you food for thought to join next year.

Setting the scene

Almost traditionally, Peter Bilello (CIMdata) started the conference, followed by Marc Halpern (Gartner). Their two presentations formed an excellent storyline together.

Peter Bilello started and discussed Issues and Remedies for PLM Obsolescence. It was not the first time Peter addressed PLM obsolescence. It is a topic many early PLM adopters are facing, and in a way the imminent obsolescence of their current environments blocks them from taking advantage of the new technologies and capabilities current PLM vendors offer. Having learned from the past, CIMdata provides a PLM Obsolescence Management model, which should be on every company's agenda, in the same way as data quality (which I will address later). Being proactive about obsolescence can prevent critical situations and high costs. From the obsolescence theme, Peter looked forward to the future and the value product innovation platforms can offer, given the requirement that data should be able to flow through the organization, connecting to other platforms and applications, which increases the demand to adhere to and push for standards.

Marc Halpern followed with his presentation, titled More custom products demand new IT strategies and new PLM applications, where he focused on the new processes and methodology needed for future businesses with a high focus on customer-specific deliveries, speed, and automation. Automation is always crucial to reducing production costs. In this delivery process, 3D printing could bring benefits, and Marc shared the pluses and minuses of 3D printing. Finally, when automation of a customer-specific order becomes possible, it requires a different IT architecture, as depicted by Marc. After proposing a roadmap for customizable products, Marc shared some examples of ROI benefits reported by successful transformation projects. Impressive!!

Gartner-ROI-samples

My summary of these two sessions is that both CIMdata and Gartner confirm the challenges companies face in changing their old legacy processes and PLM environments, which support the past, while moving to more customer-driven processes and modern, data-driven PLM functionality. This process is not just an IT or business change; it will also be a cultural change.

JT / STEP AP242 / PLCS

Next, we had three sessions related to standards. Alfred Katzenbach told the success story of JT, the investment done to get this standard approved and performing, based on an active community getting the most out of JT, beyond its initial purpose of viewing and exchanging data in a neutral format. Jean-Yves Delanaunay explained that in Airbus Operations the STEP AP242 definition is used as the core standard for 3D Model-Based Definition (MBD) exchange, part of the STEP standards suite, and as the cornerstone for Long Term Archiving and Retrieval of Aerospace & Defense 3D MBD.

There seems to be some rivalry between JT and STEP AP242 viewing capabilities, which goes beyond my understanding as I am not an expert in this field. Nigel Shaw ended the morning session positioning PLCS as a standard for interoperability of information along the whole lifecycle of a product. Having a standardized data model, as Nigel showed, would be a good common approach for PLM vendors to converge towards a more interoperable standard.

PLCS concept model

My summary of standards is that there is a lot of thinking, evaluation, and testing done by an extensive community of committed people. It will be hard for a company to define a better foundation for a standard in their business domain. Vendors are focusing on performance inside their technology offering and therefore will never push for standards (unless you use their products as a standard). A force for adhering to standards should come from the user community.

Using standards

After lunch, we had three end-user stories:

  • Eric Delaporte (Renault Group) talked about their NewPDM project and the usage of standards, mainly for exchanges. Two interesting observations: first, Eric talks about NewPDM – note the usage of the words New (when does New become regular?) and PDM (not talking about PLM?); secondly, as a user of standards he does not care about the JT/AP242 debate and uses both standards where applicable and performing.
  • Sebastien Olivier (France Ministry of Defense) gave a bi-annual update of their PLCS journey, used in two projects, Pencil (a standardized exchange platform and centralized source of logistical information) and MAPS (managing procurement contracts for buying In-Service Support services), and the status of their S3000L implementation (international procedure for Logistic Support Analysis). A presentation for the real in-crowd of this domain.
  • Juha Rautjarvi discussed how efficient use of knowledge for safety and security could be maintained and enhanced through collaboration. Here Juha talked about the Body of Knowledge, which should be available for all stakeholders in the context of security and safety. And like a physical product, this Body of Knowledge goes through a lifecycle, continuously adapting to what potentially arises from the outside world.

My conclusion on this part was that if you are not really into these standards on a day-to-day basis (and I am not), it is hard to pick up the details. Still, the higher-level thought processes behind these standards approaches allow you to see the benefits and impact of using standards, which is not the same as selecting a tool. It is a strategic choice.

Modular / Bimodular / not sexy ?

Jakob Asell from Modular Management gave an overview of how modularity can connect the worlds of sales, engineering, and manufacturing by adding a modular structure as a governing structure on top of the various structures used by each discipline. This product architecture can be used for product planning and provides end-to-end connectivity of information. Modular Management is assisting companies in moving towards this approach.

Next, my presentation, titled The importance of accurate data. Act now!, addressed the switch from classical, linear, document-driven PLM towards a modern, more incremental and data-driven PLM approach. Here I explained the disadvantages of the old evolutionary approach (impossible – too slow/too costly) and an alternative method, inspired by Gartner's bimodal IT approach (read my blog post: Best Practices or Next Practices). No matter which option you choose, correct and quality data is the oil for the future, so companies should consider the flow of data as a health issue for the future.

The day was closed with a large panel, where the panelists first jumped on the topic of bimodal (bipolar?? / multimodal??), talking about mode 1 (the strategic approach) and mode 2 (the tactical and fast approach, based on Gartner's definition). It was clear that the majority of the panel was in mode 1. This fluently led to the discussion of the usage of standards (and PLM) as not being attractive for the younger generations (not sexy), besides the conclusion that it takes time to understand the whole picture, see the valuable benefits a standard can bring, and join this enthusiasm.

panel-day 1

Conclusion

I realize that this post is already too long according to blogging guidelines. Therefore, I will tell you more about day 2 of the conference next week, with Airbus going bimodal and more.

Stay tuned for next week !

At this moment I am finalizing my session for PDT2016, where I will talk about the importance of accurate data. Earlier this year I wrote a post about that theme: The importance of accurate data. Act now!

My PDT session will be elaborating on this post, with a focus on why and how we need this change in day-to-day business happen. So if you are interested in a longer story and much more interesting topics to learn and discuss, come to Paris on 9 and 10 November.

Dreaming is free

Recently I found a cartoon on LinkedIn and shared it with my contacts, illustrating the optimistic view companies have when they are aiming to find the best solution for their business, going through an RFI phase, the RFP phase, and ultimately negotiating the final deal with the PLM solution provider or vendor. See the image below:

clip_image003

All credits to the author – I found this image here

The above cartoon gives a humoristic view of the (PLM) sales process (often true). In addition, I want to share a less optimistic view related to PLM implementations after the deal has been closed. Based on the PLM projects I have been coaching in the past, the majority of these projects ended up in stress mode once the stakeholders involved only focused on the software, the functions and features, and centralizing data. Implementing the software without a business transformation caused a lot of discomfort.

Users started to complain that the system did not allow them to do their day-to-day work in the same way. And they were right! They should have new day-to-day work in the future, with different priorities based on the new PLM infrastructure.

This cultural change (and business change) was often not considered, as the PLM system was implemented from an IT perspective, not from a business perspective.

Over time, with a better understanding of PLM and the fact that vendors and implementers have improved their portfolios and implementation skills, classical PLM implementations have become less disruptive.

The reason a classical PLM implementation can be done quickly is that the system most of the time does not change the roles and responsibilities of people. Everyone remains working in his/her own silo. The difference: we store information in a central place so it can be found. And this approach would have worked if the world were not changing.

The digital enterprise transformation.

With the upcoming digitization and globalization of the market, enterprises are forced to adapt their business to become more customer-driven. This will have an impact on how PLM needs to be implemented. I wrote about this topic in my post: From a linear world to fast and circular. The modern digital enterprise has new roles and responsibilities and will eliminate roles and responsibilities that can be automated through a data-driven, rule-based approach. Therefore, implementing PLM in a modern way should be driven by a business transformation and not the other way around!

Benefits realization

In the past two years, I have explained this story to all levels inside various organizations. And nobody disagreed. Redefining the processes and redefining roles was the priority. And we need a team to help people make this change – these people are change management experts. The benefits diagram from Gartner, shown below, was well understood, and most companies agreed the ambition should be to reach the top curve, or in any case to stay above the red curve.

clip_image007

But often reality relates to the first cartoon. In the majority of the implementations I have seen in the past two years, the company did not want to invest in change management, defining the new processes and new roles first for an optimum flow of information. They spent the entire budget on software and implementation services. With a minimum of staff, the technology was implemented based on existing processes – no change management at all. Disappointing, as short-term thinking destroyed the long-term vision, and the benefits were not as large as they had been dreaming of.

Without changing business processes and cultural change management, the PLM team will fight against the organization, instead of surfing on the wave of new business opportunities and business growth.

Conclusion

If your company is planning to implement modern PLM, which implicitly requires a business transformation, make sure cultural change management is part of your plan and budget. It will bring the real ROI. Depending on your company´s legacy, if a business transformation is a mission impossible, it is sometimes easier to start a new business unit with new processes, new roles and potentially new people. Otherwise, the benefits from your PLM implementation will remain (too) low.

I am curious to learn about your experience related to (the lack of) change management – how to include it in the real scope – your thoughts?

Addition:
As a reaction to this post, Oleg Shilovitsky wrote a related blog post: PLM and the death spiral of cultural change. See my response to this post below, as it will contribute to the understanding of this post.

Oleg, thanks for contributing to the theme of cultural change. Your post illustrates that my post was not clear enough, or perhaps too short. I do not believe PLM is that difficult because of technology; I would even claim that technology is at the bottom of my list of priorities. I am not stating it is not important, but meaning that by the time you are converging with a company on a vision for PLM, you probably know the kind of technologies you are going to use.

The highest priority, in my opinion, is currently the business transformation companies need to go through in order to adapt their business and remain relevant in a digital world. The transformation will require companies to implement PLM in a different manner: less silo-oriented, with more focus on value flows starting from the customer.

Working differently means cultural change, and a company needs to allocate time, budget and energy to that. The PLM implementation supports the cultural change; it does not drive the cultural change.

And this is the biggest mistake I have seen everywhere. Management decides to implement a new PLM as the driver for cultural change, instead of the result of cultural change. And the reason this is done is most of the time budget thinking, as cultural change is way more complex and expensive than a PLM implementation.

 

 

The past half-year I have been intensively discussing potential PLM roadmaps with companies of different sizes and different maturity in PLM. Some companies are starting their PLM journey after many years of discussing and trying to identify the need and scope, while others have an old PLM implementation (actually, most of the time it is cPDM) where they discover that business paradigms from the previous century are no longer sufficient for the future.

The main changing paradigms are:

  • From a linear, product-driven delivery process towards an individual, customer-focused offering based on products and effective services, quickly adapting to market needs.
  • From document-driven processes and systems based on exchanging electronic files towards data-driven platforms supporting information flowing in almost real-time through the whole enterprise.

Both changes are related and a result of digitization. New practices are under development as organizations are learning how to prepare and aim for the future. These new practices are currently dominating the agenda of all strategic consultancy firms, as you cannot neglect the trend towards a digital enterprise. And these companies need next practices.

I wrote about it in recent posts: PLM what is next? and What is Digital PLM?

And what about my company?

It is interesting to see that most of the PLM implementers and vendors are promoting best practices, based on their many years of experience working with customers who contributed to the functionality in their portfolio.

And it is very tempting to make your customer feel comfortable by stating:

“We will implement our (industry) best practices and avoid customization – we have done that before!”

I am sure you have heard this statement before. But what about these best practices as they address the old paradigms from the past?

Do you want to implement the past to support the future?

Starting with PLM ? Use Best Practices !

If the company is implementing PLM for the first time and the implementation is bottom-up, you should apply the old PLM approach. My main argument: this company is probably not capable of/ready for working in an integrated way. It is not in the company´s DNA yet. Sharing data and working in a controlled environment is a big step to take. Often PLM implementations have failed at this point because the cultural resistance was too big.

When starting with classical PLM, avoid customization and keep the scope limited. Horizontal implementations (processes across all departments) have more success than starting at engineering and trying to expand from there. An important decision to make at this stage is whether 2D is leading (old) or the 3D model is leading (modern). Some future thoughts: How Model-Based Definition can fix your CAD models. By keeping the scope limited, you can always evolve to the next practices in 5-10 years (if your company is still in business).

Note 1: the remark between parentheses is a little cynical and perhaps incorrect for that timeframe. Still, a company working bottom-up has challenges staying in a modern, competitive, global environment.

Note 2: When writing this post, I was notified about an eBook with the title Putting PLM within reach, written by Jim Brown. The focus is on cloud-based PLM solutions that require less effort/investment on the IT side, and as a side effect this discourages customization (my opinion) – therefore a good start.

Evolving in PLM – Next Practices

Enterprises that have already had a PDM/PLM system in place for several years should not implement the best practices. They have reached the level where the inhibitors of a monolithic, document-based environment are becoming clear.

They (must) have discovered that changing their product offering or their innovation strategy, now with partners, is adding complexity that cannot be supported easily. The good news: when you change your business model and product offering, there is C-level attention. This kind of change does not happen bottom-up.

Unfortunately, business changes are often discussed at the execution level of the organization without the understanding that the source of all product or offering data needs to be reorganized too. PLM should be a part of that strategic plan – and do not confuse the old PLM with the PLM for the future.

The PLM for the future has to be built upon next practices. These next practices do not exist out of the box. They have to be matured and experienced by leading companies. That is the price you pay for being a leader. Still, being a leader brings market share and profit that your company cannot achieve as a follower.

 

The Bi-modal approach

As management of a company, you do not want a disruptive switch from an existing environment to a new environment. Too much risk and too disruptive – people will resist – stress and bad performance everywhere. As the new data-driven approach is under development (we are learning), the end target is still moving.

Evolving with the old PLM system towards the new PLM approach is not recommended. This would be too expensive, slow and cumbersome. PLM would get a bad reputation, as all the complexity of the past and the future comes together here. It is better to start the new PLM with a new business platform and customer-oriented processes for a limited offering and connect it to your legacy PLM.

Over the years, the new PLM will become clearer and grow, while the old PLM will become less and less relevant. Depending on the dynamics of your industry, this might take a few years to decades.

bimodal
Gartner calls this the bimodal approach. A bimodal approach requires orchestration and needs full management attention, as the future is going to be shaped here.

It must and will be a business-driven learning path for new best practices

 

Conclusion

Best Practices and Next Practices are needed in parallel. Depending on the maturity and the (lack of) information sharing in your company, you can choose. Consider the bimodal approach to choose a realistic time path.

What do you think? Could this simplified way of thinking help your company?

The theme Best Practices or Next Practices is not new. Prof. Krusse talked about it already 9 years ago in a generic way. Unfortunately, the recording is in German only.