
It is already the 6th consecutive year that MarketKey has organized the Product Innovation conference, with its primary roots in PLM. For me, the PI conferences have always been a checkpoint for changes and progress in the field.

This year about 100 companies participated in the event with the theme: Digital Transformation: From Hype to Value? Sessions were split into three major streams: digital transformation, extended PLM, and Business Enabled Innovation, interspersed with general keynote speeches. I wanted to attend all sessions (and I will do so virtually later through PI.TV), but in this post my observations come from the highlights of the extended PLM sessions.

From iCub to R1

Giorgio Metta gave an overview of the RobotCub project, where teams are working on developing a robot that can support human beings in our day-to-day lives. Some of us are used to industrial robots and understand their constraints. A robot that interacts with human beings is far more complex, and its development is still in the early stages. This type of robot needs to learn and interpret its environment while remaining accurate and safe for the persons interacting with it.

One of the interesting intermediate outcomes of the project is that a human-like robot with legs and arms is far too expensive and complicated to handle. Excellent for science fiction movies, but in reality its balance and movements are too difficult to control.

This was an issue with the iCub robot. Now Giorgio and the teams are working on the new R1 robot, maybe not "as-human" as the iCub robot, but more affordable. It is not only the mechanics that challenge the researchers. The software supporting the artificial intelligence required for a self-learning, well-performing and safe robot is also still in its early days.


An inspiring keynote speech to start the conference.

Standardizing PLM Components

The first extended PLM session came from Guido Klette (Rheinmetall), describing the challenges the Rheinmetall group has in developing and supporting its PLM needs. The group has several PDM/PLM-like systems in place. Guido does not believe one size fits all can serve every business in the group; they already have several PLM "monsters" in their organization. For more adequate support, Rheinmetall has defined a framework of PLM components and dependencies, allowing a more granular choice of functionality to meet the needs of individual businesses.
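To make the component idea a bit more tangible, here is a minimal sketch in Python (my own illustration, not Rheinmetall's actual framework): a catalog of PLM components with dependencies, resolved into the full set a single business unit would need. The component names and dependencies are invented.

```python
# Hypothetical catalog: each PLM component lists the components it depends on.
# Names and dependencies are invented for illustration only.
COMPONENTS = {
    "item_management": [],
    "document_management": ["item_management"],
    "bom_management": ["item_management"],
    "change_management": ["item_management", "document_management"],
    "supplier_collaboration": ["document_management"],
}

def resolve(selected):
    """Return the selected components plus all their transitive dependencies."""
    needed = set()

    def visit(name):
        if name not in needed:
            needed.add(name)
            for dependency in COMPONENTS[name]:
                visit(dependency)

    for name in selected:
        visit(name)
    return needed

# A business unit asking only for change management still gets its prerequisites.
print(sorted(resolve({"change_management"})))
# ['change_management', 'document_management', 'item_management']
```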

Rheinmetal components

A challenge for this approach, identified by a question from the audience, is that it is a very scientific approach that does not address the differences in culture between countries. Guido agreed and mentioned that, cultural differences aside, companies joining the Rheinmetall group are most of the time happy to adhere to such a structured approach.

My takeaway: the component approach fits very well with the modern thinking that PLM should not be supported by a single "monster" system but can be addressed by components that, in the end, provide the right business process support.

PLM as a business asset

Björn Axling gave an excellent presentation describing the PLM perspective from the Husqvarna group. He addressed the external and internal challenges and opportunities for the group in a structured and logical way which probably applies to most manufacturing companies in a global market. Björn explained that in the Husqvarna group PLM is considered a business approach; more than ever, Product Lifecycle Management needs to be viewed as the DNA of a company, which was the title of one of his slides.

Husqvarna

I liked his eleven key imperatives (see the picture above), in particular key imperative #9, which is often forgotten:

Take definitions, nomenclature and data management very seriously – the devil is in the details.

This point will always backfire on you if you do not give it the needed attention from the start. Of course, the other ten points are also relevant. The challenge in every PLM project is to get these points addressed and understood in your company.

How to use PLM to enable Industry 4.0?

Martin Eigner's presentation built upon his consistent message that PDM and PLM should evolve into SysLM (System Lifecycle Management), with a growing need for Model-Based Systems Engineering (MBSE) support.

The title of the presentation was related to Industry 4.0, focusing on innovation for Germany's manufacturing industry. Germany has always been strong in manufacturing, less so in product innovation. Martin mentioned that later this year the German government will start another initiative, Engineering 4.0, which should be exciting for our PLM community.

Martin elaborated on how end-to-end support for SysLM can be achieved through a backbone based on linked data. Do not try to solve all product information views in a single system is the lesson learned and preached.
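As a thought experiment only (my own sketch, not Martin's architecture), the linked-data idea can be reduced to a few lines: every system keeps its own records, and a thin backbone stores nothing but typed links between stable identifiers. All system names, identifiers and fields below are invented.

```python
# Each system keeps its own data; the backbone only stores links between stable IDs.
requirements = {"REQ-001": {"text": "Maximum weight 1.2 kg"}}
cad = {"CAD-100": {"file": "bracket.prt", "mass_kg": 1.05}}
erp = {"MAT-9000": {"description": "Bracket, aluminium", "cost_eur": 12.40}}

SYSTEMS = {"req": requirements, "cad": cad, "erp": erp}

# The backbone: typed links between records living in different systems.
LINKS = [
    ("req:REQ-001", "satisfied_by", "cad:CAD-100"),
    ("cad:CAD-100", "realized_as", "erp:MAT-9000"),
]

def dereference(uri):
    """Fetch a record from the system that owns it instead of copying it."""
    system, key = uri.split(":")
    return SYSTEMS[system][key]

def related(uri):
    """All records linked from the given record, together with the link type."""
    return [(rel, dereference(target)) for source, rel, target in LINKS if source == uri]

print(related("req:REQ-001"))
# [('satisfied_by', {'file': 'bracket.prt', 'mass_kg': 1.05})]
```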

Eigner-Bimodal

For me, it was interesting to see that Martin also picked up on the bimodal approach for PLM, required to support a transition to a modern digital enterprise (see picture). We cannot continue to build upon our old PLM environments to support future digital businesses.

PLM and Digital Transformation

In my afternoon session (Jos Voskuil), I shared the observation that companies invest a lot in digital transformation downstream by introducing digital platforms for ERP, CRM, MES and Operations. PLM is often the forgotten platform that needs to change to support a digital enterprise with all its benefits. You can see my presentation here on SlideShare. Here I addressed the bimodal approach as discussed in a previous blog post, Best Practices or Next Practices.


In case your company is not ready yet for a digital transformation or a bimodal approach, I addressed the need to become model-driven instead of document-driven. And of course, for a digital enterprise the quality of the data counts. I wrote about these topics recently: Digital PLM requires a Model-Based Enterprise and The importance of accurate data: ACT NOW!

Closed-Loop PLM

The last extended PLM presentation of day 1 was given by Felix Nyffenegger, professor for PLM/CAx at HSR (University of Applied Sciences in Rapperswil, Switzerland). Felix shared his discovery journey into Industry 4.0 and IoT, combined with experiences from the digitalLab@HSR, leading to the concept of closed-loop PLM.

ClosedLoop

I liked in particular how Felix brought the various views on the product together into one diagram, telling the full story of closed-loop PLM – necessary for a modern implementation framework.

A new age for airships

The last presentation of the day was from Chris Daniels, describing the journey of Hybrid Air Vehicles with their Airlander 10 project. Where classical airships, the most infamous perhaps being the Hindenburg, have disappeared due to their flaws, the Hybrid Air Vehicles team built upon the airship concept in a defense project with the target of delivering a long-endurance, multi-intelligence vehicle. The advantage of airships is that they can stay in the air for several days, serving as a communication hotspot or rescue ship for places hard to reach with traditional aircraft or helicopters. The Airlander can operate for 5 days without going back to a base, which is extremely long when you compare this to other aircraft.

airlander

The Airlander project is a typical example of incremental innovation used to optimize and extend the purpose of an airship. The fact that Chris was an excellent speaker made it a great closing of the day.

Conclusion

This post is just an extract of one day and one stream of the conference, and it is already too long for a traditional blog post. Next week I will follow up with day two and respond beyond 140 characters to the tweet below:

WhyNotInPLM

Two terms cross my path every day: Digital Transformation, which appears in every business discussion, and IP Security, a topic also discussed in all parts of society. We realize it is easy to steal electronic data without being detected (immediately).

What is Digital Transformation?

Digital Transformation is reshaping business processes to enable new business models, create a closer relation with the market, and react faster while reducing the inefficiencies of collecting, converting and processing analog or disconnected information.

Digital Transformation became possible thanks to the lower costs of technology and global connectivity, allowing companies, devices, and customers to interact in almost real-time when they are connected to the internet.

IoT (Internet of Things) and IIoT (Industrial Internet of Things) are terms closely related to Digital Transformation. Their focus is on creating connectivity with products (systems) in the field, providing a tighter relation with the customer and enabling new (upgrade) services to gain better performance. Every manufacturing company should be exploring IoT and IIoT possibilities now.

Digital Transformation is also happening in the back office of companies. The target is to create a digital data flow inside the company and with the outside stakeholders, e.g., customers, suppliers, authorities. The benefits are mainly improved efficiency, faster response and higher quality interaction with the outside world.

The part of Digital Transformation that concerns me the most is the domain of PLM. As I have stated in earlier posts (Best Practices or Next Practices? / What is Digital PLM?), the need is to replace the classical document-driven product-to-market approach with a modern data-driven interaction of products and services.

I am continually surprised that companies with an excellent Digital Transformation profile on their websites have no clue about Digital Transformation in their product innovation domain. Marketing is faster than reality.

I am happy to discuss this topic with many of my peers in the product innovation world @ PI Berlin 2017, three weeks from now. I am eager to learn how and why companies do not embrace Digital Transformation sooner and faster. The theme of the conference, "Digital Transformation: From Hype to Value," says it all. You can find the program here, and I will report about this conference the weekend after.

IP Security

The topic of IP protection has always been high on the agenda of manufacturing companies. Digital Transformation brings new challenges. Digital information will be stored somewhere on a server, probably connected to the internet through firewalls. Some industries have high-security policies, with separate networks for their operational environments. Still, many large enterprises are currently struggling with IP security policies, as sharing data while protecting IP between various systems creates a lot of administration per system.

Cloud solutions for sharing data are still a huge security risk. Where is the data stored, and who else has access to it? Dropbox was in the news recently as "deleted" data came back after five years, "due to a bug." Cloud data sharing cannot be trusted for really sensitive information.

Cloud providers always claim that their solutions are safer thanks to their strict security procedures, compared to the careless behavior of employees. And this is true. For example, a company I worked with had implemented Digital Rights Management (DRM) for internal sharing of their IP, making sure that users could only read information on the screen and not store it locally. When they had an issue with the server: "No problem," one of the employees said, "I have a copy of the documents here on my USB drive."

Cloud-based PLM systems are supposed to be safer. However, it still matters where the data is stored; security and hacking policies vary per country. Assume your company's IP is safe from hacking. Then the next question is: "How about ownership of your data?"

Vendor lock-in and ownership of data are topics that always come back at the PDT conferences (see my post on PDT2016). When a PLM cloud provider stores your product data in a proprietary data format, you will always be forced into a costly data migration project when you decide to leave the provider.

Why not use standards for data storage? Hakan Kårdén triggered me on this topic again with his recent post: Data Is The New Oil So Make Sure You Ask For The Right Quality.

 

Conclusion:

Digital Transformation is happening everywhere, but not always at the same pace and with the same focus. New PLM practices still need to be implemented on a larger scale to become best practices. Digital information in the context of Intellectual Property creates extra challenges to be solved. Cloud providers do not yet offer solutions that are safe and avoid vendor lock-in.

Be aware. To be continued…

Many thanks (again) to Dick Bourke for his editing suggestions

PLM can be swinging and inspiring although there will be times of frustration and stress when implementing. These seven musical views will help you to make it through the project.

 

One Vision

Every business change should start with a vision and a strategy. Defining the vision and keeping the vision alive is the responsibility of senior management. When it comes to PLM, the vision is crucial.

 

No more heroes

Of course, when implementing PLM, the target is to streamline the organization's processes, eliminate bottlenecks and reduce dependencies on individuals. No more need for firefighters or other heroes who fix or solve issues that appear due to a lack of processes and clarity.

 

Let´s do it together

PLM implementations are not IT-projects, where you install, configure and roll out an infrastructure based on one or more systems. Like a music band, it should be a well-orchestrated project between business experts and IT. Here´s a song to make your project swing.

 

Say NO at the right time

When implementing PLM, the software geeks can do everything for you: customize the system, create a completely new environment that looks like the old one, and more. Of course, you will pay for it, not only for the extra services but also in the long term to support all these customizations. Always try to find a balance between the standard functionality and infrastructure of the PLM system and the company's vision. This means there are times you must Say NO to your users. Maybe not always as funny as these guys say it.

 

Eight days a week

During the PLM implementation, and for sure after one of the several rollouts, changes may appear. And normal work still needs to be done, sometimes in a different way. There will never be enough time to do everything perfectly and fast, and it feels like you need more days in the week. When you are stressed, swing with these guys.

 

We are the champions

Then when the PLM project has been implemented successfully, there is a feeling of relief. It has been a tough time for the company and the PLM team. This should be the moment for the management to get everyone together in the stadium as an important change for the company´s future has been realized. Sing all together.

 

… But the times they are a-changing

Although a moment of relief is deserved, PLM implementations never end. The current infrastructure could be improved continuously due to better business understanding. However, globalization and digitalization will create new business challenges and opportunities at an extraordinarily fast pace. So, be aware and sing along with Bob.

 

BONUS

Time to close the 2016 book and look forward to next year’s activities. I wish all my readers happy holidays and a healthy, successful new year with a lot of dialogue, and no more one-liners.

 

See you in 2017 !!!!

Recently, I have written about classical PLM (document-driven and sequential) and modern PLM (data-driven and iterative) as part of the upcoming digital transformation that companies will have to go through to be fit for the future. Some strategic consultancy companies, like Accenture, talk about Digital PLM when referring to a PLM environment supporting the digital enterprise.

 

From classical PLM to Digital PLM?

The challenge for all companies is to transform their businesses to become customer-centric and to find a transformation path from the old legacy PLM environment towards the new digital environment. Companies want to do this in an evolutionary mode. However, my current observation is that the pace of an evolutionary approach is too slow compared to what is happening in their market. This time the change is happening faster than before.

A Big Bang approach towards the new environment seems to be a big risk. History has taught us that this is very painful and costly, so it is to be avoided too. What remains is a kind of bimodal approach, which I introduced in my recent blog posts (Best Practices or Next Practices), although one of my respected readers and commenters, Ed Lopategui, mentioned in his comment (here) that bimodal is another word for coexistence. He is not optimistic about this approach either.

So, what remains is disruption?

Disruption is a popular word, and my blog buddy Oleg Shilovitsky recently dived into that topic again with his post: How to displace CAD and PLM industry incumbents. An interesting post about disruption and disruption patterns. My attention was caught by the words: digital infrastructure.
I quote:

How it might happen? Here is one potential answer – digital infrastructure. Existing software is limited to CAD files stored on a desktop and collaboration technologies developed 15-20 years ago using relational database and client-server architecture.

Digital Infrastructure

As I mentioned, the words Digital Infrastructure triggered me to write this post. At this moment, I see companies marketing their Digital Transformation story in a slick way, supported by all the modern buzzwords like customer-centric, virtual twin and data-driven. As a PLM geek, you would imagine that they have already made the jump from old document-driven PLM towards modern digital PLM. So what does a modern digital PLM environment look like?

The reality behind this slick marketing curtain, however, is that the old legacy processes are still there, with engineers producing drawings as output for manufacturing, because drawings are still the legal and controlled information carriers. There is no digital infrastructure behind the scenes. So, what would you expect behind the scenes?

Model-Based Definition as part of the digital infrastructure

Crucial to being ready for a digital infrastructure is transforming your company's product development process from a file-based process, where drawings are leading, towards a model-based enterprise. The model needs to be the leading authority (single source of truth) for PMI (Product Manufacturing Information) and potentially for all upfront engineering activities. In that case, we call it Model-Based Systems Engineering, sometimes called RFLP (Requirements-Functional-Logical-Product), where even the product can be analyzed and simulated directly based on the model.

A file-based process is not part of a digital infrastructure or model-based enterprise architecture. File-based processes force the company to have multiple instances and representations of the same data in different formats, creating an overhead of work to keep the data correct and of good quality, which is never 100 % secure. A digital infrastructure works with connected data in context.
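To make the contrast concrete, here is a small, simplified sketch (my own, not tied to any particular PLM tool): in the file-based world the same facts are copied into several documents that drift apart; in the data-driven world the part and its PMI are stored once, and the different views are derived from that single source. All part data below is invented.

```python
# File-based: the same facts live in several disconnected copies that can drift apart.
drawing_title_block = {"part": "BRK-100", "material": "AlMg3", "tolerance_mm": 0.1}
excel_row = {"Part": "BRK-100", "Material": "AlMg3 "}    # trailing space nobody notices
erp_record = {"part": "BRK-100", "material": "AL-MG-3"}  # yet another spelling

# Data-driven: one part record holds the PMI, and views are derived from it.
part = {"id": "BRK-100", "material": "AlMg3", "tolerance_mm": 0.1}

def manufacturing_view(p):
    return f"{p['id']}: material {p['material']}, general tolerance +/- {p['tolerance_mm']} mm"

def purchasing_view(p):
    return {"part": p["id"], "material": p["material"]}

print(manufacturing_view(part))
print(purchasing_view(part))
```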

Therefore, if your company is still relying on drawings and you want to be ready for the future, a first step towards a digital infrastructure would be fixing your current processes to become model-based. Some good introductions can be found at ENGINEERING.com – search for MBD and you will find them.

Moving to Model-Based is already a challenging transformation inside your company, before even touching the challenge of moving towards a full digital enterprise through an evolutionary, disruptive or bimodal approach – let the leading companies show the way.

Conclusion

Companies should consider and investigate how to use a Model-Based Engineering approach as a first step to becoming lean and fit for a digital future. The challenge will be different depending on the type of industry and product.
I am curious to learn from my readers where they are on the path to a digital enterprise.

In my earlier post The weekend after PDT Europe I wrote about the first day of this interesting conference. We ended that day with some food for thought related to a bimodal PLM approach. Now I will take you through the highlights of day 2.

Interoperability and openness in the air (aerospace)

I believe Airbus and Boeing are among the most challenged companies when it comes to PLM. They have to cope with their stakeholders and the massive number of suppliers involved, constrained by a strong focus on safety and quality. And as airplanes have a long lifetime, the need to keep data accessible and available for over 75 years is a massive challenge. The morning was opened by presentations from Anders Romare (Airbus) and Brian Chiesi (Boeing), who confirmed they could switch the presenter's role between them as the situations at Airbus and Boeing are so alike.

Anders Romare started with a presentation called: Digital Transformation through an e2e PLM backbone, where he explained the concept of extracting data from the various silo systems in the company (CRM, PLM, MES, ERP) to make data available across the enterprise. In particular, in their business transformation towards digital capabilities, Airbus needed and created a new architecture on top of the existing business systems, focusing on data ("Data is the new oil").

To realize a data-driven environment, Airbus extracts and normalizes data from their business systems into a data lake with integrated data, on top of which various apps can run to offer digital services to existing and new stakeholders on any type of device. The data-driven environment allows people to have information in context, available almost in real time, to make the right decisions. Currently, these apps run on top of this data layer.
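Purely as an illustration (my own sketch, not Airbus' actual architecture), the extract-and-normalize step could look like this: each silo system exposes records in its own shape, and a small mapping layer converts them into one common form before they land in the data lake. The example records and field names are invented.

```python
# Invented example records from two silo systems, each with its own field names.
plm_items = [{"ItemNo": "A-100", "Rev": "B", "Descr": "Hinge"}]
erp_materials = [{"MatNo": "A-100", "MatText": "Hinge", "Plant": "0001"}]

def from_plm(rec):
    return {"part_id": rec["ItemNo"], "revision": rec["Rev"],
            "description": rec["Descr"], "source": "PLM"}

def from_erp(rec):
    return {"part_id": rec["MatNo"], "revision": None,
            "description": rec["MatText"], "source": "ERP"}

# The "data lake": normalized records from all sources in one common shape.
data_lake = [from_plm(r) for r in plm_items] + [from_erp(r) for r in erp_materials]

# An app on top: everything known about one part, in context, regardless of source.
def part_context(part_id):
    return [r for r in data_lake if r["part_id"] == part_id]

print(part_context("A-100"))
```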

AirbusPDT2016

Now imagine that information captured by these apps could be stored or directed back into the original architecture supporting the standard processes. This would be a real example of the bimodal approach discussed on day 1. As a closing remark, Anders also stated that three years ago digital transformation was not really visible at Airbus; now it is a must.

Next, Brian Chiesi from Boeing talked about Data Standards: A strategic lever for Boeing Commercial Airplanes. Brian described the complex landscape at Boeing: 2,500 applications / 5,000 servers / 900 changes annually (3 per day) impacting 40,000 users. There is a lot of data replication because many systems need their own proprietary format. Brian estimated that where 12 copies exist now, in the ideal world 2 or 3 will do. Brian presented a future concept similar to Airbus', where the traditional business systems (Systems Engineering, PLM, MRP, ERP, MES) are all connected through a service backbone. This new architecture is needed to address modern technology capabilities (social / mobile / analytics / cloud / IoT / automation / …).

BoeingArchitecture

An interesting part of this architecture is that Boeing aims to exchange data with the outside world (customers / regulatory / supply chain / analytics / manufacturing) through industry-standard interfaces to have an optimal flow of information. Standardization would lead to a reduction of customized applications, minimize the costs of integration and migration, break the obsolescence cycle and enable future technologies. Brian knows that when companies pull for standards, vendors will deliver. Boeing will be pushing for standards in their contracts and will actively work together with five major Aerospace & Defense companies to define required PLM capabilities and have a unified voice towards PLM solution providers.

My conclusion on these two aerospace giants is that they express the need to adapt and move to modern digital businesses, no longer following the linear approach of the classic airplane programs. Incremental innovation in various domains is the future. The existing systems need to be there to support their current fleet for many, many years to come. The new data-driven layer needs to be connected through normalization and standardization of data. For the future, a focus on standards is a must.

Simon Floyd from Microsoft talked about The Impact of Digital Transformation in the Manufacturing Enterprise, where he walked us through Digital Transformation, IoT, and analytics in the product lifecycle, clarified by examples from the Rolls-Royce turbine engine. A good and compelling story which could be used by any vendor explaining digital transformation and its relation to IoT. Next, Simon walked through the Microsoft portfolio and solution components to support a modern digital enterprise based on various platform services. At the end, Simon articulated how, for example, ShareAspace, based on Microsoft infrastructure and technology, can be an interface between various PLM environments throughout the product lifecycle.

Simon's presentation was followed by a panel discussion with the theme: when are history and legacy an asset and a barrier to entry, and when do they become a burden and an invitation to future competitors?
Marc Halpern (Gartner) mentioned here again the bimodal thinking. Aras is bimodal. The classical PLM vendors running in mode 1 will not change radically, and the new vendors, the mode 2 types, will need time to create credibility. Other companies mentioned here, like PropelPLM (PLM on the Salesforce platform) or Onshape, will battle the next five years to become significant and might disrupt.

Simon Floyd (Microsoft) mentioned that, in order to keep innovation within Microsoft, they allow startups within the company, with no constraints towards Microsoft in the beginning. This keeps disruption inside your company instead of being disrupted from the outside. Another point mentioned was that Tesla did not want to wait until COTS software would be available for their product development and support platform; therefore they develop parts of it themselves. Are we going back to the early days of IT?

An interesting trend, I believe, as long as the building blocks for such a solution architecture are based on open (standardized?) services.

Data Quality

After lunch, the conference was split into three streams, where I participated in the "Creating and managing information quality" stream. As I discussed in my presentation on day 1, there is a need for accurate data, starting as soon as possible, as the future of our businesses will run on data, as we learned from all speakers (and this is not a secret – still many companies do not act).

In the context of data quality, Jean Brange from Boost presented the ISO 8000 framework for data and information quality management. This standard is under development and will help companies to address their digital needs. The challenge of data quality is that we need to store data with the right syntax and semantics to be usable, and in addition it needs to be pragmatic: what are we going to store that will have value? And then there is the challenge of evaluating the content. Empty fields can be discovered; however, how do you qualify the quality of a field that has a value? ISO 8000 is a framework, like ISO 9000 (product quality), that allows companies to work in a methodological way towards acceptable and needed data quality.
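The syntax/semantics/pragmatics distinction can be illustrated with a few lines of Python (my own simplified sketch, not the ISO 8000 specification): a field can be present, look reasonable, and still be unusable if sender and receiver have not agreed on its meaning. The record and rules are invented.

```python
import re

record = {"part_id": "A-100", "mass": "1,2"}   # invented example record

# Syntax: is the value written in the agreed format at all?
def mass_has_valid_syntax(value):
    return re.fullmatch(r"\d+(\.\d+)?", value) is not None

# Semantics: do sender and receiver agree on what the value means
# (unit, decimal separator)? Without that agreement float("1,2") simply fails.
def mass_in_kg(value):
    return float(value.replace(",", "."))

print(mass_has_valid_syntax(record["mass"]))   # False: comma instead of dot
print(mass_in_kg(record["mass"]))              # 1.2 once the semantic rule is agreed

# Pragmatics: only store and check fields that will actually be used downstream;
# a mass matters for logistics, an unused free-text field is not worth policing.
```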

iso8000

Magnus Färneland from Eurostep addressed the topic of data quality and the foundation for automation, based on the latest developments done by Eurostep on top of their already rich PLCS data model. The PLCS data model is impressive as it already supports all facets of the product lifecycle, from design through development and operations. By introducing soft typing, Eurostep allows a more detailed tuning of the data model to ensure configuration management: at which stage of the lifecycle is certain information required (and does it become mandatory)? Consistent data quality, enforced through business process logic.
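A minimal sketch of that last idea, assuming nothing about Eurostep's actual implementation: which attributes are mandatory depends on the lifecycle state of an object, and the check is enforced by process logic instead of being hard-coded in the data model. The states, attributes and example item are invented.

```python
# Invented rule set: attributes that become mandatory per lifecycle state.
MANDATORY_PER_STATE = {
    "concept":    {"description"},
    "design":     {"description", "material"},
    "production": {"description", "material", "supplier", "unit_cost"},
}

def missing_attributes(item, state):
    """Attributes required in this lifecycle state that are empty or absent."""
    return {a for a in MANDATORY_PER_STATE[state] if not item.get(a)}

item = {"id": "A-100", "description": "Hinge", "material": "AlMg3"}

print(missing_attributes(item, "design"))      # set(): fine for the design state
print(missing_attributes(item, "production"))  # {'supplier', 'unit_cost'}: promotion blocked
```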

The conference ended with Marc Halpern making a plea to Take Control of Your Product Data or Lose Control of Your Revenue, where Marc painted the future (horror) scenario that, due to digital transformation, the real "big fish" will be the digital business ecosystem owners, and that once you are locked in with a vendor, these vendors can raise their prices to save their own business without any respect for your company's business model. Marc gave some examples where vendors raised prices in the subscription model by up to 40 %. Therefore, even when you are just closing a new agreement with a vendor, you should negotiate a price guarantee and a certain bandwidth for increases. And on top of that, you should prepare an exit strategy: prepare data for migration and have backups using standards. Marc gave some examples of billions in extra costs related to data quality and data loss. It can hurt!! Finally, Marc ended with recommendations for master data management and quality as a necessary company strategy.

GartnerSupscriptionModels

As closing speaker, Gerard Litjens from CIMdata gave a very comprehensive overview of The Internet of Things – What does it mean for PLM?, based on CIMdata's vision. As all vendors in this space explain the relation between IoT and PLM differently, it was a good presentation to be used as a basis for the discussion: how does IoT influence our PLM landscape? Because of the length of this blog post, I will not go further into these details – it is worth obtaining this overview.

Concluding: PDT2016 is a crucial PLM conference for people who are interested in the details of PLM. Other conferences might address high-level customer stories; at PDT2016 it is about the details and sharing the advantages of using standards. Standards are crucial for a data-driven environment where business platforms, with all their constraints, will be the future. And I saw that more and more companies are working with standards in a pragmatic manner, observing the benefits and pushing for more data standards – it is not just theory.

See you next year ?

I am just back from the annual PDT conference (12th edition), this year hosted in Paris from 9 to 10 November, co-located with CIMdata's PLM Road Map 2016 for Aerospace & Defense. The PDT conference, organized by Eurostep and CIMdata, is a relatively small conference with a little over a hundred attendees. The attractiveness of this conference is that the group of dedicated participants is very interactive, sharing honest opinions and situations with each other, sometimes going very deep into the details needed to get the full picture. The theme of the conference was: "Investing for the future while managing product data legacy and obsolescence." Here are some of my impressions from these days, giving you food for thought to join next year.

Setting the scene

Almost traditionally, Peter Bilello (CIMdata) started the conference, followed by Marc Halpern (Gartner). Together, their two presentations had an excellent storyline.

Peter Bilello started and discussed Issues and Remedies for PLM Obsolescence. It was not the first time Peter addressed PLM obsolescence. It is a topic many early PLM adopters are facing, and in a way the imminent obsolescence of their current environments blocks them from taking advantage of the new technologies and capabilities current PLM vendors offer. Having learned from the past, CIMdata provides a PLM Obsolescence Management model, which should be on every company's agenda, in the same way as data quality (which I will address later). Being proactive about obsolescence can prevent critical situations and high costs. From the obsolescence theme, Peter looked forward to the future and the value product innovation platforms can offer, given the requirement that data should be able to flow through the organization, connecting to other platforms and applications, increasing the need to adhere to and push for standards.

Marc Halpern followed with his presentation, titled: More custom products demand new IT strategies and new PLM applications, where he focused on the new processes and methodology needed for future businesses with a high focus on customer-specific deliveries, speed, and automation. Automation is always crucial to reducing production costs. In this delivery process, 3D printing could bring benefits, and Marc shared the pluses and minuses of 3D printing. Finally, automating a customer-specific order, when possible, requires a different IT architecture, depicted by Marc. After proposing a roadmap for customizable products, Marc shared some examples of ROI benefits reported by successful transformation projects. Impressive!!

Gartner-ROI-samples

My summary of these two sessions is that both CIMdata and Gartner confirm the challenges companies face in changing their old legacy processes and PLM environments, which support the past, while moving to more customer-driven processes and modern data-driven PLM functionality. This process is not just an IT or business change; it will also be a cultural change.

JT / STEP AP242 / PLCS

Next, we had three sessions related to standards. Alfred Katzenbach told the success story of JT: the investment done to get this standard approved and performing, based on an active community getting the most out of JT, beyond its initial purpose of viewing and exchanging data in a neutral format. Jean-Yves Delanaunay explained that in Airbus Operations the STEP AP242 definition, part of the STEP standards suite, is used as the core standard for 3D Model-Based Definition (MBD) exchange and as the cornerstone for Long Term Archiving and Retrieval of Aerospace & Defense 3D MBD.

There seems to be some rivalry between JT and STEP AP242 viewing capabilities, which goes beyond my understanding as I am not an expert in this field. Nigel Shaw ended the morning session positioning PLCS as a standard for the interoperability of information along the whole lifecycle of a product. Having a standardized data model, as Nigel showed, would be a good common approach for PLM vendors to converge towards a more interoperable standard.

PLCS concept model

My summary of standards is that there is a lot of thinking, evaluation, and testing done by an extensive community of committed people. It will be hard for a company to define a better foundation for a standard in their business domain. Vendors are focusing on performance inside their technology offering and therefore will never push for standards (unless you use their products as a standard). A force for adhering to standards should come from the user community.

Using standards

After lunch, we had three end-user stories from:

  • Eric Delaporte (Renault Group) talked about their NewPDM project and the usage of standards, mainly for exchanges. Two interesting observations: first, Eric talks about New PDM – note the usage of the words New (when does New become regular?) and PDM (not PLM?); and secondly, as a user of standards he does not care about the JT/AP242 debate and uses both standards where applicable and performing.
  • Sebastien Olivier (France Ministry of Defense) gave a bi-annual update on their PLCS journey, used in two projects, Pencil (standardized exchange platform and centralized source of logistical information) and MAPS (managing procurement contracts for buying in-service support services), and the status of their S3000L implementation (international procedure for Logistic Support Analysis). A presentation for the real in-crowd of this domain.
  • Juha Rautjarvi discussed how efficient use of knowledge for safety and security could be maintained and enhanced through collaboration. Here Juha talked about the Body of Knowledge, which should be available to all stakeholders in the context of security and safety. And like a physical product, this Body of Knowledge goes through a lifecycle, continuously adapting to what potentially arises from the outside world.

My conclusion on this part was that if you are not really into these standards on a day-to-day basis (and I am not), it is hard to pick up the details. Still, the higher-level thought processes behind these standards approaches allow you to see the benefits and impact of using standards, which is not the same as selecting a tool. It is a strategic choice.

Modular / Bimodular / not sexy ?

Jakob Asell from Modular Management gave an overview of how modularity can connect the worlds of sales, engineering, and manufacturing by adding a modular structure as a governing structure over the various structures used by each discipline. This product architecture can be used for product planning and provides end-to-end connectivity of information. Modular Management is assisting companies in moving towards this approach.

Next, my presentation, titled The importance of accurate data. Act now!, addressed the switch from classical, linear, document-driven PLM towards a modern, more incremental and data-driven PLM approach. Here I explained the disadvantages of the old evolutionary approach (impossible – too slow/too costly) and an alternative method, inspired by Gartner's bimodal IT approach (read my blog post: Best Practices or Next Practices). No matter which option you choose, correct and quality data is the oil for the future, so companies should treat the flow of data as a health issue for the future.

The day was closed with a large panel, where the panelists first jumped on the topic of bimodal (bipolar?? / multimodal??), talking about mode 1 (the strategic approach) and mode 2 (the tactical and fast approach, based on Gartner's definition). It was clear that the majority of the panel was in mode 1 mode, which fluently led to the discussion of the usage of standards (and PLM) as not being attractive to the younger generations (not sexy), besides the conclusion that it takes time to understand the whole picture, see the valuable benefits a standard can bring, and join this enthusiasm.

panel-day 1

Conclusion

I realize that this post is already too long according to blogging guidelines. Therefore, I will tell more about day 2 of the conference next week, with Airbus going bimodal and more.

Stay tuned for next week !

Summer holidays are upcoming. Time to look back and reflect on what has happened so far. As a strong believer that a more data-driven PLM is required to support modern customer-focused business models, I have tried to explain this message to many individuals around Europe, with mixed success.

Compared to a year ago, the notion of a new PLM approach, digital and data-driven, has been resonating more and more. Two years ago, I presented a session at the Product Innovation conference in Berlin with the title: Did you notice PLM is changing? The feedback at that time was that it was a beautiful story, probably happening in the far future. Last year in Düsseldorf (my review here), the digital trend(s) became clearer. And this year in Munich (my review here), people mentioned that the upcoming changes were unavoidable, in particular in relation to IoT and how it could drastically change existing business models.

For me, the enjoyable thing about the PI conferences is that they give a snapshot of what people care about most in the context of their product development and, in particular, PLM. When you are busy in day-to-day business, everything seems to move forward slowly. However, looking back, I must admit the pace of change has increased dramatically; it is not the same pace as it was five or ten years ago.

Something is happening, and it happens fast !

And here I want to encourage my readers to step back for a moment from day-to-day business and look around at what is happening, in business and in the world. It is all related!

Jobs are disappearing in the middle class due to automation, and direct connectivity with customers creates new types of businesses. Old jobs will never come back, not even when you close your borders. And this is what worries many societies. This global, connected world has created a new way of doing business, challenging old and traditional businesses (and people) as their models become obsolete.

The primary reaction is to try to shut the discomfort out. Let's act as if it never happened and just switch back to the good life of the previous century or centuries.

To be honest, it is all about the discomfort this new world brings us. This new world requires new skills, in particular the personal skills to develop continuously, learn and adapt for the future. Closing your mind and thoughts to the future, by hanging on to the past, only takes you further away from the future and creates more discomfort.

Are you talking PLM ?

Yes, the previous section was very generic; however, it is also valid for PLM. Modern enterprises are changing the way they are going to do business, and PLM is a crucial part of that total picture. Jeff Immelt, CEO of GE, explains in a discussion with Microsoft's CEO Satya Nadella what it takes for an organization to be ready for the future. He does not talk about PLM; he talks about the need for people to be different in attitude and responsibilities – it is a business transformation, people first. Have a look here:

And although Jeff does not mention PLM, the changing digital business paradigm will affect all classical systems: PLM, ERP, CRM. Your PLM vision and plans should anticipate such a business transformation. Implementing PLM now in the same way it has been done for the past 10 years, with the processes from the past in mind, might make your company even more rigid than before. See my recent blog post: The value of PLM at the C-level.

Take this thought into consideration during your holidays. Can you be comfortable in this world by keeping on hanging on to the past, or should you consider an uncomfortable but crucial change in the way your company will remain (flexible) in future business?

My holiday this year will be in my ultimate comfort zone at the beach. Reading books, no internet, discussing with friends what moves us. Two weeks to charge the batteries for this exciting, rapidly changing world of business (and PLM). I look forward to coming back with some of my findings in my upcoming blog posts.


Getting in and out of your comfort zone happens everywhere. Read this HBR article with a lot of similarities: If You’re Not Outside Your Comfort Zone, You Won’t Learn Anything

See you soon in the PLM (dis)comfort zone

If you have followed my blog in recent years, you might have discovered my passion for a modern, data-driven approach to PLM. (Read about it here: The difference between files and data-driven – a tutorial (3 posts).)

The data-driven approach will be the foundation for product development and innovation in a digital enterprise. Digital enterprises can outperform traditional businesses in such a way that within five to ten years, non-digital businesses will be considered dinosaurs (if they still exist).

In particular, a digital enterprise operates in an agile, iterative way with the customer continuously in focus, whereas traditional enterprises often work in a more linear way, pushing their products to the market (if the market is still waiting for these commodities).

Read more about this topic here: From a linear world to fast and circular?

When and how to become a digital enterprise?

It is (almost) inevitable that your company will transform into a digital enterprise too at some point, either driven by a vision to remain ahead of the competition or as a final effort to stay in business when competing against agile digital competitors is killing your market share.

One characteristic of a digital enterprise is that all benefits rely on accurate data flowing through the organization and its ecosystem. And it does not matter whether the data resides in a single system/platform, as the major vendors are promoting, or whether the data is federated and consumed by the right person with the right role. I am a believer in the latter concept, and I see current startups trying to create the momentum to achieve such an infrastructure. Have a look at my blog buddy's company OpenBOM and Oleg's recent article: The challenges of distributed BOM handover in data-driven manufacturing.

No matter what you believe at this stage, the future is about accurate data. I recently bumped into some issues related to actual data again. Some examples:

A change in objectives is needed!

One of the companies I am working with was only focusing on individual outputs, either in their drawings (yes, the 3D model was not leading yet) and/or in their Excel files (sounds familiar?). When we started implementing a PLM backbone, it became apparent during the discovery phase that we could not use any advanced search tools for quick wins by aggregating data to better understand the information we discovered. Drawings and models did not contain any (file) properties. Therefore, the only way to understand information was by knowing its (file) name and potentially its directory. Of course, the same file could be in multiple directories, and as there were almost no properties, how do you know what belongs to what item?
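In such a discovery phase, a simple audit already tells the story. Below is a hedged sketch of the idea (the CSV file name, column names and example numbers are all invented): assuming the vault can be exported to a CSV with one row per file and its properties, count how many files miss the properties you would need for search and aggregation.

```python
import csv
from collections import Counter

# Assumption: the CAD vault was exported to a CSV with one row per file and
# one column per (file) property. File name and property names are invented.
REQUIRED = ["part_number", "description", "project"]

def audit(csv_path):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts["files"] += 1
            for prop in REQUIRED:
                if not row.get(prop, "").strip():
                    counts[f"missing {prop}"] += 1
    return counts

# Hypothetical usage: audit("cad_vault_export.csv") might report something like
# Counter({'files': 4211, 'missing project': 3980, 'missing description': 2714, ...})
```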

When discussing the future of PLM with such companies, you always hear people (mainly engineers) say:

“we are not administrators, we need to get our job done.”

This shortsighted statement is often supported by management, and then you get stuck in the past.

It is time for management and engineers to realize that their future is also based on a data-driven approach. Therefore, adding data to a drawing or CAD model (or, in the case of PLM, part/process characteristics) becomes the job of an engineer. We have to redefine roles, as in a digital enterprise there is nobody to fix data downstream. People fixing data issues are too expensive.

I do not want to go digital

Most companies at this time are not ready for a digital enterprise yet. The changing paradigm is relatively new. Switching to a modern approach now cannot be done, either because their culture is still based on the previous century or because they are just in the middle of a standard PLM process, just learning to share files within their (global) organization. These companies might adopt the attitude:

"I do not want to go digital"

I believe this is ostrich behavior, like saying:

“I want all information printed on paper on my desk so I can work in comfort (and keep my job).”

History shows that hanging on to the past is killing for companies. Those companies that did not invest in the first electronic wave are probably out of business (unless they never had competition). The same goes for digital. Potentially ten years from now, it will no longer be affordable to work in a traditional way, as labor cost and the speed of information flowing through an organization are going to be crucial KPIs to stay in business.

The compromise

 

As Dutch, we are always seeking compromises. It helped our country become a leading trading nation, and thanks to the compromises we struggle less with strikes compared to our neighboring countries. Therefore, my proposal for those who do not like digital at this stage: add just a little digital workload to your day-to-day business, preferably stimulated and motivated by your management and promoted as a company initiative. By adding as many relevant properties and as much context as possible to your work, you will be working on the digital future of your company. When the time comes to become digital, it will be much easier to connect your old legacy information to the new digital platform, speeding up the business transformation.

And of course there will be tools

If you observe what is happening in the PLM domain, you will see more and more tools for data discovery and data cleansing appear on the market. Dick Bourke wrote an introductory article about this topic at the end of last year at Engineering.com: Is-Suspect-Product-Data-the-Elephant-in-the-Search-and-Discover-Room? Have a read to get interested.

And there are rewards

Once you have more accurate data, you can (see the sketch after this list):

  • Find it (saving search time)
  • Create reports through automation (saving processing time)
  • Apply rules (saving validation work & time or processing time)
  • Create analytics (predict the future – priceless)
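As a small sketch of those four rewards (my own example with invented part data, not from any customer project), once parts carry accurate attributes, finding, reporting and rule-checking become one-liners:

```python
# Invented data set: parts that carry accurate, structured attributes.
parts = [
    {"id": "A-100", "status": "released", "mass_kg": 1.05, "supplier": "ACME"},
    {"id": "A-101", "status": "obsolete", "mass_kg": 0.80, "supplier": "ACME"},
    {"id": "B-200", "status": "released", "mass_kg": None, "supplier": "Bolt Inc"},
]

# Find it: all released parts from one supplier.
released_acme = [p["id"] for p in parts
                 if p["status"] == "released" and p["supplier"] == "ACME"]

# Report through automation: total known mass of released parts.
total_mass = sum(p["mass_kg"] for p in parts
                 if p["status"] == "released" and p["mass_kg"] is not None)

# Apply a rule: a released part must have a mass.
violations = [p["id"] for p in parts
              if p["status"] == "released" and p["mass_kg"] is None]

print(released_acme, total_mass, violations)   # ['A-100'] 1.05 ['B-200']
```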

 

Conclusion

We are in a transition phase in the way PLM is implemented. What is clear, no matter which stage you are in, is that accurate data is going to be crucial for the future. Use this awareness to keep your company in business.

Finally, I have time to share my PLM experiences with you in this blog. The past months have been very busy as I moved to a new house, and I wanted to do and control a lot of activities myself. Restructuring your house in an agile way is not easy. Luckily there was a vision of how the house should look. Otherwise, the "agile" approach would have become an approach of too many fixes. Costly, and probably typical for many old construction projects.

In the end, I realized the beauty of IKEA's modular design and experienced the variety of high-quality products from BLUM (an impressive company in Austria I worked with).

In parallel, I have been involved in some PLM discussions where, in all cases, the connection with the real C-level was an issue. And believe it or not, my blog buddy Oleg Shilovitsky just published a post: Hard to sell PLM? Because nobody gives a SH*T about PLM software. Oleg really starts from the basics, explaining that you do not sell PLM; you sell a business outcome. And in larger enterprises, I believe what you sell at this time is the ability to do a business transformation, as business is becoming digital, with the customer in the center. And this is the challenge I want to discuss in this post.

 

The value of PLM at the C-level

Believe it or not, it is easier to implement PLM (in general) than to explain to a CEO why a company needs modern PLM. A nice one-liner to close this post with; however, let me first explain what I mean by this statement and perhaps show the reasons why PLM does not seem to be so attractive at the C-level. I do not want to offend any particular PLM company, consultancy firm or implementer; therefore, allow me to stay at a neutral level.

The C-level time challenge

First, let's imagine the situation at C-level. Recently I heard an excellent anecdote about people at C-level. When they were kids, they were probably the brightest, able to process and digest a lot of information, making their (school) careers a success. Later, arriving in a business environment, they were probably the ones who could make a difference in their job and for that reason climbed the career ladder fast to reach a C-level position. Then, arriving at that level, they become too busy to dive really deep into the details.

Everyone around them communicates in "elevator speeches," and information to read must be extremely condensed and easy to understand. As if people at C-level have no brains and should be informed like small kids.

I have seen groups of people working for weeks on preparing the messages for the CEO. Every word is twisted a hundred times – would he or she understand it? I believe the best people at C-level have brains, and they would understand the importance of PLM when someone explains it. However, it requires time if it does not come from your comfort zone.

Who explains the strategic value of PLM

There are a lot of strategic advisory companies that have access to the boardroom, and we can divide them into two groups: the ones that focus on strategy independent of any particular solution, and the ones that concentrate on a strategy while guaranteeing their implementation teams are ready to deploy the solution. Let's analyze both options and their advice:

Independent of a particular solution

When a company is looking for help from a strategic consultancy firm, you know part of the answer upfront, as every consultancy firm has a preferred sweet spot based on its principal consultant(s). As a PLM consultant, I would probably imagine the best PLM approach for your company, not being an expert in financials or demographic trends. If the advisory company has a background in accountancy, they will focus their advice on financials. If the company has a background in IT, they will focus their advice on an infrastructure concept that saves so much money.

A modern digital enterprise is now the trend, where digital allows the company to connect and interact with the customer and therefore react faster to market needs or opportunities. IoT is one of the big buzzwords here. Some companies grasp the concept of being customer-centric (the future) and adapt their delivery model to that, not realizing that the entire organization, including their product definition process, should be changing too. You cannot push products to the market in the old linear way while meanwhile expecting modern agile work processes.

Most of the independent strategic consultants will not push for a broader scope, as it is out of their comfort zone. Think for a moment: who are the best strategic advisors that can talk about the product definition process, the delivery process and products in operation and service? I would be happy if you gave me their names in the comments, with proof points.

Related to a particular solution

When you connect with a strategic advisory company with an extensive practice in XXX or YYY, you can be sure the result will be strategic advice containing XXX or YYY. The best approach with ZZZ will not come to the table, as consultancy firms will have no intention of investigating in that direction for your company. They will tell you: "With XXX we have successfully transformed (many) other companies like yours, so choose this path with the lowest risk."

And this is the part that concerns me the most at this time. Business is changing rapidly, and therefore PLM should be changing too. If not, that would be a strange situation. Read about the PLM identity crisis here and here.

The solution is at C-level (conclusion)

I believe that in the end the future of your company will depend on your DNA, your CEO and the C-level supporting the CEO. Consultancy firms can only share their opinion from their point of view and with their own understanding in mind.

If you have a risk-averse management, you might be at risk.
Doing nothing or following the majority will not bring more competitive advantage.

The awareness that business is global and changing rapidly should be on every company’s agenda.

Change is always an opportunity to get better; still, no outsider can tell you what is best. Take control and leadership. For me, it is clear that the product development and delivery process should be part of this strategy. Call it PLM or something different, I do not care. But do not focus on efficiency and ROI; focus on being able to be different from the majority. Apple makes mobile phones; Nespresso makes coffee, etc.

Think, and use extremely high elevators to talk with your C-level!

Your thoughts?

In 1999, I started my company TacIT in order to focus on knowledge management. The name TacIT came from the term tacit knowledge: the knowledge an expert has, combining knowledge from different domains and making the right decision based on his or her experience/intuition. Tacit knowledge is the opposite of explicit knowledge, which you can define in rules. In particular, large companies are always looking for ways to capture and share knowledge to raise the tacit knowledge of their employees.

When I analyzed knowledge management in 1999, many businesses thought it was just about installing an intranet. At that time, it became fashionable to have an internal website where people published their knowledge. Wikipedia had not yet launched. Some people got excited about the intranet capabilities; however, a lot of information remained locked or hidden. What was clear to me at that time was that knowledge management as a bottom-up approach would not work in an organization, for the following reasons:

  • In 1999, knowledge was power, so voluntarily sharing your knowledge was considered to more or less reduce your job security. Others might become as skilled as you. A friend of mine was trying to capture knowledge from experts in his domain, and only people close to retirement were willing to speak with him. Has this attitude changed in the meantime?
  • It takes time to share your knowledge, and in particular for busy experts this is a burden. They want (or need) to move on to the next job and not spend "useless" time describing what they have learned.

My focus on knowledge management disappeared in 2000, as I got dragged into PLM, telling myself that PLM should be a kind of knowledge management too.

No knowledge management in PLM

In theory, the picture representing PLM is a circle, where through iterations organizations learn to improve their products and better understand how their products are perceived and performing in the market. In reality, however, PLM was used as an infrastructure to transfer and share information mainly within engineering disciplines. Each department had its own tools and demands. Most companies have silos for PDM, ERP and Services, and people have no clue about what information exists within the organization. Most of the time, they only know their own system and, even worse, they are the only ones who know where their data is stored (or hidden, when you talk to colleagues).

When PLM became more and more accepted as the backbone for product information in a company, there was more attention to a structured manner of knowledge management in the context of lessons learned. Quality systems like ISO 900x provide guidance for processes of quality improvement. Various industries have their own quality methodology: APQP, 8D, CAPA, all to ensure quality gets improved in a learning organization. 8D and CAPA are examples of issue management, which is a must-do for every PLM implementation. It is the first step in sharing and discovering commonalities and trends related to your product, your processes and your customers. When issues are solved by email and phone calls, the content and lessons learned often remain hidden from the rest of the organization.
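As a minimal illustration of why structured issue management pays off (a sketch under my own assumptions, with hypothetical record names and fields), once issues are captured as data instead of living in mailboxes, discovering commonalities becomes a simple grouping exercise:

```python
from collections import Counter

# Hypothetical issue records as they could be captured in a PLM or quality system.
issues = [
    {"id": "ISS-1", "product": "PUMP-A", "root_cause": "seal material"},
    {"id": "ISS-2", "product": "PUMP-A", "root_cause": "seal material"},
    {"id": "ISS-3", "product": "PUMP-B", "root_cause": "assembly torque"},
]

# Count recurring root causes per product; this kind of trend stays hidden
# when the same issues are only discussed by email and phone.
trend = Counter((i["product"], i["root_cause"]) for i in issues)
for (product, cause), count in trend.most_common():
    print(f"{product}: '{cause}' reported {count} time(s)")
```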

Still, storing all information in one PLM system is not what I would call knowledge management. My garbage bin (I had a huge one) also contains all my written notes and thoughts. Would anyone be able to work with my environment? No!

Knowledge Management is an attitude

When organizations really care about knowledge, it should be a top-down guided process. And knowledge is more than storing data in a static manner in a central place. Let's have a look at how modern knowledge management could work:

Structured information

In a PLM system you will find mainly structured information, e.g. Bills of Materials containing Parts, Documents/CAD Models/Drawings describing products, Catalogs with standard parts, Suppliers, and in a modern environment perhaps even the issues related to these information objects and all the change processes that have been performed on parts, products or documents.

This information already becomes valuable if companies spend time upfront on planning and creating the context of the information. This means attributes are important, and so is maintaining the relationships between different types of information. This is the value a PLM system can bring beyond a standard document management system or a parts database: information in the right context brings much more value.

For example, a "where used" of a part not only in the context of a BOM, but also in the context of suppliers, all issues, all ECRs/ECOs, projects or the customers where it is implemented. It could be any relation, starting from any relevant information object.
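To make this concrete, here is a minimal sketch in Python (object names and relationship types are hypothetical, not taken from any particular PLM system) of how a single "where used" lookup could traverse different relationship types starting from one part:

```python
from collections import defaultdict

# A toy relationship store: (source, relation, target) triples.
relations = [
    ("PART-100", "used_in_bom", "PRODUCT-A"),
    ("PART-100", "used_in_bom", "PRODUCT-B"),
    ("PART-100", "supplied_by", "SUPPLIER-ACME"),
    ("ISSUE-42", "refers_to", "PART-100"),
    ("ECO-7", "changes", "PART-100"),
]

# Index the triples in both directions so any object can be the starting point.
by_source, by_target = defaultdict(list), defaultdict(list)
for src, rel, tgt in relations:
    by_source[src].append((rel, tgt))
    by_target[tgt].append((rel, src))

def where_used(obj):
    """Return every object related to 'obj', regardless of relationship type."""
    return ([(rel, tgt) for rel, tgt in by_source[obj]] +
            [(rel, src) for rel, src in by_target[obj]])

# One query shows BOM usage, the supplier, an issue and a change order.
for rel, other in where_used("PART-100"):
    print(f"PART-100 <--{rel}--> {other}")
```

The point is not the code itself but the data model: once relationships are maintained as first-class information, such cross-context queries become cheap.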

Creating rich data in context does not happen without a business change. People creating the relationships and attribute values need to be rewarded for that. Often it is the opposite.

“Do your job as fast as possible and do only what is necessary to deliver now” is often the message from a short-sighted manager who believes that spending time on the “NOW” is more important than spending time on the “FUTURE.”

Managing information so it becomes valuable in the future is an investment that needs to be made in the world of structured data. Once done, a company will discover that this investment has improved its overall performance, as time spent searching will drop (from well over 20% to below 5%) and people are enabled to reuse instead of reinventing things or, worse, re-experiencing issues.

There is more structured information out there.

Of course, companies cannot wait a few years until structured information becomes usable. Most of the time there is already a lot of information in the various systems the company is using: emails, the ERP system, the PDM system and file directories may already contain relevant information. Here modern search-based applications like Exalead or Conweaver (for sure there are more apps in the market; these are the two I am familiar with) will help to collect and connect information in context coming from various systems. This allows users to see information across disciplines and across the lifecycle of a product.
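The underlying idea can be sketched as follows (a simplification under my own assumptions; it does not describe how Exalead or Conweaver work internally): records extracted from different silos are matched on a shared key, here a part number, so that one lookup shows the combined context.

```python
from collections import defaultdict

# Hypothetical extracts from three silos; field names are made up for illustration.
pdm_records = [{"part": "PART-100", "document": "DRW-100-rev-C"}]
erp_records = [{"part": "PART-100", "cost": 12.50, "on_hand": 340}]
emails = [{"subject": "Corrosion problem PART-100", "mentions": ["PART-100"]}]

combined = defaultdict(lambda: {"documents": [], "erp": [], "emails": []})

for rec in pdm_records:
    combined[rec["part"]]["documents"].append(rec["document"])
for rec in erp_records:
    combined[rec["part"]]["erp"].append({"cost": rec["cost"], "on_hand": rec["on_hand"]})
for mail in emails:
    for part in mail["mentions"]:
        combined[part]["emails"].append(mail["subject"])

# One lookup now shows information across disciplines for the same part.
print(combined["PART-100"])
```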

Still, these capabilities are not really knowledge management, as they do not increase the tacit knowledge of a company.

How to collect tacit knowledge?

Static information collection does not contribute to tacit knowledge; it provides some visibility of what exists and might help with explicit knowledge. Tacit knowledge can only be collected through an active process. People in an organization need to be motivated and stimulated to share their story, which is more than just sharing information: it is the reasoning behind certain decisions that helps others to learn. Innovation and learning come from associating information from different domains and from creating opportunity and excitement to share stories. This is what modern companies like Google and Apple do, and it is in some ways the same way information is shared at the coffee machine. This is the primary challenge: instead of an opportunistic approach to knowledge sharing, you want a reliable process of knowledge sharing. The process of capturing and sharing tacit knowledge could be improved by assigning knowledge agents in a company.

(Images: knowledge agent / knowledge flow – courtesy of www.atlassian.com)

A knowledge agent has the responsibility to capture and translate lessons learned. For that reason, a knowledge agent should be somebody who can consolidate information and store and publish it in a manner that the information can be found again in various contexts. The advantage of such a process is that knowledge is captured in a structured manner. In the modern world, a knowledge agent could be a community owner or moderator actively sharing and publishing information. Strangely, knowledge agents are often considered overhead, as their immediate value is not directly visible (as is the case with many PLM activities), although the job of a knowledge agent does not need to be a full-time job.

I found a helpful link related to the knowledge agent role here: 7 knowledge management tips. The information is not in the context of product development; however, it is generic enough to consider.

https://www.atlassian.com/it-service/7-knowledge-management-tips

Conclusion

Many companies talk about PLM and Knowledge Management as if they were equivalent. It should be clear that they are different, although they partly overlap in purpose. It is important to understand that both PLM knowledge and general Knowledge Management will only happen with a top-down strategy and motivation for the organization, either by assigning individual people to become knowledge agents or by establishing common processes for everyone to follow.

I am curious to learn:

  • Is knowledge management on your company's agenda?

and if Yes

  • How is knowledge management implemented?