For those who have followed my blog over the years, it must be clear that I am advocating for the digital enterprise, explaining the benefits of a data-driven approach where possible. In the past month, an old topic came back to my attention with new insights: intelligent Part Numbers, yes or no? Or do we actually mean Product Numbers?

What’s the difference between a Part and a Product?

In a PLM data model, you need support for both Parts and Products, and there is a significant difference between these two types of business objects. A Product is an object facing the outside world, which can be business (B2B) or consumer (B2C) related. Examples of B2C products are the Apple iPhone 8, the famous IKEA Billy, or my Garmin 810 and my Dell OptiPlex 3050 MFXX8. Examples of B2B products are the ABB synchronous motor AMZ 2500 and the FESTO standard cylinder DSBG. Products have a name, and if there are variants of the product, they also have an additional identifier.

A Part represents a physical object that can be purchased or manufactured. A combination of Parts appears in a BOM. In case these Parts are not yet resolved for manufacturing, this BOM might be the Engineering BOM or a generic Manufacturing BOM. In case the Parts are resolved for a specific manufacturing plant, we talk about the MBOM.

I have discussed the relation between Parts and Products in an earlier post, Products, BOMs and Parts, which was a follow-up on my LinkedIn post, the importance of a PLM data model. Although both posts were written more than two years ago, the content is still valid. In the upcoming year, I will address this topic of products further, including software and services moving to solutions / experiences.

Intelligent number for Parts?

As parts are company-internal business objects, I would like to state that if a company is serious about becoming a digital enterprise, parts should have meaningless unique identifiers. Unique identifiers are the link between discipline- or application-specific data sets. See, for example, the image below, where I imagined attribute sets for a part, based on engineering and manufacturing data sets.

Apart from the unique ID, there might be a common set of attributes that will be exposed in every connected system. For example, a description, a classification and one or more status attributes might be needed.

Note 1: A revision number is not needed when you create a new unique ID for every new version of the part. This practice is already common in the electronics industry. In the old mechanical domain, we are used to having revisions, in particular for make-parts, based on Form-Fit-Function rules.

Note 2: The description might be generated automatically based on a concatenation of some key attributes.
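
As a minimal sketch of these two notes, the snippet below (Python, with hypothetical attribute names) creates a part with a meaningless unique ID and a description concatenated from key attributes:

```python
import uuid

def new_part(material: str, length_mm: int, finish: str) -> dict:
    """Create a part with a meaningless unique ID (Note 1) and a
    description generated from key attributes (Note 2)."""
    return {
        "id": uuid.uuid4().hex,  # no revision: a new version simply gets a new ID
        "material": material,
        "length_mm": length_mm,
        "finish": finish,
        # the description is concatenated, never typed by hand
        "description": f"{material} rod, {length_mm} mm, {finish}",
    }

part = new_part("Steel S235", 120, "galvanized")
print(part["id"], "-", part["description"])
```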

Of course, if you are aiming for a full digital enterprise (and I think you should), do not waste time fixing the past. In some situations, I learned that an external consultant had recommended that the company renumber their old meaningful part numbers to the new non-intelligent part numbering scheme. There are two mistakes here. First, renumbering is too costly, as all referenced information would have to be updated. Second, as long as the old part numbers are unique IDs within the enterprise, there is no need to change them. The connectivity of information should not depend on how the unique ID is formatted.

Read more here if you want: The impact of Non-Intelligent Part Numbers

Intelligent numbers for Products?

If the world were 100 % digital and connected, we could work with non-intelligent product numbers. However, this is a stage beyond my current imagination. For products, we will still need a number that customers can refer to when they communicate with their supplier / vendor or service provider. For many high-tech products, the product name and type might be enough. When I talk about the Samsung S5 G900F 16G, the vendor knows which configuration I am referring to. Still, it is important to realize that behind these specifications, different MBOMs might exist due to different manufacturing locations or times.

However, when I refer to the IKEA Billy, there are too many options to describe the right one consistently in words; therefore, you will find a part number on the website, e.g., 002.638.50. This unique ID connects directly to a single sellable configuration. Behind this unique ID, different MBOMs might also exist, for the same reason as with the Samsung phone. The number is a connection to the sales configuration and should not be too complicated, as people need to be able to read and recognize it when they go to the warehouse.
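
To illustrate the difference (a sketch in Python with invented data): one readable product number identifies a single sellable configuration, while several plant-specific MBOMs may exist behind it:

```python
# One sellable configuration, identified by a short, readable product number...
product = {
    "product_number": "002.638.50",  # the IKEA Billy example from above
    "configuration": "Billy bookcase, white, 80x28x202 cm",  # invented description
}

# ...while several MBOMs may exist behind it, e.g. one per manufacturing plant.
mboms = {
    "002.638.50": {
        "plant_A": ["panel-A1", "shelf-S3", "fitting-F9"],
        "plant_B": ["panel-A1", "shelf-S4", "fitting-F9"],  # local alternative part
    },
}

print(product["configuration"], "->", list(mboms["002.638.50"].keys()))
```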

Conclusion

There is a big difference between Product and Part numbers because of the intended scope of these business objects. Parts will soon exist in connected, digital enterprises and therefore no longer need a meaningful number. Products need to be identified by consumers anywhere in the world, who are not yet able or willing to have a digital connection with their vendors. Therefore, short and understandable numbers will remain necessary to support exact communication between consumer and vendor.

When I started working with SmarTeam Corp. in 1999, the company had several product managers, who were responsible for the whole lifecycle of a component or technology. The Product Manager was the person who defined the features for a new release and provided the justification for these new features internally inside R&D. In addition, the Product Manager had the external role of visiting customers, understanding their needs for future releases, and building and explaining a coherent vision to the outside world and the internal organization. The product manager had a central role, connecting all stakeholders.

In the ideal situation, the Product Manager was THE person who could speak in R&D language about the implementation of features, could talk with marketing and documentation teams to explain the value and expected behavior, and could talk with the customer describing the vision, meanwhile verifying the product’s vision and roadmap based on their inputs. All these expected skills make the role of a product manager challenging. If the person is too “techy”, he/she will enjoy working with R&D but have a hard time understanding customer demands. From the other side, if the Product Manager is excellent at picking up customer and market feedback, he/she might not be heard and might not get the expected priorities from R&D. For me, it has always been clear that in the software world a “bi-directional” Product Manager is crucial for success.

Where are the Product Managers in the Manufacturing Industry?

Approximately four years ago, new concepts related to digitalization for PLM became more evident. How could digital continuity connect the various disciplines around the product lifecycle and thereby provide end-to-end visibility and traceability? When speaking of end-to-end visibility, companies mostly talked about the way they designed and delivered products; visibility of what is happening usually stopped after manufacturing. The diagram to the left, showing a typical Build To Order organization, illustrates the classical way of thinking. There is an R&D team working on innovation, typically a few engineers, while most engineers work in Sales Engineering and Manufacturing Preparation to define and deliver a customer-specific order. In theory, once the order is delivered, none of the engineers will be further involved, and it is up to the Service department to react to what is happening in the field.

A classical process in the PLM domain is the New Product Introduction process for companies that deliver products in large volumes to the market, usually configurable to address various customer or pricing segments. This process is mostly linear and is described either as one stream or as two parallel streams. In the latter case, the R&D department develops new concepts and prepares the full product for the market, while the operations department starts in parallel, initially involved in strategic sourcing and later scaling up manufacturing, disconnected from R&D.

I described these two processes because they both illustrate how disconnected the source (R&D / Sales) is from the final result in the field, which in both cases is managed by the service department. A typical story I learned from many manufacturing companies is that, in the end, it is hard to get a full picture of what is happening across the whole lifecycle. How external feedback (market & customers) can influence any stage is undefined. I used the diagram below before companies were even talking about a customer-driven digital transformation. Just understanding end-to-end what is happening with a product along the lifecycle is already a challenge for a company.

Putting the customer at the center

Modern business is about having customer or market involvement in the whole lifecycle of the product. And as products increasingly become a combination of hardware and software, it is the software that allows the manufacturer to provide incremental innovation to their products. However, to innovate in a manner that matches or even exceeds customer demands, information from the outside world needs to travel as fast as possible through an organization. If this is done in isolated systems and documents, the journey will be cumbersome and too slow to allow a company to act fast enough. Here digitization comes in, making information directly available as data elements instead of documents with their own file formats and authoring systems. The ultimate dream is a digital enterprise where data “flows”, advocated already by some manufacturing companies for several years.

In the previous paragraph I talked about the need to have an infrastructure in place for people in an organization to follow the product along the complete lifecycle, to be able to analyze and improve the customer experience. However, you also need to create a role in the organization for a person to be responsible for combining insights from the market and to lead various disciplines in the organization, R&D, Sales, Services. And this is precisely the role of a Product Manager.

This role is very common in the world of software development but not yet recognized in manufacturing companies. If a product manager role already exists in your organization, he/she can tell you how complicated it currently is to get an overall view of the product, and which benefits a digital infrastructure would bring to the job. Once the product manager is well supported and recognized in the organization, the right skill set to prioritize or discover actions/features will make the products more attractive for consumers. Here the company will benefit.

Conclusion

If your company does not have the role of a product manager in place, your business is probably not yet well enough engaged in the customer journey. There will be broken links and costly processes before you get a fast response to the market. Consider the role of a Product Manager, which will emerge in manufacturing as it has in the software business.

NOTE 1: Just before publishing this post, I read an interesting post from Jan Bosch: Structure Eats Strategy. It fits well in this context.

NOTE 2: The existence of a Product Manager might be a digital-maturity indicator for a company, just as, for classical PLM maturity, the handling of the MBOM (PDM/PLM/ERP) gives insight into a company’s PLM maturity.

Related to the MBOM, please read: The Importance of a PLM data model – EBOM and MBOM

This post is a rewrite of an article I wrote on LinkedIn two years ago, modified to reflect my current understanding. If you follow my blog, in particular the posts related to the business change needed to transform a company into a data-driven digital enterprise, you know that one of the characteristics of digital is the real-time availability of information. This has an impact on everyone working in such an organization. My conversations are in the context of PLM (Product Lifecycle Management); however, I assume my observations are valid for other domains too.

Real-time visibility is going to be the big differentiator for future businesses, and in particular, in the PLM domain, this requires a change from document-centric processes towards data-driven processes.

Documents have a lot of disadvantages. Documents lock information in a particular format, and document handling results in sequential processes, where one person / one discipline at a time is modifying or adding content. I described the potential change in my blog post: From a linear world to fast and circular?

From a linear world to fast and circular

In that post, I described that a more agile and iterative approach to bringing products and new enhancements to the market should have an impact on current organizations. A linear organization, where products are pushed to the market from concept to delivery, is based on working in silos and will be too slow to compete against future, modern digital enterprises. This is because departmental structures with their own hierarchies block the fast flow of information, and often these silos filter or deform the information. It becomes hard to have a single version of the truth, as every department and its management will push for their own measured truth.

A matching business model for the digital enterprise is a matrix business model, where multidisciplinary teams work together to achieve their mission. This approach is well known in the software industry, where parallel and iterative work is crucial to continuously deliver incremental benefits.

Image:  21stcenturypublicservant.wordpress.com/

In a few of my projects, I discovered this correlation with software methodology that I wanted to share. One of my clients was in the middle of moving from a document-centric approach toward a digital information backbone, connecting the RFQ phase and conceptual BOM through design, manufacturing definition, and production. The target was to have end-to-end data continuity as much as possible, meanwhile connecting the quality and project tasks combined with issues to this backbone.

The result was that each individual had a direct view of their current activities, which could be a significant quantity for some people engaged in multiple projects. Just being able to measure these numbers already led to more insight into an individual’s workload. When we discussed the conceptual dashboard for an individual with the implementation team, it led to questions like: “Can the PLM system escalate tasks and issues to the relevant manager when needed?” and “Can this escalation be done automatically?”

And here we started the discussion. “Why do you want to escalate to a manager?” Escalation will only create more disruption and stress for the persons involved. Isn’t the person qualified enough to decide what is important?

One of the conclusions of the discussion was that currently, due to a lack of visibility of what needs to be done, when, and with which urgency, people accept that things get overlooked. So the burning issues get most of the attention, and the manager’s role becomes setting things on fire to get them done.

When discussing this further, it became clear that thanks to the visibility of data, the real critical issues will appear at the top of an individual’s dashboard. The relevant person can immediately see what can be achieved and, where it cannot, take action. Of course, there is the opportunity to work only on the easy tasks and ignore the tough ones (human behavior); however, the dashboard reveals everything that needs to be done – visibility. Therefore, if a person learns to manage their priorities, there is no need for a manager to push anymore, saving time and stress.
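
Conceptually, such a dashboard is just a query over connected data, ranking open items by urgency and due date instead of waiting for a manager to escalate. A small sketch (Python, with hypothetical task fields):

```python
from datetime import date

# Hypothetical task records as they could come from a connected PLM backbone
tasks = [
    {"title": "Approve ECO-1042",      "urgency": 3, "due": date(2017, 11, 5)},
    {"title": "Answer supplier issue", "urgency": 1, "due": date(2017, 11, 2)},
    {"title": "Update MBOM plant B",   "urgency": 2, "due": date(2017, 11, 9)},
]

# Critical items surface at the top: highest urgency first, earliest due date next
dashboard = sorted(tasks, key=lambda t: (-t["urgency"], t["due"]))
for task in dashboard:
    print(task["due"], task["title"])
```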

The ultimate conclusion of our discussion was: implementing a modern PLM environment brings, first of all, almost 100 % visibility: the single version of the truth. This new capability breaks down silos; a department can no longer hide activities behind its departmental wall. Digital PLM allows horizontal multidisciplinary collaboration without the need to go through the management hierarchy.

It would mean Power to the People, provided they are encouraged to take it. And this was the message to management: “You have to change too: empower your people.”

What do you think – will this happen? This was my question in 2015.  Now two years later I can say some companies have seen the potential of the future and are changing their culture to empower their employees working in multidisciplinary teams. Other companies, most of the time with a long history in business, are keeping their organizational structure with levels of middle management and maintain a culture that consolidates the past.

Conclusion

A digital enterprise empowers individuals, allowing companies to become more proactive and agile instead of working within optimized silos. In silos, it appears that middle management does not trust individuals to prioritize their work. The culture of a company and its ability to change are crucial for the empowerment of individuals. Over the last two years, there has been progress in understanding the value of empowered multidisciplinary teams.

Is your company already empowering people? Let us know!

Note: After speaking with Simon, one of my readers who always gives feedback from reality, we agreed that multidisciplinary teams are very helpful for organizations. However, you will still need a layer of strategic people securing standard and future ways of working, as the project teams might be too busy doing their job. We agreed this is the role for modern middle management. Do you agree?

Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1). There were some questions related to the abbreviation PDT. Looking into the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLAs in this world.

Microsoft’s view on the digital twin

Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft’s Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared with us Microsoft’s viewpoint of a Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers, and the areas these companies are focusing on. From these customers, it was clear most companies focused on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, were not yet common. As I discussed in the earlier post, this matches my observations, where creating a digital thread between products in operation is considered a quick win. Establishing an end-to-end relationship between products in operation and their design requires many more steps. Read my post: Why PLM is the forgotten domain in digital transformation.

When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results from products in the field. Connecting a digital twin in a vendor-specific framework will create legacy, vendor lock-in, and a less open environment for using digital twins. A point that I also raised in my presentation later that day.

Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts a failure before the scheduled maintenance stop and therefore requests to run at lower performance so it can reach the maintenance stop without disruption.
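
The decision logic of such a self-protecting asset can be sketched in a few lines. This is only an illustration (Python, with an invented linear derating rule), not Microsoft’s implementation:

```python
def plan_operation(remaining_useful_life_h: float,
                   hours_to_maintenance: float) -> str:
    """If the predicted remaining useful life does not reach the next
    scheduled maintenance stop, request derated operation to stretch it."""
    if remaining_useful_life_h >= hours_to_maintenance:
        return "run at full performance"
    # Assume (for illustration only) that life scales inversely with load
    derate = remaining_useful_life_h / hours_to_maintenance
    return f"run at {derate:.0%} performance until the maintenance stop"

print(plan_operation(remaining_useful_life_h=300, hours_to_maintenance=400))
```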

Closing the lifecycle loop

Sustainability and the circular economy have been themes at PDT for some years now too. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and mapped out how technology could be a good enabler for addressing several of the global trends.

Next, Torbjörn took us through the reasons for and possibilities (methodologies and tools) of product lifecycle circularity, developed through the ResCoM project in which Eurostep participated.

The ResCoM project (Resource Conservative Manufacturing) was a project co-funded by the European Commission and recently concluded. More info at www.rescom.eu

Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, for which ISO 10303 is the best candidate.

Later in the afternoon, there were three sessions in a separate track related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).

Connecting many stakeholders

Rebecca Ihrfors, CIO of the Swedish Defense Material Administration (FMV), shared her plans for transforming the IT landscape to harmonize the currently existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead of keeping up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data they retrieve from industry and the way they offer it to the armed forces in a secure way. There is a need for standards and interoperability.

The positive point from this presentation was that several companies in the audience, delivering products to the Swedish Defense, could start to share and adapt their viewpoints on how they could contribute.

Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness, which would fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.

PLM something has to change – bimodal and more

In my presentation, which you can download from SlideShare here: PLM – something has to change, my main points were related to the fact that companies seem to understand that something needs to happen to really benefit from a digital enterprise. The rigidness of large enterprises and their inhibitors to transform are more related to human issues and incompatibility with the future.

How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).

Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated by a concept developed based on Aras.

By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long-term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM  platform “overlay” strategy explained.

Marc Halpern closed the conference describing his view on how companies could navigate to the Next Generation PLM by explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.

Principle 1 was the bimodal strategy itself, as the image shows.

Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity states in their organization, starting from ad hoc, through departmental and enterprise-based, to harmonized and fully digitally integrated. These maturity steps also have to be taken into account when planning future steps.

Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors or service providers as it relates to a company culture, not easy to change and navigate in a particular direction.

Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it in your business) and scale (enable this new approach to exist and grow for the future).

Conclusions

This post concludes my overview of PDT Europe 2017. Looking back, there was quite an aligned view of where we are all heading with PLM and related topics. There is hype and there is reality, and I believe this conference was about reality, giving all attendees good feedback on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.

Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!

PDT Europe is over, and this year it was a surprisingly aligned conference, showing that ideas and concepts for modern PLM align more and more. Håkan Kårdén opened the conference mentioning the event was fully booked, with about 160 attendees from over 19 countries. With a typical attendance of approx. 120 participants, this showed that the theme of the conference, Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise, was very attractive and real. You can find a history of tweets by following the hashtag #pdte17.

Setting the scene

Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people’s skills are more critical factors for a successful adoption of modern PLM. Something that would repeatedly be confirmed by other speakers during the conference.

Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through various digital threads in the different stages of the product lifecycle. Peter concluded with the message that we are still in a learning process, redefining optimal processes for PLM using Model-Based approaches and Digital Threads, and that thanks (or due) to digitalization these changes will be rapid. He ended with an overall conclusion that we should keep in mind:


It isn’t about what we call digitalization; it is about delivering value to customers and all other stakeholders of the enterprise.

Next, Marc Halpern busted the myth of Digital Twins (according to his session title) and looked into realistically planning for them. I am not sure if Marc smashed any myths, although it is certain the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances, depending on its usage. For sure, it is not just a 3D virtual model.

There are still many areas to consider when implementing a digital twin for your products. Depending on what and how you apply the connection between the virtual and the physical model, you have to consider where your vendor really stands in maturity and avoid lock-in to their approach. In particular, in these early stages, you cannot be sure which technology will last longer, and data ownership and confidentiality will play an important role. And rather than chasing quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.

Industry sessions

Next, we had industry-focused sessions related to a lifecycle Model-Based enterprise and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service opportunities.

The first presentation, from Väino Tarandi, professor of IT in Construction at KTH Sweden, presented his findings related to BIM and GIS in the context of the lifecycle: a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which was already an operational challenge for stakeholders in the construction industry, now extended with the concept of the lifecycle. So far these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.

Concepts for the industrial approach could be learned from Outotec, as you might understand later in this post. Of course, the difference is that Outotec is aiming for data ownership along the lifecycle, whereas in the construction industry each silo is often handled by a different contractor.

Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport – see the image on the left. I have worked around this domain in the Netherlands, where asset management for road infrastructure and asset management for rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Again, Outotec’s example is also about having relevant information to increase service capabilities, and the Swedish Transport Administration is similarly aiming to have the right data for their services. When you look at the challenges reported by Fredrik, I assume he can find the answers in other industries’ concepts.

Outotec’s presentation on managing the installed base to unlock service opportunities, given by Sami Grönstrand and Helena Guiterrez, was entertaining, easy to digest, and well-paced. Without being academic, they explained the challenges of a company with existing systems in place moving towards digital twin concepts and the related data management and quality issues. Their practical example illustrated that a clear target, in this case understanding a customer-specific environment better in order to sell better services, can be achieved by rational thinking and doing, a typical Finnish approach. All this including the “bi-modal approach” and people change management.

Future Automotive

Ivar Hammarstadt, Senior Analyst Technology Intelligence for Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.

 

Next, Jeanette Nilsson and Daniel Adin from Volvo Group Truck shared their findings from an evaluation project of more than one year, in which they evaluated the major PLM vendors (Dassault Systemes / PTC / Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.

They concluded that none of the vendors were able to support the full Volvo Trucks complexity in an out-of-the-box manner. It was also a good awareness project for the Volvo Trucks organization: understanding that a common system for 3D geometry reduces the need for data transfers and manual data validation. Cross-functional iterations can start earlier, and more iterations can be performed, which supports a shortening of lead time and improves product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.

Future Aerospace

Kenny Swope from Boeing talked us through Boeing’s potential journey towards a Model-Based Enterprise. Boeing has always challenged themselves and their partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.

To approach the future state, Boeing aims to align their business with a single architecture for all aspects of the company. Starting with collecting capabilities (over 400 in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process is to look at the components of a value stream to see if they could be fulfilled by a service. In this way, you design your business for a service-oriented architecture, still independent of any system constraints. As Kenny stated, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to learn from Kenny next year how much progress towards a model-based enterprise has been achieved and which values have been confirmed.

Gearing up for day 2

Martin Eigner took us in high-speed mode through his vision and experience of working in a bimodal approach with Aras, supporting legacy environments with a modern federated layer on top to handle the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.

The last formal presentation of day one was from Nigel Shaw from Eurostep Ltd, who took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines all, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel’s slide showing the multiple models different disciplines can have of an airplane (1948). Similar to the famous “swing” cartoon, it illustrates that every single view can be entirely different from the purpose of the product.

The next challenge is whether these models stay consistent and still describe the same initially specified system. On top of that, even the use of various modeling techniques and tools will lead to differences in the system. And the final challenge on top is managing change over the system’s lifecycle. From here, Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only for the physical and the virtual twin. Comparing these needs of a model-based enterprise through its lifecycle, Nigel concluded that PLCS as a framework provides an excellent fit to manage such complexity.

Finally, after a panel discussion, which was more a collection of opinions as the target was not necessarily to align in such a short time, it was time for the PDT dinner, always an excellent way to share thoughts and verify them with your peers.

Conclusion

Day 1 was over before you knew it, without a moment of boredom, and so, I hope, is this post. Next week I will conclude my review of the PDT conference with some more details about my favorite topics.

 

As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I have been reading up on experiences relevant to a data-driven approach. During the PDT Europe conference, we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits of a model-based enterprise is that information can be shared without documents having to be converted to a particular format, saving resource costs and bringing unprecedented speed of information availability, as we are used to in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs / Microservices or interfaces in an almost real-time manner. See my post Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on the data.

Classical ECO/ECR processes can become highly automated when the data is reliable and the company’s strategy is captured in rules. In a data-driven environment, there will be much more granular data requiring some kind of approval status. We cannot do this manually anymore, as it would kill the company: too expensive and too slow. Hence the need for algorithms.
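
As a minimal sketch of this idea (Python, with invented rule thresholds and field names), a company strategy captured in rules could auto-process a change request on reliable data like this:

```python
def auto_approve(change: dict) -> bool:
    """Hypothetical rule set: approve an engineering change automatically
    when it stays within the boundaries the company has defined."""
    rules = [
        change["cost_impact_eur"] < 1000,         # minor financial impact
        not change["affects_form_fit_function"],  # no FFF impact
        change["data_quality_score"] >= 0.95,     # only act on reliable data
    ]
    return all(rules)

eco = {"cost_impact_eur": 250, "affects_form_fit_function": False,
       "data_quality_score": 0.98}
print("auto-approved" if auto_approve(eco) else "route to the change board")
```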

What is understandable data?

I have tried to avoid academic language as long as possible, but now we have to be more precise, as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through inflated expectations and disillusionment.

This was interesting, as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop, coming from various companies, agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM was more or less hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why MDM was not a real topic in the PLM space at that time. We were, and still are, focusing too much on information stored in files and documents. The only area touched by MDM was the BOM and Part definitions, as these objects also touch the ERP and After-Sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below, as it comes close to my experience in the process industry.

Here we see two MDM architectures, the one on the left driven from ERP. The one on the right could be based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets for many years across the lifecycle and its relatively stable environment. Other sectors are less standardized or depend so much on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company today and look for providers of MDM solutions, you would discover that their value propositions are based on technology capabilities: bringing data together from different enterprise systems in the way the customer thinks it should be organized. More a toolkit approach than an industry approach. And in the cases where there is an industry approach, it is rare that this approach relates to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated, too diverse, too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when I combined the previous observations with a recent post on Engineering.com from Tom Gill: PLM Initiatives Take On Master Data Transformation, I came to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the data definition specific to their industry.

Tom Gill explains in his post the business benefits and value of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is no longer based only on the BOM. PLM now encompasses the full lifecycle of a product instead of mainly an engineering view. Modern PLM systems, or as CIMdata calls them, Product Innovation Platforms, manage a complex data model based on a model-driven approach. These entities are used across the whole lifecycle and could therefore be the best start for an industry-specific MDM approach. Now only the industries have to follow….

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …? The content of his post is relevant; I would only change the title to Who is responsible for what data when, as I believe that in a modern digital enterprise there is no ownership anymore – it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, in modern PLM implementations the Master Data Model might be based on industry-specific data elements, managed and controlled from the PLM data model.

 

Do you follow my thoughts / agree?

At this moment, there are two approaches to implementing PLM. The most common practice is item-centric; model-centric will potentially be the best practice for the future. Perhaps your company is still using a method from the previous century called drawing-centric. In that case, you should read this post with even more attention, as there are opportunities to improve.

 

The characteristics of item-centric

In an item-centric approach, the leading information carrier is an item, also known as a part. The term part is sometimes confusing in an organization, as it is associated with a 3D CAD part. In SAP terminology, the item is called Material, which is sometimes confusing for engineering, as they consider Material to be the raw material. Item-centric is an approach where items are managed and handled through the whole lifecycle. In theory, an item can be a conceptual item (for early estimates), a design item (describing the engineering intent), a manufacturing item (defining how an item is consumed) and potentially a service item.

The picture below illustrates the various stages of an item-centric approach. Don’t focus on the structure, it’s an impression.

It is clear these three structures are different and can contain different item types. To read more about the details of an EBOM/MBOM approach, read these posts on my blog:

Back to item-centric. This approach means that the item is the leading authority for the product/part. The ID and revision describe the unique object in the database, and the status of the item tells you the current lifecycle stage of the item. In some cases, where your company makes configurable products, the relation between two items can also define effectivity characteristics, like date effectivity, serial-number effectivity and more. From an item structure, you can find its related information in context. The item points to the correct CAD model, the assembly or related manufacturing drawings, and the specifications. In the case of an engineering item, it might point towards approved manufacturers or approved manufacturing items.

Releasing an item or a BOM means the related information in context needs to be validated and frozen too. If your company works with drawings for manufacturing, these drawings need to be created, correct, and released, which can sometimes be an issue due to last-minute changes. The figure above gives just an impression of the potential data related to an item. It is important to mention that reports, which are also considered documents, do not need an approval, as they are more a snapshot of the characteristics at the moment of generation.
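
A condensed sketch of such an item-centric business object (Python, with invented field names) showing the ID/revision pair, the lifecycle status, and the release rule described above:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """Hypothetical item-centric record: the ID + revision pair identifies
    the unique object; the status reflects the lifecycle stage."""
    item_id: str
    revision: str
    status: str = "In Work"  # e.g. In Work -> Released -> Obsolete
    documents: list = field(default_factory=list)  # CAD models, drawings, specs
    approved_manufacturers: list = field(default_factory=list)

    def release(self) -> None:
        # Releasing the item implies its related information in context
        # has been validated and frozen too.
        if any(doc["status"] != "Released" for doc in self.documents):
            raise ValueError("all related documents must be released first")
        self.status = "Released"

item = Item("10-4711", "B",
            documents=[{"name": "10-4711.drw", "status": "Released"}])
item.release()
print(item.item_id, item.revision, item.status)
```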

The advantages of an item-centric approach are:

  • End-to-end traceability of information
  • Can be implemented in an evolutionary approach after PDM-ERP without organizational changes
  • It enables companies to support sharing of information
  • Sharing of information forces companies to think about data governance
    (though not every company wants to invest in that topic)

The main disadvantages of an item-centric approach are:

  • Related information on the item is not in context and therefore requires its own management and governance to ensure consistency
  • Related information is contained in documents, where availability and access are not always guaranteed

Still, the item-centric approach brings big benefits to a company that was working in a classical drawing-driven PDM-ERP approach. An additional remark: not every company will benefit from an item-centric approach, as typically Engineering-to-Order companies might find that this method creates too much overhead.

The characteristics of Model-Centric

A model-centric approach is considered the future approach for modern enterprises as it brings efficiency, speed, multidisciplinary collaboration and support for incremental innovation in an agile way. When talking about a model-centric approach, I do not mean a 3D CAD model-centric approach. Yes, in case the product is mature, there will be a 3D Model serving as a base for the physical realization of the product.

However, in the beginning, the model can still be a functional or logical model. In particular, for complex products, model-based systems engineering might be the basis for defining the solution. Actually, when we talk about products that interact with the outside world through software, we tend to call them systems. This explains why model-based systems engineering is becoming more and more the recommended approach to make sure the product works as expected, fulfills all the needs for the product, and creates a foundation for incremental innovation without starting from scratch.

Where the model-based architecture provides a framework for all stakeholders, the 3D CAD model will be the basis for a digital thread towards manufacturing. By linking parameters from the logical and functional models to the physical model, a connection is created without the need to create documents or input files for other disciplines. Adding 3D annotations to the 3D CAD model and relating manufacturing process steps to the model provides a direct connection to the manufacturing process.

The primary challenge of this future approach is to have all these data elements (requirements, functions, components, 3D design instances, manufacturing processes & resources) connected in a federated environment (the product innovation platform). Connecting, versioning and baselining are crucial for a model-centric approach. This is what initiatives like Industry 4.0 are now exploring through demonstrators and prototypes to get to a coherent collection of managed data.
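
To make the idea of linked parameters concrete, here is a minimal sketch (Python, with invented names): the models reference one shared data element instead of copying values into documents, so a change is immediately visible in every view:

```python
class Parameter:
    """A single shared data element; models link to it instead of copying it."""
    def __init__(self, name: str, value: float):
        self.name, self.value = name, value

max_weight = Parameter("max_weight_kg", 2.5)  # hypothetical requirement

logical_model  = {"function": "enclosure", "constraint": max_weight}
physical_model = {"cad_part": "ENC-001",   "constraint": max_weight}

# Change impact is directly visible in every linked view; no documents to update.
max_weight.value = 2.2
print(physical_model["constraint"].value)  # 2.2, same object as in the logical model
```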

Once we are able to control this collection of managed data, concepts like the digital twin or even the virtual twin can be exploited, linking data to a single product instance in the field.

Also, the model can serve as the foundation for introducing incremental innovation, bringing in new features. As the model-based architecture provides direct visibility of change impact (there are no documents to study), it will be extremely lean and cost-efficient to innovate on an existing product.

Advantages of model-centric

  • End-to-end traceability of all data related to a product
  • Extremely efficient in data-handling – no overhead on data-conversions
  • Providing high-quality understanding of the product with reduced effort compared to drawing-centric or item-centric approaches
  • It is scalable to include external stakeholders directly (suppliers/customers) leading to potential different, more beneficial business models
  • Foundation for Artificial Intelligence at any lifecycle step.

Disadvantages of model-centric

  • It requires a fundamentally different way of working compared to the past. Legacy departments, legacy people, and legacy data do not fit directly into the model-centric approach. A business transformation is required, not an evolution.
  • It is all about sharing data, which requires an architecture built to share information across disciplines: not through a service bus, but as a (federated) platform of information.
    A platform requires strong data governance, both for the data dictionary and for the authorizations defining which discipline is leading/following.
  • There is no qualified industrial solution from any vendor yet. There is advanced technology, there are demos, but to my knowledge there is no 100 % model-centric enterprise yet. We are all learning, trying to distinguish reality from the hype.

 

Conclusions

The item-centric approach is the current best practice for most PLM implementations. However, it has the disadvantage that it is not designed for a data-driven approach, the foundation of a digital enterprise. The model-centric approach is new, and some facets already exist. However, for the total solution, companies, vendors, consultants, and implementers are all learning step by step how it all connects. The future of model-centric is promising and crucial for survival.

Do you want to learn where we are now related to a model-centric approach?
Come to PDT2017 in Gothenburg on 18-19th October and find out more from the experts and your peers.

During my summer holidays, I read some fantastic books to relax the brain. Confessions from Jaume Cabré was an impressive novel, and I finished Yuval Noah Harari’s book Sapiens.

However, to get my PLM-twisted brain back on track, I also decided to read the book “The Death of Expertise” from Tom Nichols, with the thought-provoking subtitle: “The Campaign Against Established Knowledge and Why it Matters.”

I wanted to read it and understand if and how this would apply for PLM.

Tom Nichols is an American, so you understand he has many examples from his own experience to support his statement, like the anti-vaccination “experts”, the climate change “hoax” and an “expert” tweeting president in his country who knows everything. Besides these obvious examples, Tom explains in a structured way how, due to broader general education and the internet, the distance between an expert and an average person has disappeared, and facts and opinions seem to be interchangeable. I talked about this phenomenon during the Product Innovation conference in Munich 2016: The PLM identity crisis.

Further into the book, Tom becomes a little grumpy and starts to complain about the internet, Google and even Wikipedia. These information resources so often provide fake or skin-deep information, not scientifically proven by experts. It reminded me of a conference I attended in the early nineties of the previous century. An engineering society had organized this conference to discuss the issue that finite element analysis was becoming more and more available to laymen. The affordable simulation software would be used by untrained engineers, and they would make the wrong decisions. Constructions would collapse, machines would fail. Looking back now, we can see that the liberation of finite element analysis has led to wider use of simulation technology, providing better products, and when really needed, experts are still involved.

I have the same opinion about the internet, Google, and Wikipedia. They provide information rapidly. Still, you need to do fact-checking and look at multiple sources, even if you have already found the answer you like. Usually, when I do my “research” using the internet, I try to find different sources with different opinions, and if possible also from various countries. What you will discover is that on the internet there is often detailed information, but not in the headlines of these pages. To get down to the details, we will need experts for certain cases, but we cannot turn the clock back to the previous century.

What about PLM Expertise?

In the case of PLM, it is hard to find real expertise. Although PLM is recognized as a business strategy / a domain / an infrastructure, PLM has so many faces depending on the industry and its application. It will be hard to find an expert who understands it all, and I assume headhunters can confirm this. A search for “PLM Consultant” on LinkedIn gives me almost 4000 hits, and when searching for “PLM Expert,” this number is reduced to fewer than 200. With only one source of information (LinkedIn), these figures do not really give an in-depth result (as expected!).

However, what is a PLM expert? Recently I wrote a post sharing the observation that many PLM product- or IT-focused discussions miss the point of education (see PLM for Small and Medium Enterprises – It is not the software). In that post, I referred to an initiative from John Stark striving for the recognition of the PLM professional. You can read John’s follow-up on this activity here: How strong is the support for Professional PLM? Would a PLM Professional bring expertise?

I believe that when a company understands the need for PLM, it has to build this knowledge internally. Building knowledge is a challenge for small and medium enterprises. It is a long-term investment contributing to the viability of the company. Support from a PLM professional can help. However, like the job of a teacher, it is about the skill set (subjects, experience) and the motivational power of the person. A certificate alone won’t help you select a qualified person.

Conclusion

We still need PLM expertise, and it takes time to build it. Expertise is something different from an (internet) opinion. When gaining PLM expertise, use the internet and other resources wisely. Do not stop at the headlines of an internet page. Go deeper than the marketing pages of PLM-related companies (vendors/implementers). Take time, and hire experts to help you, not to release you from your responsibility to build up the expertise.

 

Note: If you want to meet PLM experts and get a vendor-independent taste of PLM, join me at PDT Europe 2017 on 18-19 October in Gothenburg. The theme of the conference: Continuous transformation of PLM to support the Lifecycle Model-Based Enterprise. The conference is preceded on 17th October by CIMdata’s PLM Roadmap Europe 2017. Looking forward to meeting you there!

PLM holiday thoughts

July and August are the months when privileged people go on holiday. Depending on where you live and work, it can be a long weekend or a long month. I plan to give my PLM-twisted brain a break for two weeks. I am not sure if it will happen, as Greek beaches have always inspired philosophers. What do you think about “PLM on the beach”?

There are two topics that keep me intrigued at this moment, and I hope to experience more about them the rest of the year.

Moving to Model-Based processes

I believe we are all becoming immune to the term “Digital Transformation” (11.400.000 hits on Google today). I have talked about digital transformation in this context many times too. Change is happening. The classic ways of working were based on documents: a container of information, captured on paper (very classical) or captured in a file (still current).

As every stakeholder in a company (marketing, engineering, manufacturing, suppliers, services, customers, and management) requires a different set of information, many pieces of information, all referring to the same product, are parsed and modified into other documents. It is costly and slow to get a complete view of what is happening in the business. Meanwhile, all these information transformations (with Excel as the king) create an overhead for information management, both at the IT level and even more for the non-value-added resources who are manipulating information for the next silo/discipline.

What we have learned from innovative companies is that a data-driven approach, where more granular information is stored uniquely as data objects instead of in document containers, brings huge benefits. Information objects can be shared where relevant along the product lifecycle, and without the overhead of people creating and converting documents, the stakeholders become empowered, as they can retrieve all the information objects they need (if allowed). We call this the digital thread.

The way to provide a digital thread for manufacturing companies is to change the way they organize their product development and delivery processes. A model-based approach is required. I wrote about this a year ago in the post: Digital PLM requires a Model-Based Enterprise. The term “Model-Based” also has many variations (67.800.00 hits on Google today). Some might consider the 3D MCAD model the center of information, both for engineering and manufacturing. A good overview in the video below:

Others might think about a behavior/simulation model of the product for simulating and delivering a digital twin, often referred to in the context of model-based design (MBD).

And ultimately, a model-based approach can be integrated with systems engineering into Model-Based Systems Engineering (MBSE), allowing all stakeholders to collaborate in a data-driven manner around complex products.

You can learn a lot about this during the upcoming PDT Europe conference on 18-19 October in Gothenburg. Concepts and experiences will be shared, and my contribution to the conference will be all about the challenges and lessons learned from the transformation process companies are embarking on to become model-based.

PLM and ALM

A second topic that is becoming more and more relevant for companies is how to combine the domain of product development with the domain of the application software empowering these products. The challenge here is that we have no mature concepts yet for connecting the two domains. It reminds me of the early PDM implementations, where companies implemented their PDM system for MCAD software and documents. All the electrical work was done disconnected, in separate systems, and somewhere in the product lifecycle the information from MCAD and ECAD was merged in the bill of materials and documents, mostly manually and with considerable overhead for the people consolidating the data. Modern PLM systems have found best practices to manage a combination of mechanical and electronic components through an EBOM, even connecting embedded software as an item in the BOM, as sketched below.
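As a hedged illustration of that last point, here is a minimal sketch, again in Python and with invented item numbers and descriptions, of an EBOM in which mechanical parts, an electronic board, and an embedded software build all appear as items in one structure:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    item_type: str   # "mechanical", "electronic" or "software"
    description: str
    children: list = field(default_factory=list)

# EBOM: one structure holding mechanical, electronic and software content
controller = Item("IT-1001", "mechanical", "Controller assembly", children=[
    Item("IT-2001", "mechanical", "Housing, die-cast"),
    Item("IT-2002", "electronic", "PCB assembly, main board"),
    Item("IT-2003", "software",   "Embedded firmware, build 4.2.1"),
])

def walk(item: Item, level: int = 0) -> None:
    """Print the EBOM tree with its item types."""
    print("  " * level + f"{item.item_id} [{item.item_type}] {item.description}")
    for child in item.children:
        walk(child, level + 1)

walk(controller)
```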

Now, more and more, the behavior and experience of products are driven by software. Sensors and connectivity of data bring new capabilities and business models to the market. Customers are getting better connected, and the companies delivering these solutions can now act much faster, based on trends or issues experienced in the field.

The challenge, however, is that the data coming from the systems and the software defining the behavior of the products is, most of the time, managed in a separate environment: the ALM (Application Lifecycle Management) environment. In the ALM environment, delivery of new solutions can be extremely fast and agile, creating a disconnect between the traditional product delivery processes and the software delivery processes.

Companies are now learning how to manage the dependencies between these two domains, as consistency of the requirements and features of the products is required. Due to the fast pace of software changes, it is almost impossible to connect everything to the PLM product definition. PLM vendors are working on concepts to connect PLM and ALM through different approaches. Other companies might believe that their software process is crucial and that the mechanical product becomes a commodity. Could you build a product innovation platform starting from the software platform, as some of the old industry giants believe?
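To illustrate one possible (assumed and heavily simplified) shape of such a PLM-ALM connection, the sketch below, with invented identifiers, links fast-moving software releases to a slow-moving product revision and checks that a release covers the requirements allocated to software:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlmItemRevision:
    item_id: str
    revision: str        # slow-moving product/hardware revision

@dataclass(frozen=True)
class AlmRelease:
    release: str         # fast-moving software release
    satisfies: frozenset # requirement IDs covered by this release

# Dependency: which software releases are approved for which product revision
approved = {
    PlmItemRevision("PRD-100", "B"): [
        AlmRelease("fw-4.2.0", frozenset({"REQ-12", "REQ-17"})),
        AlmRelease("fw-4.2.1", frozenset({"REQ-12", "REQ-17", "REQ-21"})),
    ],
}

def is_consistent(rev: PlmItemRevision, release: AlmRelease,
                  required: set) -> bool:
    """A release may ship on a product revision only if it is approved
    and covers all requirements allocated to software for that revision."""
    return release in approved.get(rev, []) and required <= release.satisfies

print(is_consistent(PlmItemRevision("PRD-100", "B"),
                    AlmRelease("fw-4.2.1", frozenset({"REQ-12", "REQ-17", "REQ-21"})),
                    {"REQ-12", "REQ-21"}))
```

In reality the dependency management is far more complex; the sketch only shows why an explicit, checkable link between the two domains is needed.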

PLM combined with ALM concepts are the ones to follow, and I am looking forward to meeting the first company that has implemented a consistent flow between the world of hardware and the world of software. So far, there are many solutions on slides; the reality and the legacy at this moment are still inhibitors for the next step.

Conclusion

There is still a lot to discover and execute in the domain of PLM. Moving to a data-driven enterprise with all stakeholders connected is a challenging journey. Can we build robust concepts taking accuracy, security, and speed into account? I believe so, in particular when dreaming at the beach.

 

Bye for now

Does history repeat itself, even in the PLM domain? Last week I read various blog posts related to PLM and Small and Medium Enterprises (SME). A good summary of these thoughts can be found in Oleg Shilovitsky’s post: How PLM vendors can find a formula to serve midsize manufacturing companies. Usually, the conclusions are:

  • Smaller enterprises cannot afford the “expensive” PLM solutions
  • Existing PLM systems are too complex to implement
  • Lack of usability
  • PLM systems are not flexible enough to implement
  • PLM should be cloud-based (reducing IT costs and efforts).

Jim Brown from Tech-Clarity published an e-book, Finding PLM to Fit Mid-Sized High-Tech Companies, and Oleg chimed in on that post, as Jim talks about Core PLM, which was more design-oriented than BOM-oriented. Read these two posts, as they give a good insight into PLM vendor thinking.

In 2006, Oleg and I worked at SmarTeam, where we defined and built a “Core PLM” solution targeting mid-market companies. This core PLM solution, called SmarTeam Engineering Express (SNE), contained both pre-configured CAD integrations and BOM practices (EBOM-MBOM). Combined with documented best practices, a pre-configured methodology, and workflows, this environment could be implemented relatively quickly (if the implementer did not want to earn extra money on services).

There was even ROI provided by a launching customer: A PLM success story with ROI (2012)

As part of the SME focus, SmarTeam people interviewed small and medium enterprises to understand their needs in detail. They mentioned the obvious points:

  • Easy to Use (Usability)
  • Quickly Deployable (Best Practices – pre-defined processes)
  • Easy & rapid Configurable and low IT-costs.

Interestingly enough, SmarTeam’s enterprise customers requested the same capabilities. It makes you realize there is no unique difference in PLM needs between mid-market companies and large enterprises. I believe the major difference comes from education, the company’s culture, and where the PLM decision is made. Let’s explore this.

Lack of education

Small and Medium Enterprises usually lack resources who can spend time on planning or thinking about a new business strategy. The work needs to be done first. SME companies hire experts for skills that bring immediate value; strategic thinking comes second. An engineering department does not hire a strategist; it hires a qualified and promising engineer.

These new hires are normally not educated in standard PLM concepts like ECR, ECO, Configuration Management, or PLM-ERP best practices (EBOM/MBOM). In an engineering study, these practices/processes are not considered critical, as they are about collaboration and not about skills. The PLM capabilities engineering students learn are the basic functionalities they need to master when working with their (CAD) tools.

SMEs use their own best practices, based on years of experience (before PLM existed), and when they select a PLM system, it is mostly used as a data management tool rather than as an infrastructure to streamline processes. Combined with the fact that every PLM vendor has its own definition of PLM, it is hard to have a unified way of thinking for bringing in new ways of working supported by a best-in-class PLM system. The lack of standards and education is illustrated by a recent post from John Stark: Should PLM become a profession?

Of course, you can educate yourself on PLM. CIMdata is well-known for its training program, and John Stark and others can educate you on PLM. Have a look at the interesting new startup SharePLM. PLM is about sharing, and I try to share PLM experiences too, by coaching, lecturing, and through my blog posts. My most-read posts over the past years are ECO/ECR for Dummies, The importance of a PLM data model: EBOM – MBOM and Bill of Materials for Dummies – ETO, illustrating that people around the world want to be educated on PLM. But can they influence their (non-educated) management?

SME management considers PLM an engineering tool. They want their employees to work with the best tools, and the management’s focus is on reducing costs and improving efficiency. Different ways of working or different business models, enabled by digitalization, are not necessarily on the SME management agenda. However, with the digital revolution on its way, strategic thinking becomes crucial for survival. In that context, a recent post from Beth Stackpole on Digital Engineering says it all: PLM Knowledge Gap Hampers New Engineers.

Difference in culture

Small and Medium Enterprises often rely on close collaboration between people, all working with their best-in-class tools per discipline. Collaboration happens through email, personal relations, and Dropbox-like sharing tools. They know their peers, and people rely on intrinsic knowledge.

 

A large enterprise often consists of a collection of business units that could each be considered a Small or Medium Enterprise. To create synergies and gain IT-benefits, the management of large enterprises wants to standardize on processes (naming and steps) and tools. Standardized processes allow the management to compare and benchmark the business units. Standardized tools, of course, reduce the overall IT costs.

Large enterprises usually have staff with the strategic task of working on standardized and optimized processes for the future. These people have the time to discover and be educated on the values of PLM, supported by strategic advisors who know the value of PLM. In these companies, the decisions made for PLM are top-down decisions. Usability, functions/features, and costs are usually crucial for bottom-up decisions; for a top-down decision, an aligned vision, a matching roadmap, management value, and costs are usually the main topics.

Conclusion

Ten years ago, I believed Small and Medium Enterprises would benefit from a special offering with a focus on usability, pre-configured environments, and best practices. I believe this is now on the agenda for all PLM vendors, some perhaps struggling with their legacy. However, cloud offerings will become more and more similar. Therefore, PLM education is, in my opinion, still the missing point for SMEs. An educated management and educated students could increase the value delivery of PLM by understanding the right target and managing the expectations correctly.

What do you think? Will there ever be a best-in-class PLM offering for SMEs, or do you believe in the human factor, where education and understanding are crucial? Looking forward to your opinion!
