
After a short summer break with almost no mention of the word PLM, it is time to continue this series of posts exploring the future of “connected” PLM. For those whose memory also got a summer clean-up, here is a short recap:

In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development: the item-centric approach.

In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified start with the ERP and CRM platforms (the oldest domains).

Next come the MES and PIP platforms (modern domains to support manufacturing and product innovation in more detail) and the IoT platform (needed to support connected products and customers).

In part 3, I explained what data-driven means and how data-driven is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. In that post, I ended up with seven topics related to data-driven, which I will discuss in upcoming posts.

Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.

 

Topic 1 for this post:

Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.

 

Platform or a collection of systems?

One of the first (marketing) hurdles to take is understanding the difference between a data platform and a collection of systems that work together, sold as a platform.

In 2017, CIMdata published an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata’s definition is extensive and covers the full scope of product innovation. Of course, you can also find platforms that start from a more focused process.

For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).

The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid these capabilities being limited, a platform should be open and able to connect with other data sources that might be available either locally or centrally.

From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, while still allowing local data to be connected and shielded for performance or IP reasons.
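To make the federated idea a little more tangible, here is a minimal Python sketch (all endpoint URLs, dataset names and the registry itself are hypothetical, purely for illustration) of an app resolving a dataset reference to the platform that masters it and reading the data in place through a REST call, instead of copying it into yet another database:

```python
import requests  # generic HTTP client; any REST-capable library would do

# Hypothetical registry: which platform masters which dataset (illustration only)
DATASET_REGISTRY = {
    "EBOM:PRD-001": "https://pip.example.com/api/v1/datasets/EBOM:PRD-001",
    "MBOM:PRD-001": "https://mes.example.com/api/v1/datasets/MBOM:PRD-001",
    "FIELD:PRD-001": "https://iot.example.com/api/v1/datasets/FIELD:PRD-001",
}

def read_dataset(dataset_id: str, token: str) -> dict:
    """Resolve a dataset reference to its owning platform and read it in place.

    The data stays where it is mastered; the consumer only gets (filtered)
    access, which is the essence of a federated, data-driven approach.
    """
    endpoint = DATASET_REGISTRY[dataset_id]            # who masters the data?
    response = requests.get(
        endpoint,
        headers={"Authorization": f"Bearer {token}"},  # access control stays with the owner
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    ebom = read_dataset("EBOM:PRD-001", token="demo-token")
    print(ebom)
```

The point of the sketch is not the code itself but the principle: the consuming app never owns a copy of the data; it only resolves a reference and asks the mastering platform for (possibly filtered) access.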

The picture below describes the business benefits of a Product Innovation Platform as imagined by Accenture in 2014.

Link to CIMdata’s 2014 commentary on Digital PLM HERE

Sometimes vendors sell their suite of systems as a platform. This is a marketing trick because when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems – not really a scalable environment.

In addition, the collaboration between systems in such a marketing platform is sometimes managed through proprietary exchange (file) formats – a practice we saw in the construction industry before cloud connectivity became available. However, a so-called end-to-end solution that works on PowerPoint requires a lot of human intervention when implemented in real life.

 

Not a single environment

There has always been the debate:

“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”

In the system approach, the focus was most of the time on best-in-class tools, with the PLM system providing the data governance. A typical example is the item-centric approach. It reflects the current working culture: people working in their optimized silos, exchanging information between disciplines through (neutral) files.

The platform approach makes it possible to deliver an optimized user interface for the end-user through a dedicated app, assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.

It might be tempting for a platform provider to add as many imaginable data elements as possible to its platform infrastructure. The challenge with this approach is whether all data should be stored in a central data environment (preferably cloud) or federated. And what about filtering IP?

In my post PLM and Supply Chain Collaboration, I described the concept of having an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, while carefully filtering which data is shared in the hub.

It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant for that part of the business. This does not imply there will be no other platforms in the future. Platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.

Will we ever reach a perfectly connected enterprise or society? Probably not. Not because of technology but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all of humanity? Predicting the future is impossible, as Yuval Harari described in his book 21 Lessons for the 21st Century. Worth reading, even though it remains a collection of ideas.

 

Proprietary data model or standards?

So far, when you are a software vendor developing a system, there are no restrictions on how you manage your data internally. In the PLM domain, this means that every vendor has its own proprietary data model and behavior.

I have learned from my 25+ years of experience with systems that the original design of a product combined with the vendor’s culture defines the future roadmap. So even if a PLM vendor were to rewrite all their software to become data-driven, the ways of working and the assumptions will still be based on past experiences.

This makes it hard to arrive at unified data models and a methodology valid for our PLM domain. However, large enterprises like Airbus and Boeing and the major Automotive suppliers have always pushed for standards, as they will benefit the most from standardization.

The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.

You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.

It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible to existing standards. It probably won’t happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected ecosystems.

I do not believe in the toolkit approach where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced, not matching the past.

In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.

The benefit of using standards is that they contain the learnings from many people involved.

 

Conclusion

I did not like writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy

 

Matthias Ahrens shared:  Integrated Product Lifecycle Management (Google translated from German)

Oleg Shilovitsky wrote numerous articles related to technology –
in this context:
3 Challenges of Unified Platforms and System Locking and
SaaS PLM Acceleration Trends

One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.

Last year’s fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings. There were sessions related to the Multiview BOM research, Global Collaboration, and several aspects of Model-Based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.

All topics that I will elaborate on soon. You can refresh your memory through these two links:

This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.

On the second day, we looked, in addition, at the impact new technology has on people and organizations.

Today’s Emerging Trends & Disrupters

Peter Bilello, CIMdata’s President & CEO, kicked off the conference by providing CIMdata’s observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and followers becomes bigger and bigger.

Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.

Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.

I fully agreed with Peter’s final slide – we have to actively rethink and reshape PLM – not by calling it different but by learning, experimenting, and discussing in the field.

Digital Transformation Supporting Army Modernization

An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform at the US Army. Raj walked us through some of the US Army’s challenges, and he gave us some fantastic statements to think about. Although an Army cannot be compared with a commercial business, its target remains to always be ahead of the competition and to be aware of the competition.

Where we would say “data is the new oil”, Raj Iyer said: “Data is the ammunition of the future fight – as fights will more and more take place in cyberspace.”

The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.

Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army’s current and future challenges. Their primary mission remains to stay ahead of the competition.

Joining up Engineering Data without losing the M in PLM

Nigel Shaw’s (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, he explained that, as we live in a world of HTTP(S) linking, we create new ways of connecting information. The link becomes an essential artifact in our information model.
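To illustrate what it means to treat the link itself as a managed artifact (my own simplified sketch, not Eurostep’s implementation), the snippet below models cross-system links between engineering datasets as first-class records with their own identity, type and lifecycle status:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Link:
    """A link as a first-class information artifact, not a hidden foreign key."""
    link_id: str
    source: str          # e.g. "PLM:REQ-0042" (hypothetical identifiers)
    target: str          # e.g. "SIM:MODEL-7"
    link_type: str       # e.g. "verified-by", "applies-to"
    created: date = field(default_factory=date.today)
    status: str = "proposed"   # links have a lifecycle of their own

links = [
    Link("L-001", "PLM:REQ-0042", "SIM:MODEL-7", "verified-by"),
    Link("L-002", "SIM:MODEL-7", "PLM:PART-1138", "applies-to"),
]

def impacted_by(node: str, all_links: list[Link]) -> list[str]:
    """Naive traversal: which elements are directly connected to a given node?"""
    return [l.target for l in all_links if l.source == node] + \
           [l.source for l in all_links if l.target == node]

print(impacted_by("SIM:MODEL-7", links))   # ['PLM:PART-1138', 'PLM:REQ-0042']
```

Once links are explicit, managed objects, a question like “what is impacted if this model changes?” becomes a simple traversal instead of archaeology in exchange files.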

While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.

I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.

As Nigel said, they are having a debate with one of their customers about whether to replace the existing PLM tools or to enhance them. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.

Integration is about more than Model Format.

This was the presentation I have been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, already pointed to the challenge of managing linked information. Mark elaborated further about the model-based approach for system definition.

All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark’s slide below says it all.

Mark stressed the importance of managing model information in context, and it has become a challenge.

Mark mentioned that 20 years ago, the IDC (International Data Corporation) measured Boeing’s performance and estimated that each employee spent 2½ hours per day searching for information. In 2018, the IDC estimated that this number had grown to 30 % of the employee’s time and could go up to 50 % when adding the effort of reusing and duplicating data.

The consequence of this would be that a full-service enterprise, having engineering, manufacturing and services connected, probably loses 70 % of its information because people cannot find it – an impressive number, calling for “clever” ways to find the correct information in context.

It is not just about a fully indexed search of the data, as some technology geeks might think. It is also about describing and standardizing the metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending up with the recommendation to use the ISO 10303-243 MoSSEC standard.

MoSSEC stands for Modelling and Simulation information in a collaborative Systems Engineering Context; the standard is used to manage and connect the relationships between models.

MoSSEC and its implications for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM vendors and tools will support and enable the standard for future interoperability and collaboration.

Additive Manufacturing
– not as simple as paper printing – yet

Andreas Graichen from Siemens Energy closed the day, coming back to the new technologies’ topic: Additive Manufacturing or in common language 3D Printing. Andreas shared their Additive Manufacturing experiences, matching the famous Gartner Hype Cycle. His image shows that real work needs to be done to understand the technology and its use cases after the first excitement of the hype is over.

Material knowledge was one of the important topics to study when applying additive manufacturing. It is probably a new area for most companies to understand the material behaviors and properties in an Additive Manufacturing process.

The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.

For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company’s management has realistic expectations and the budget available to explore this direction.

Conclusion

Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. The middle part, related to the data management concepts needed for a digital enterprise, contained the most exciting topics to follow up on, in my opinion.

Next week I will follow up with reviewing day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).

 

Last week I shared my first review of the PLM Roadmap / PDT Fall 2020 conference, organized by CIMdata and Eurostep. Having digested now most of the content in detail, I can state this was the best conference of 2020. In my first post, the topics I shared were mainly the consultant’s view of digital thread and digital twin concepts.

This time, I want to focus on the content presented by the various Aerospace & Defense working groups, who shared their findings and lessons learned (so far) on topics like the Multi-view BOM, Supply Chain Collaboration, and MBSE Data Interoperability.

These sessions were nicely wrapped with presentations from Alberto Ferrari (Raytheon), discussing the digital thread between PLM and Simulation Lifecycle Management and Jeff Plant (Boeing) sharing their Model-Based Engineering strategy.

I believe these insights are crucial, although there might be people in the field who question whether this research is essential. Isn’t there an easier way to achieve the same results?

Nicely formulated by Ilan Madjar as a comment to my first post:

Ilan makes a good point about simplifying the ideas for the masses to make them work. The majority of companies probably do not have the bandwidth to invest in and understand the future benefits of a digital thread or digital twins.

This does not mean that these topics should not be studied. If your business is in a small, simple ecosystem and wants to work in a connected mode, you can choose a vendor and a few custom interfaces.

However, suppose you work in a global industry with an extensive network of partners, suppliers, and customers.

In that case, you cannot rely on ad-hoc interfaces or a single vendor. You need to invest in standards; you need to study common best practices to drive methodology, standards, and vendors to align.

This process of standardization is crucial if you want a sustainable, connected enterprise. In the end, the push from these companies will lead to standards that allow the smaller companies to adhere or connect to them.

The future is about Connected through Standards, as discussed in part 1 and further in this post. Let’s go!

Global Collaboration – Defining a baseline for data exchange processes and standards

Katheryn Bell (Pratt & Whitney Canada) presented the progress of the A&D Global Collaboration workgroup. As you can see from the project timeline, they have reached the phase to look towards the future.

Katheryn mentioned the need to standardize terminology as the first point of attention. I am fully aligned with that point; without a standardized terminology framework, people will misunderstand each other in their communication.

This happens even more in smaller businesses, which sometimes just pick up (buzz) terms without fully understanding them.

Several years ago, I talked with a PLM implementer who told me that their implementation focus was on systems engineering. After some more explanation, it appeared that in reality they were attempting configuration management. Here the confusion was massive. Still, a standard, common terminology is crucial in our domain, even if it seems academic.

The group has been analyzing interoperability standards and standards for long-term archiving and retrieval (LOTAR), but has also been studying the ISO 44001 standard related to Collaborative business relationship management systems.

In the Q&A session, Katheryn explained that the biggest problem to solve with collaboration was the risk of working with the wrong version of data between disciplines and suppliers.

Of course, such errors can lead to huge costs if they are discovered late (or too late). As some of the big OEMs work with thousands of suppliers, you can imagine it is not an issue easily discovered in a more ad-hoc environment.

The move to a standardized Technical Data Package based on a Model-Based Definition is one of these initiatives in this domain to reduce these types of errors.

You can find the proceedings from the Global Collaboration working group here.

 

Connect, Trace, and Manage Lifecycle of Models, Simulation and Linked Data: Is That Easy?

I loved Alberto Ferrari‘s (Raytheon) presentation and how he described the value of a model-based digital thread, positioning it in a targeted enterprise.

Click on the image and discover how business objectives, processes and models go together supported by a federated infrastructure.

Alberto’s presentation was a kind of mind map of how I imagine the future, and it is a pity if you have not had the chance to see his session.

Alberto also focused on the importance of various simulation capabilities combined with simulation lifecycle management. For Alberto, they are essential to implement digital twins. Besides focusing on standards, Alberto pleads for semantic integration and an open service architecture, stressing the importance of DevSecOps.

Enough food for thought; as Alberto mentioned, he presented the corporate vision, not the current state.

More A&D Action Groups

There were two more interesting specialized sessions where teams from the A&D action groups provided a status update.

Brandon Sapp (Boeing) and Ian Parent (Pratt & Whitney) shared the activities and progress on Minimum Model-Based Definition (MBD) for Type Design Certification.

As Brandon mentioned, MBD is already a widely used capability; however, MBD is still maturing and evolving.  I believe that is also one of the reasons why MBD is not yet accepted in mainstream PLM. Smaller organizations will wait; however, can your company afford to wait?

More information about their progress can be found here.

Mark Williams (Boeing) reported the first findings of the A&D Model-Based Systems Engineering action group related to MBSE Data Interoperability, focusing on an Architecture Model Exchange Solution. This is an interesting topic to follow, as the promise of MBSE is that it is about connected information shared in models. As Mark explained, data exchange standards for requirements and behavior models are mature, readily available in the tools, and easily adopted. Exchanging architecture models has proven to be very difficult. I will not dive into more details, respecting the audience of this blog.

For those interested in their progress, more information can be found here

Model-Based Engineering @ Boeing

In this conference, the participation of Boeing was significant through the various action groups. As the cherry on the cake, there was Jeff Plant‘s session, giving an overview of what is happening at Boeing. Jeff is Boeing’s director of engineering practices, processes, and tools.

In his introduction, Jeff mentioned that Boeing has more than 160,000 employees in over 65 countries. They are working with more than 12,000 suppliers globally. These suppliers can be manufacturing, service or technology partnerships. Therefore, you can imagine – and as discussed by others during the conference – streamlined collaboration and traceability are crucial.

The now-famous MBE Diamond symbol illustrates the model-based information flows in the virtual world and the physical world based on the systems engineering approach. Like Katheryn Bell did in her session related to Global Collaboration, Jeff started explaining the importance of a common language and taxonomy needed if you want to standardize processes.

Zoom in on the Boeing MBE Taxonomy, and you will discover the clarity it brings to the company.

I was not aware of the ISO 23247 standard concerning the Digital Twin framework for manufacturing, aiming to apply industry standards to the model-based definition of products and process planning. A standard certainly to follow as it brings standardization on top of existing standards.

As Jeff noted: A practical standard for implementation in a company of any size. In my opinion, mandatory for a sustainable, connected infrastructure.

Jeff presented the slide below, showing their standardization internally around federated platforms.

This slide closely resembles the future platform vision I have been sharing since 2017 when discussing PLM’s future at PLM conferences, when explaining the differences between Coordinated and Connected – see also my presentation here on Slideshare.

You can zoom in on the picture to see the similarities. For me, the differences were interesting to observe. In Jeff’s diagram, the product lifecycle at the top indicates the platform of (central) interest during each lifecycle stage, suggesting a linear process again.

In reality, the flow of information through feedback loops will be there too.

The second exciting detail is that these federated architectures should be based on strong interoperability standards. Jeff is urging other companies, academics and vendors to invest in and arrive at industry standards for Model-Based Systems Engineering practices. The time to act in this domain is now.

It reminded me again of Marc Halpern’s message mentioned in my previous post (part 1) that we should be worried about vendor alliances offering an integrated end-to-end data flow based on their solutions. This would lead to an immense vendor lock-in if these interfaces are not based on strong industry standards.

Therefore, don’t watch from the sideline; it is the voice (and effort) of the companies that can drive standards.

Finally, during the Q&A part, Jeff made an interesting point explaining Boeing is making a serious investment, as you can see from their participation in all the action groups. They have made the long-term business case.

The team is confident that the business case for such an investment is firm and stable; however, with such a long-term investment without direct results, these projects might come under pressure when the business is under pressure.

The virtual fireside chat

The conference ended with a virtual fireside chat, from which I picked up an interesting point that Marc Halpern brought in. Marc mentioned a survey Gartner has done with companies in fast-moving industries related to the benefits of PLM. Companies reported improvements in accuracy and product development. They did not see so much of a reduced time to market or cost reduction. After analysis, Gartner believes the real issue is related to collaboration processes and supply chain practices. Here, neither the lead times nor the number of changes improved.

Marc believes that this topic will really show benefits in the future with cloud and connected suppliers. This reminded me of an article published by McKinsey called The case for digital reinvention. In this article, the authors indicated that only 2 % of the companies interviewed were investing in a digital supply chain, while the expected benefits in this area would have the most significant ROI.

The good news: there is consistency, and we know where to focus for early results.

Conclusion

It was a great conference as here we could see digital transformation in action (groups). Where vendor solutions often provide a sneak preview of the future, we saw people working on creating the right foundations based on standards. My appreciation goes to all the active members of the CIMdata A&D action groups as they provide the groundwork for all of us – sooner or later.

Last week I was happy to attend the PLM Roadmap / PDT Fall 2020 conference, as usual organized by CIMdata and Eurostep. I recently wrote about the PI DX conference, which touched a lot on the surface of PLM and Digital Transformation. This conference is really a conference for those who want to understand the building blocks needed for current and future PLM.

This conference usually has approximately 150 users on-site; this time, there were over 250 connected users for 3 (half) days, many of us following every session of the conference. As an active participant in the physical events, it was a little disappointing not to be in the same place with the other participants this time. The informal network meetings at this conference have always been special thanks to a relatively small but stable group of experts. Due to the slightly reduced schedule, there was less attention this time for some of the typical PDT topics, which most of the time come from Sweden and are related to sustainability.

The conference’s theme, Digital Thread—the PLM Professionals’ Path to Delivering Innovation, Efficiency, and Quality, might sound like a marketing statement. However, the content presented was much more detailed than just marketing info. The fact that you watched the presentations on your screen made it an intense conference with many valuable details.

Have a look at the agenda, and I will walk you through some of the highlights for me. As there was so much content to discuss, I will share this time part 1. Next week, in part 2, you will see the coherence of all the presentations.

As if there was a Coherent Thread.

Digital Twin, It Requires a Digital Thread

The keynote from Peter Bilello, President & CEO of CIMdata, with the title Digital Twin, It Requires a Digital Thread, was immediately an illustration of discussing reality. When I stated at the Digital Twin conference in the Netherlands that “Digital Twins do not run on Documents“, it had the same meaning as when Peter stated: “A Digital Twin without a Digital Thread is an orphan”.

Digital Thread

And Peter’s statement, “All companies do PLM, most of the time however disconnected”, is another way to stimulate companies working in a connected manner.

As usual, Peter’s session was a good overview of the various aspects related to the Digital Thread and Digital Twin.

Digital Twin

The concept of a virtual twin is not new. The focus, as mentioned before, is now more on the term “Connected”. Peter provided the CIMdata definitions for Digital Thread and Digital Twin. Click on the images to the left to read the full definitions.

Peter’s overview also referred to the Boeing Diamond, illustrating the mapping of the physical and virtual world, connected through a Digital Thread to the various Digital Twins that can exist. The Boeing Diamond was one of the favorites during the conference.

When you look at Peter’s conclusions, there is an alignment with what I wrote in the post A Digital Twin for Everyone and the fact that we need to strive for a connected enterprise. Only then can we benefit from a Digital Twin concept.

 

The Multi-view BOM Solution Evaluation
– Process, Results, and Industry Impacts

The reports coming from the various A&D PLM action groups are always engaging sessions to watch. Here, nine companies, even competitors, discuss and explore PLM themes between themselves supported by CIMdata.

These companies were the first to implement PLM; it is interesting to watch how they move forward like supertankers. They cannot jump from one fashionable hype to another from one year to the next. Their PLM infrastructure needs to be consistent and future-proof due to their data’s longevity and the high standards for regulatory compliance and safety.

However, these companies are also pioneers for the future. They have been practicing Model-Based approaches for over ten years already and are still learning. In next week’s post, you will read that these frontrunners are pushing for standards to make a Model-Based future affordable and achievable.

In that context, the action group Multi-View BOM shared their evaluation results for a study related to the multi-view BOM. A year ago, I wrote about this topic when Fred Feru from Airbus presented the intermediate results at the CIMdata Roadmap/PDT 2019 conference.

Dan Ganser (Gulfstream) and Javier Reines (Airbus) presented the findings. The conclusion was that the four vendors evaluated, i.e., Aras, Dassault Systèmes, PTC and Siemens, all passed the essential requirements and use cases. You can find the report and the findings here: Multi-view Bill of Materials

One interesting remark:

When the use cases were evaluated, the vendors could score on a scale from 0 to 5, see picture. It was interesting to see that apparently it was possible to exceed the requirement, something that seems like a contradiction.

In particular, in this industry, where formal requirements management is a must – either you meet a requirement or not.

Dan Ganser explained that the current use cases were defined based on the minimum expectations; therefore, there was the option to exceed a requirement. I would still be curious to see what it means to exceed a requirement. Is it usability, time, or something innovative we might have missed?

 

5G for Digital Twins & Shadows

I learned a lot from the presentation from Niels Koenig, working at the Fraunhofer Institute for Production Technology. Niels explained how important 5G is for realizing the Industry 4.0 targets. At the 5G Industry Campus, several projects are running to test and demonstrate the value of 5G in relation to manufacturing.

If you want to get an impression of the 5G Industry Campus – click on the YouTube movie.

One of the examples Niels discussed was closed-loop manufacturing. Thanks to the extremely low latency (< 1 ms), a connected NC machine can send real-time measurements to be compared with the expected values. For example, in the case of resonance, the cutting might not be smooth. Thanks to the closed loop, the operator will be able to intervene or adjust the operation. See the image below.
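As a rough illustration of such a closed loop (a minimal sketch with invented tolerance values, not the Fraunhofer implementation), the snippet below compares each streamed measurement against the expected value and asks for an adjustment when the deviation becomes too large, for example when resonance makes the cut rough:

```python
EXPECTED_DEPTH_MM = 2.50   # nominal cutting depth (illustrative value)
MAX_DEVIATION_MM = 0.05    # tolerance before the loop intervenes (illustrative)

def closed_loop_step(measured_depth_mm: float) -> str:
    """One iteration of the control loop: compare measurement vs. expectation.

    With sub-millisecond 5G latency the measurement arrives fast enough to
    adjust the running operation instead of scrapping the part afterwards.
    """
    deviation = measured_depth_mm - EXPECTED_DEPTH_MM
    if abs(deviation) <= MAX_DEVIATION_MM:
        return "continue"                        # cut is smooth, nothing to do
    # a large or oscillating deviation could indicate resonance / chatter
    return f"adjust feed rate (deviation {deviation:+.3f} mm)"

# Simulated stream of real-time measurements from the connected NC machine
for sample in [2.49, 2.51, 2.58, 2.42, 2.50]:
    print(sample, "->", closed_loop_step(sample))
```

The value of 5G here is purely the latency: the loop only works if the comparison and the correction happen while the operation is still running.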

Digital Thread: Be Careful What you Wish For, It Just Might Come True

I was looking forward to Marc Halpern‘s presentation. Marc often brings a less technical, more business-related viewpoint to the discussion. Over the past ten years, there have been many disruptive events, most recently the COVID pandemic.

Companies are asking themselves how they can remain resilient. Marc shared some of his thoughts on how Digital Twins and Digital Threads can support resilience.

In that context, Gartner saw a trend that their customers are now eagerly looking for solutions related to Digital Twin, Digital Thread, and Model-Based approaches, combined with the aim to move to the cloud. Related to Digital Thread and Digital Twin, most of Gartner’s clients are looking for traceability and transparency along the product lifecycle. Most Digital Twin initiatives focus on a twin of operational assets, particularly inside the manufacturing facility – nicely linking to Niels Koenig’s session related to 5G.

Marc stated that there seems to be a consensus that a Digital Thread is compelling enough for manufacturers to invest in. In the end, they will have to. However, there are also significant risks involved. Marc illustrated the two extremes; in reality, companies will end up somewhere in the middle, as illustrated later by Jeff Plant from Boeing. The image on the left is a sneak preview for next week.

When discussing the Digital Thread, Marc again referred to it more as a Digital Net, a kind of connected infrastructure for various different threads based on the various areas of interest.

I show here a slide from Marc’s presentation at the PDT conference in 2018. It is more an artist’s impression of the same concept discussed again during this conference, the Boeing Diamond.

Related to the risk of implementing a Digital Thread and Digital Twin, Marc showed another artistic interpretation: the two extremes of two potential end states of Digital Thread investment. Marc shared the critical risks for both options.

For the Vendor Black Hole, his main points were that if you choose a combined solution, you face diminished negotiating power and higher implementation costs, and potentially innovative ideas might not be implemented as they are not so relevant for the vendor. They have the power!

As examples of combined solutions, Marc mentioned the recently announced SAP-Siemens partnership, the Rockwell Automation-PTC partnership, the Schneider Electric-Aveva partnership, and the ABB-Dassault Systèmes partnership.

Once you are in the black hole, you cannot escape. Therefore, Marc recommended making sure you do not depend on a few vendors for your Digital Twin infrastructure.

The picture on the left illustrates the critical risks of the Enterprise Architecture “Mess”. It is a topic that I have been following for a long time. Suppose you have a collection of services related to the product lifecycle, like workflow services, 3D modeling services, BOM services, and manufacturing services.

Together they could provide a PLM-infrastructure.

The idea behind this is that, thanks to openness and connectivity, every company can build its own unique enterprise architecture. No discussion about standard best practices. You build your company’s own best practices (for the future, or for the current situation?).

It is mainly promoted as a kind of bottom-up PLM. If you are missing capabilities, just build them yourselves, using REST services, APIs, and Low-Code platforms. It seems attractive for smaller enterprises; however, most of the time, only for a short time. I fully concur with the risks Marc identified here.

As I often illustrated in presentations related to a digital future, you will need a mix of both. Based on your point of focus, you could imagine five major platforms being connected together to cover all aspects of a business. Depending on your company’s business model and products, one of them might be the dominant one. With my PLM-focus, this would be the Product Innovation Platform, where the business is created.

Marc ended with five priorities to enable a long-term Digital Thread success.

  • First of all – set the ground rules for data governance. A topic often mentioned, but is your company actively engaging in that already?
  • Next, learn from Model-Based Systems Engineering as a foundation for a Model-Based Enterprise. A topic often discussed during the previous CIMdata Roadmap / PDT conference.
  • The change from storing and hiding information in silos towards an infrastructure and mindset of search and access of data, in particular access to the Bill of Materials

The last point induced two more points.

  • The need for an open architecture and standards. We would learn more on this topic on day 3 of the conference.
  • Make sure your digital transformation sticks within the organization by investing and executing on organizational change management.

Conclusion

The words “Digital Thread” and “Digital Twin” are mentioned 18 times in this post and during the conference even more. However, at this conference, they were not hollow marketing terms. They are part of a dictionary for the future, as we will see in next week’s post when discussing some of the remaining presentations.

Closing this time with a point we all agreed upon: “A Digital Twin without a Digital Thread is an orphan”. Next week more!

This post is based on a mix of interactions I had over the last two weeks in my network, mainly on LinkedIn. First, I enjoyed the discussion that started around Yoann Maingon’s post: Thoughts about PLM Business models. Yoann is quite seasoned in PLM, as you can see from his LinkedIn profile, and we have had interesting discussions in the past, and recently about a new PLM system he is developing, Ganister PLM, based on a flexible graph database.

Perhaps in that context, Yoann was exploring the various business models. Do you pay for the software (and maintenance), do you pay through a subscription, and what about a modular approach or a full license for all the functionality? All these questions made me think about the various business models that I have encountered and how hard it is for a customer to choose the optimal solution. And is there space for a new type of PLM? Is there space for free PLM? Some of my thoughts here:

PLM vendors need to be profitable

One of the most essential points to consider is that, whatever PLM solution you are aiming to buy, you should make sure that your PLM vendor has a profitable business model. Once you have started with a PLM solution, it is your company’s IP that will be stored in this environment, and you do not want to change your PLM system every few years. Switching PLM systems would be affordable if PLM systems stored their data in a standard format – I will share a more in-depth link under PLM and standards.

For the moment, you cannot state that PLM vendors endorse standards. None of the real PLM vendors have a standardized data model; perhaps closest to standards is Eurostep, which has based its ShareAspace solution on top of the PLCS (ISO 10303) standard. However, ShareAspace is more positioned as a type of middleware, connecting OEMs/Owners/Operators and their suppliers to benefit from standardized connectivity.

Coming back to the statement: PLM vendors need to be profitable – providing a guarantee for the future of your company’s data is the first step. The major PLM vendors are now profitable, as during a consolidation phase starting 15 years ago, a lot of non-profitable PLM vendors disappeared. MatrixOne, Agile, and Eigner & Partner PLM are the best-known companies that were bought for either their technology or their market share. In that context, you might also look at OnShape.

Would they be profitable as a separate company, or would investors give up? To survive, you need to be profitable, so giving software away for free is not a good sign (see the PLM for free paragraph), as a company needs continuity.

PLM startups

In the past 10 years, I have seen and evaluated several new PLM companies. None of them really changed the PLM paradigm; most of them were still focusing on being engineering collaboration tools. Several of these companies have in their visionary statement that they are going to be the “Excel killer.” We all know Excel has the best user interface and capabilities to manipulate a collection of metadata.

Very popular is the BOM in Excel, extracted from the CAD-system (no need for an “expensive” PDM or PLM) or BOM used to share with suppliers and stakeholders (ERP is too rigid, purchasing does not work with PDM).

The challenge I see here is that these startups do not bring real new value. The cost of manipulating Excel files is a hidden cost, and companies relying on Excel communication are the type of companies that do not have a strategic point of view. This is typical for small and medium businesses, where execution (“let’s do it”) gets all the attention.

PLM startups often collect investors’ money because they promise to kill Excel, but is Excel the real problem? Modern PLM is about data sharing, which is an attitude change, not necessarily a technology change from Excel tables to (cloud) shared tables. However, will one of these “new Excel killer” PLMs be disruptive? I don’t think so.

PLM disruption?

A week ago, I read an interview with Clayton Christensen (thanks Hakan Karden), which I shared on LinkedIn a week ago. Clayton Christensen is the father of the Disruptive Innovation theory, and I have cited him several times in my blogs. His theory is, in my opinion, fundamental to understand how traditional businesses can be disrupted. The interview took place shortly before he died at the age of 67. He died due to complications caused by leukemia.

A favorite part of this interview is where he restates what Disruptive Innovation really is, as we often talk about disruption without understanding the context, just echoing other people:

Christensen: Disruptive innovation describes a process by which a product or service powered by a technology enabler initially takes root in simple applications at the low end of a market — typically by being less expensive and more accessible — and then relentlessly moves upmarket, eventually displacing established competitors. Disruptive innovations are not breakthrough innovations or “ambitious upstarts” that dramatically alter how business is done but, rather, consist of products and services that are simple, accessible, and affordable. These products and services often appear modest at their outset but over time have the potential to transform an industry.

Many of the PLM startups dream of and position themselves as the new disruptor. Will they succeed? I do not believe so if they only focus on replacing Excel; a different paradigm is needed. Voice control and analysis, perhaps (“Hey PLM, if I change Part XYZ, what will be affected?”)?

This would be disruptive and open new options. I think PLM startups should focus here if they want my investment money.

PLM for free?

There are some voices that PLM should be free, in an analogy to software management and collaboration tools. There are so many open-source software management tools, so why not use them for PLM? I think there are two issues here:

  • PLM data is not like software data. A lot of PLM data is based on design models (3D CAD / Simulation), which is different from software. Designs are often not as modular as software, for various reasons. Companies want to be modular in their products, but do they have the time and resources to reinvent their existing products? For software, these costs are much lower as it is only a brain exercise. For hardware, the impact is significant, bringing me to the second point.
  • The cost of change for hardware is entirely different compared to software. Changing software does not have an impact on existing stock or suppliers and, therefore, can be implemented once tested for its purpose. A hardware change impacts the existing production process. Do we first use the old parts before introducing the change, or do we accept the cost of scrap? Is our supply chain, or are our production tools, ready to deliver continuity for the new version? Hardware changes are costly, and you want to avoid them. Software changes are cheap; therefore, design your products to be configurable based on software (for example, Tesla’s software controlling the features to be allowed).

Now imagine, with enough funding, you could provide a PLM for free.  Because of ease of deployment, this would be very likely a cloud offering, easy and scalable. However, all your IP is in that cloud too, and let’s imagine that the cloud is safer than on-premise, so it does not matter in which country your data is hosted (does it ?).

Next, after five years, the “free” PLM provider starts asking for a small service fee, as the promised ROI on the model hasn’t delivered enough value and the shareholders become anxious. Of course, you do not like to pay the fee. However, where is your data, and what happens when you do not pay?

If the PLM provider switches you off, you are without your IP. If you ask the PLM provider to hand over your data, what will you get? A blob of XML files – anything you can use?

In general, this is a challenge for all cloud solutions.

  • What if you want to stop your subscription?
  • What is the allowed Exit-strategy?

Here I believe customers should ask for clarity, and perhaps these questions will lead to a renewed understanding that we need standards.

PLM and standards

We had a vivid discussion in the blogging community in September last year. You can read more related to this topic in my post PLM and the need for standards, which describes the aspects of lock-in and the need for openness.

Finally, a remark related to the PLM acronym. Another interesting discussion started around Joe Barkai’s post: Why I do not do PLM. Read the comments and the various viewpoints on PLM there. It is clear that the word PLM unites us all; however, the interpretation is different.

If someone in the street asks me what my profession is, I never mention that I do PLM. I say: “I assist mainly manufacturing companies in redesigning their business processes using best practices and modern digital technologies”. The focus is on the business value, not on the ultimate definition of PLM.

Conclusion

There are many business aspects related to PLM to consider. Yoann Maingon’s post started the thinking process, and we ended up with the PLM-definition. It all illustrates that being involved in PLM is never a boring journey. I am curious to learn about your journey and where we meet.

It is the holiday season; many groups and religions have their celebrations in this period, mostly due to the return of the light in the Northern Hemisphere – Christmas, Diwali, Hanukkah, Kwanzaa and Santa Lucia are a few of them. Combined with a new decade coming up, it is also a time for me to reflect on what we have learned and what we can imagine in the next ten years.

Looking back the last decade

Globalization is probably one of the most significant changes we have seen. Almost the whole world is connected now through all kinds of social and digital media. Information is instantly available and influences our behavior dramatically. The amount of information coming to us is so huge that we only filter the information that touches us emotionally. The disadvantage of this is that opinions become facts and different opinions become enemies. A colorful society becomes more black and white.

These trends have not reached the domain of PLM in the same manner. There have been several startups in the past ten years, explaining that PLM should be as easy as social communication with Facebook. Most of these startups focused on integrations with MCAD systems, as if PLM were only about developing mechanical products.

In my opinion, the past decade has shown that mechanical design is no longer a unique part of product development. Products have become systems, full of electronics and driven by software. Systems Engineering became more and more important, defining the overall concept of a product, moving the focus to the earlier stages of the product design – see image below.

During Ideation and system definition, we need “social” collaboration between all stakeholders. To get a grip on this collaboration, we use models (SysML, UML, Logical, 2D, 3D) to have an unambiguous representation, simulation and validation of the product already in the virtual world.

Actually, there is nothing social about that collaboration. It is a company-driven demand to deliver competitive products to remain in business. If our company does not improve multidisciplinary collaboration, global competitors with fewer nostalgic thoughts about the past will conquer the market space.

Currently, in most companies, there are two worlds, hardware and software, almost 100 % separated, managed by either PLM or ALM (Application Lifecycle Management). It is clear that “old” PLM – item-driven with related documents – cannot match the approach required for ALM – data elements (code) based on software models (and modules).

In the hardware world, a change can have a huge effect on the cost (or waste) of the product; however, implementing a hardware change can take months – think about new machinery or tooling in the worst case. In the software domain, a change can be executed almost immediately. However, here testing the impact of the change can have serious effects. The software fix for the Boeing 737 MAX has not yet been proven and delivered.

Therefore, I would like to conclude that in the past decade we learned in the PLM-domain to work in a Coordinated manner – leaving silos mostly in place.

Next week I will look forward to our challenging upcoming decade. Topics on my list for the next decade are:

  • From Coordinated to Connected – Generation Change
  • Sustainability of our Earth (and how PLM can help)
  • Understanding our human behavior to understand how to explain PLM to your execs – PI PLMx London 2020
  • All combined with restoring trust in science

 

I wish you all a happy and healthy New Year – take time to listen and learn as we need dialogue for the future, not opinions.

See you in 2020

 

 

For me, the joint conference from CIMdata and Eurostep is always a conference to look forward to. The conference is not as massive as PLM vendor conferences (slick presentations and happy faces); it is more a collection of PLM practitioners (this time 100+) with the intent to discuss and share their understanding and challenges, independent from specific vendor capabilities or features. And because of its size, it is a great place to network with everyone.

Day 1 offered more of a business/methodology view on PLM, and Day 2 went more in-depth, focusing on standards and BIM. In this post, the highlights from the first day.

The State of PLM

 

 

Peter Bilello, CIMdata’s president, kicked off with a review of the current state of the PLM industry. Peter mentioned the PLM market grew by 9.4 % to $47.8 billion (more than the expected 7 %). Good for the PLM vendors and implementers.

However, Peter also mentioned that despite higher spending, PLM is still considered a solution for engineering, often implemented as PDM/CAD data management. Traditional organizational structures – marketing, engineering, manufacturing, quality – were defined in the previous century and are still measured as such.

This traditional approach blocks the roll-out of PLM across these disciplines. Who is the owner of PLM, and where does the responsibility for a certain dataset lie? These are questions to solve. PLM needs to transform to deliver end-to-end support instead of remaining the engineering silo. Are we still talking about PLM in the future? See Peter’s takeaways below:

 

 

We do not want to open the discussion about whether the name PLM should change – too many debates already – although, unfortunately, there has also been too much framing in the past.

The Multi View BOM

 

 

Fred Feru from Airbus presented the status of a topic the Aerospace & Defense PLM action group is working on: how to improve and standardize a PLM solution for multi-view BOM management, in particular the interaction between the EBOM and MBOM. See below:

 

You might think this is a topic already solved when you speak with your PLM vendor. However, all existing solutions at the participants’ implementations rely on customizations and vary per company. The target is to come up with common requirements that need to be addressed in a standard methodology. Initial alignment on terminology was already a required first step, as before you standardize, you need to have a common dictionary. Moreover, this is a typical situation in EVERY PLM implementation.

 

 

An initial version was shared with the PLM editors for feedback, and after iterations and agreement, the aim is to come up with a solution that can be implemented without customization. If you are interested in the details, you can read the current status here, with Appendix A and Appendix B.

 

Enabling the Circular Economy for Long Term Prosperity

Graham Aid gave a fascinating presentation related to the potential and flaws of creating a circular economy. Graham was not a PLM expert (at least not until he left this conference); he is the Strategy and Innovation Coordinator for the Ragn-Sells Group, which performs environmental services and recycling across Sweden, Norway, Denmark, and Estonia. Have a look at their website here.

 

 

Graham shared with us the fact that despite the logical arguments for a circular economy – it is more profitable in the end – our short-term thinking and biases block us from doing the right things for future generations.

Look at the missing link for a closed resource-lifecycle view below.

Graham shared strange examples where materials that will be scarce in the future are currently getting cheaper, and therefore there is no desire to recycle them. A sound barrier made of rubble could contain more copper than the copper ore in a mine.

In the PLM domain, there is also an opportunity to support and work on more sustainable products and services. It is a mindset and can be a profitable business model. At the PDT 2014 conference, there was a session on circular product development, with Xerox as the best example. Circular product development, but also Product as a Service, can be activities that contribute to a more sustainable world. Graham’s presentation was inspiring for our PLM community and hopefully planted a few seeds for the future, as it is all about thinking long-term.

 

 

With the PLM Green Alliance, I hope we will be able to create a larger audience and participation for a sustainable future. More about the PLM Green Alliance next week.

 

The Fundamental Role of PLM in Data-driven Product Portfolio Management

 

 

Hannu Hannila (Polar) presented his study related to data-driven product portfolio management and why it should be connected to PLM. For many companies, it is a challenge to understand which products are performing well and where to invest. These choices are often supported by “Data Damagement”, as Hannu called it.

An example below:

The result of this fragmented approach is that organizations base their decisions on subjective data and emotions. Where the common assumption is that 20 % of the products a company sells generate 80 % of the revenue, Hannu found companies in his research where only 10 % of the products were actually contributing to the revenue. PPM (Product Portfolio Management) is often driven by big emotions: a who-shouts-the-loudest mentality, influenced by the company’s pet products and by the HiPPO (Highest Paid Person in the Office). So how do we get to a better rationale?

 

 

Hannu explained a data-driven framework that would provide the right analytics at management level, depending on overall data governance across all disciplines and systems. See below:

I liked Hannu’s conclusions as they align with my findings:

  • To be data-driven, you need Master Data Management and Data Governance
  • Product Portfolio Management is the driving discipline for PLM, and in a modern digital enterprise, it should be connected.

Sponsor sessions

Sponsors are always needed to keep a conference affordable for the attendees. The sponsor sessions on day 1 were of good quality. Here is a quick overview and a link if you want to investigate further:

 

 

Configit – explaining the value of a configurator that connects marketing, technical and sales, introducing CLM (Configuration Lifecycle Management) – a new TLA

 

 

Aras – explaining their view on what we consider the digital thread

 

 

Variantum – explaining their CPQ solution as part of a larger suite of cloud offerings

 

 

Quick Release – bringing common sense to PLM implementations, similar to what I am doing as PLM coach – focusing on the flow of information

 

 

SAP – explaining the change in focus when a company moves toward a product as a service model

 

 

SharePLM – A unique company addressing the importance of PLM training delivered through eLearning

Conclusion

The first day was an easy-to-digest conference with good-quality presentations. I only shared 50 % of the sessions, as we have already reached 1000+ words. In the evening I enjoyed the joint dinner, networking and discussing in depth with participants, and finished with a social networking event organized by SharePLM. Next week, part 2.

The usage of standards has been a recurring topic over the past 10 months; it probably came back to the surface at PI PLMx Chicago during the PLM Leaders panel discussion. If you want to refresh the debate, Oleg Shilovitsky posted an overview: What vendors are thinking about PLM standards – Aras, Dassault Systemes, Onshape, Oracle PLM, Propel PLM, SAP, Siemens PLM.

It is clear that when vendors actively support standards, they reduce their competitive advantage; after all, you are opening your systems to connect to other vendors’ solutions, reducing the chance to sell adjacent functionality. We call this vendor lock-in. If you think this approach only applies to PLM, I suggest you pick up your Apple iPhone and think about vendor lock-in for a moment.

Vendors will only adhere to standards when pushed by their customers, and that is why we have a wide variety of standards in the engineering domain.

Take the example of JT as a standard viewing format, heavily pushed by Siemens for the German automotive industry to be able to work downstream with CATIA and NX models. There was a JT version (v9.5) that reached ISO 14306 alignment, but after that, Siemens changed JT (v10) again to optimize their own exchange scenarios, and the standard was lost.

And as customers did not complain (too much), the divergence continued. So it is clear vendors will not maintain standards out of charity, just as your business does not work for charity either (or does it?). I do not blame them if there is no push from their customers to maintain the standards.

What about standards?

The discussion related to standards flared up around the IpX ConX19 conference and a debate between Oleg and Hakan Kardan (EuroSTEP), where Hakan suggested that PLCS could be a standard data model for the digital thread. You can read Oleg’s view here: Do we need a standard like PLCS to build a digital thread.

Oleg’s opening sentence made me stop reading immediately, as more and more I am getting tired of this type of statement when the aim should be a serious discussion based on arguments. Such a statement is called framing, and in politics in particular we see the bad examples of it.

Standards are like toothbrushes, a good idea, but no one wants to use anyone else’s. The history of engineering and manufacturing software is full of stories about standards.

This opening sentence says it all about the mindset related to standards: it is a one-liner, not a fact. It could have been a tweet in this society of experts.

Still, later I read the blog post and learned that Oleg has no arguments to depreciate PLCS; however, as he does not know the details, he will probably not use it. This is the main challenge of standards: you need to spend time to understand them, agree on following them, and adhere to them. Otherwise, you get the same divergence as with JT, or similar examples.

However, I might have been wrong in my conclusion, as Oleg did some thinking on a Sunday and came up with an excellent post: What would happen if PLM Vendors agree about data standards. Here Oleg makes the comparison with a standard in the digital world, established by Google, Microsoft, Yahoo, and Yandex: Schema.org: Evolution of Structured Data on the Web.
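To make the schema.org comparison tangible, here is a small, hand-written example of such structured data (JSON-LD) for a product, expressed in Python. The property names come from the public schema.org vocabulary; the values are of course fictitious.

```python
import json

# schema.org "Product" markup as search engines consume it (JSON-LD).
# Property names follow the public schema.org vocabulary; values are fictitious.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Centrifugal Pump X100",
    "sku": "X100-2019",
    "brand": {"@type": "Brand", "name": "ExampleCorp"},
    "offers": {
        "@type": "Offer",
        "priceCurrency": "EUR",
        "price": "1299.00",
        "availability": "https://schema.org/InStock",
    },
}

# Because Google, Microsoft, Yahoo and Yandex agreed on this shared vocabulary,
# any site embedding such a snippet is understood the same way by all of them.
print(json.dumps(product_markup, indent=2))
```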

There is a need for semantic mapping and understanding in the day-to-day world, and this understanding makes you realize the same is needed for PLM. That was one of the reasons why, back in 2015, I wrote a series of posts related to the importance of a PLM data model.

All these posts were aimed at helping companies and implementers make the right choices for an item-centric PLM implementation. At that time (2015), item-centric was the current PLM best practice. From my engagements over the past 15 years I learned that, in particular when you have a flexible modeling tool like SmarTeam or nowadays Aras, making the right data model decisions is crucial for future growth.

Who needs standards?

First of all, as long as you stay in your controlled environment, you do not need standards. In particular, in the Aerospace and Automotive industries, the OEMs defined the software versions to be used, and the supply chain had to adhere to their chosen formats. Even this narrow definition was not complete enough, as a 3D CAD model needed to be exported for simulation or manufacturing purposes, and no single vendor covered all these needs from a single CAD model definition at that time. So the need for standards emerged because there was a need to exchange data.

Data exchange is the driving force behind standards.

In a second stage, neutral-format data storage also became important: how do you store an aircraft definition for 75 years?

Oil & Gas / Building – Construction

Both of these industries needed standards. The Oil & Gas industry relies on EPC (Engineering / Procurement / Construction) companies that build plants or platforms. Then the owner/operator takes over the operation and needs a handover of all the relevant information. However, if this information were delivered in the application-specific formats the EPC companies have used, the owner/operator would require various software environments and skills just to have access to the data.

Therefore, if the data is delivered in a standard format (ISO 15926) and the exchange follows CFIHOS (Capital Facilities Information Hand Over Specification), the exchange between the EPC and owner/operator environments can be automated to a large degree, leading to a lower overall cost of delivering and maintaining the information, combined with higher quality. For that reason, the Oil & Gas industry has been investing in standards for a long time, as their plants/platforms have a long lifecycle.
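As a purely hypothetical sketch of what such a neutral handover means in practice: every EPC maps its tool-specific attributes once to the agreed neutral dictionary, and the owner/operator only needs to understand that one dictionary. The attribute names below are invented and are not the actual ISO 15926 / CFIHOS definitions.

```python
# Hypothetical illustration of a neutral handover: each EPC exports its
# tool-specific attributes, mapped once to an agreed neutral dictionary.
# Attribute names are invented; they are NOT the real ISO 15926 / CFIHOS terms.

NEUTRAL_MAP = {
    "epc_a": {"TAG_NO": "tag", "DES_PRESS_BARG": "design_pressure_barg"},
    "epc_b": {"TagNumber": "tag", "DesignPressure": "design_pressure_barg"},
}

def to_neutral(record: dict, source: str) -> dict:
    """Translate one equipment record from an EPC-specific format
    into the neutral dictionary the owner/operator maintains."""
    mapping = NEUTRAL_MAP[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# Two EPCs, two native formats, one neutral result for the owner/operator:
print(to_neutral({"TAG_NO": "P-1001", "DES_PRESS_BARG": 16}, "epc_a"))
print(to_neutral({"TagNumber": "P-1002", "DesignPressure": 10}, "epc_b"))
```

The point is not the code itself but the economics: the mapping effort is made once per source, instead of every receiver building point-to-point converters for every delivery.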

And the same is happening in the construction industry. Initially, Autodesk and Bentley were fighting to become the vendor standard; ultimately, the IFC standard has taken a lot from the Autodesk world but has become a neutral standard for all parties involved in a construction project to share and exchange data. In particular for the construction industry, the cloud has been an accelerator for collaboration.

So standards are needed where companies and people exchange information.

For the same reason, in most global companies English became the standard language. If you had to learn all the languages spoken in a worldwide organization, you would not have time for business. Therefore, everyone making some effort to communicate in one standard language is the best way to operate.

And it is the same for a future data-driven environment: we cannot afford to convert every exchange to the native format of the receiver or the source. Common neutral (or winning) standards will ultimately also emerge in the world of manufacturing data exchange and IoT.

Companies need to push

This is probably the blocking issue for standards. Developing and using standards requires an effort without immediate ROI. So why not use vendor formats/models and create custom point-to-point interfaces, as we only need one or two interfaces? Companies delivering products with a long lifecycle know that current data formats are not guaranteed for the future, so they push for standards (aerospace, defense, oil & gas, construction, infrastructure).

3D PDF Model

Other companies are looking for short-term results, and standards slow them down. However, as soon as they need to exchange data with their ecosystem (suppliers/customers), an existing standard will make their business more scalable. The lack of standards is one of the inhibitors for Model-Based Definition or the Model-Based Enterprise; see also my post on this topic: Model-Based – Connecting Engineering and Manufacturing.

When we imagine the Digital Enterprise of the future, information will be connected through data streams and models. In a digital enterprise, file conversions and proprietary formats impede the flow of data and create non-value-added work. For example, if we look at current “Digital Twin” concepts, the 3D representation of the twin is recreated instead of relying on neutral 3D model continuity. This is because companies currently work in a coordinated manner. Perhaps 10 years from now we will reach the maturity of a model-based enterprise, which can only exist based on standards. Whether these standards will be based on one dominating platform or on a merger of standards remains the question.

To discuss this question and how to bridge from the past to the future, I look forward to meeting you at the upcoming PLM Roadmap & PDT 2019 EMEA conference on 13-14 November in Paris, France. Download the program here: PLM for Professionals – Product Lifecycle Innovation.

Conclusion

I believe PLM standards will emerge when building and optimizing a digital enterprise. We need to keep pushing and actively working for meaningful standards, as they are crucial to avoid a lock-in of your data, which potentially creates dead ends and massive inefficiencies. The future is about connected ecosystems, and the leanest companies will survive. Standards do not need to be extraordinarily well-defined and can start from a high-level alignment, as we saw with schema.org. Keep investing and contributing to standards and the related discussion to create a shared learning path.

Thanks, Oleg Shilovitsky, for keeping the topic alive.

p.s. I did not have time yet to read and process your PLM Data Commoditization post.

 

Image: waitbutwhy.com

Two weeks ago I wrote about the simplification discussion around PLM: Why PLM never will be simple. There I focused on the fact that even sharing information in a consistent, future-proof way of working is already challenging, despite easy-to-use communication tools like email or social communities.

I mentioned that sharing PLM data is even more challenging due to its potential revision, version, status, and context. This brings us to the topic of configuration management, needed to manage the consistency of information, a challenge with increasingly sophisticated products and systems. Simple tools will never fix this complexity.

To manage the consistency of a product,  configuration management (CM) is required. Two weeks ago I read the following interesting post from CMstat: A Brief History of Configuration Management Software.

An excellent introduction if you want to know more about the roots of CM, although towards the end the post starts to list all the disadvantages and reasons why you should not try to do CM using PLM systems.

The following part amused me:

 The Reality of Enterprise PLM

It is no secret that PLM solutions were often sold based in good part on their promise to provide full-lifecycle change control and systems-level configuration management across all functions of the enterprise for the OEM as well as their supply and service chain partners. The appeal of this sales stick was financial; the cost and liability to the corporation from product failures or disasters due to a lack of effective change control was already a chief concern of the executive suite. The sales carrot was the imaginary ROI projected once full-lifecycle, system-level configuration control was in effect for the OEM and supply chain.

Less widely known is that for many PLM deployments, millions of budget dollars and months of calendar time were exhausted before reaching the point in the deployment road map where CM could be implemented. It was not uncommon that before the CM stage gate was reached in the schedule, customer requirements, budget allocations, management priorities, or executive sponsors would change. Or if not these disruptions within the customer’s organization, then the PLM solution provider, their software products or system integrators had been changed, acquired, merged, replaced, or obsoleted. Worse yet for users who just had a job to do was when solutions were “reimagined” halfway through a deployment with the promise (or threat) of “transforming” their workflow processes.

Many project managers were silently thankful for all this as it avoided anyone being blamed for enterprise PLM deployment failures that were over budget, over schedule, overweight, and woefully underwhelming. Regrettably, users once again had to settle for basic change control instead of comprehensive configuration management.

I believe the CMstat writer is generalizing too much and preaching for their own parish. Although my focus lies on PLM, I have also learned the importance of CM, and for that reason I will share a view on CM from the PLM side:

Configuration Management is not a target for every company

The origins of Configuration Management come from the Aerospace and Defense (A&D) industries. These industries have high quality, reliability and traceability constraints. In simple words, you need to prove that your product works correctly as specified in all described circumstances, and you need to keep this consistent along the lifecycle of the product.

Moreover, imagine you delivered the perfect product; implementing changes next requires a full understanding of the impact of each change. What is the impact of the change on behavior or performance? In A&D, the question is: is it still safe and reliable?

Somehow PLM and CM are enemies. The main reason why PLM systems are used is Time to Market: bringing a product to the market as fast as possible with acceptable quality. Being first is sometimes more important than high quality. CM is considered a process that slows down Time to Market, as managing consistency and continuously validating take time and effort.

Configuration Management in aviation is crucial, as everyone understands that you cannot afford to discover a severe problem during a flight. All the required verification and validation efforts make CM a costly process along the product lifecycle. Airplane parts are 2 to 3 times more expensive than potentially the same parts used in other industries. The main reason: airplane parts are tested and validated for all expected conditions along their lifecycle. Other industries do not spend so much time on validation; they validate only where issues can hurt the company, either in liability or in costs.

Time to Market impacts even the aviation industry, as we can see from the commercial aircraft battle(s) between Boeing and Airbus. Who delivers the best airplane (size/performance) at the right moment in the global economy? The Airbus A380 seems to have missed its future targets (too big, not flexible enough). The Boeing 737 MAX appears to target a market sweet spot (fuel economy); however, the recent tragic accidents with this plane seem to have been caused by Time to Market pressure to certify the aircraft too early. Or is the complexity of a modern airplane unmanageable?

CM based on PLM-systems

Most companies had their configuration management practices long before they started to implement PLM. These practices were usually documented in procedures, leading to all kinds of coding systems for these documents. Drawing numbers (the specification of a part/product), specifications, and parts lists all had a meaningful identifier combined with a version/revision and status. For example, the Philips 12NC coding system is famous in the Netherlands and is still used among Philips spin-offs and their suppliers, as it offers a consistent framework to manage configurations.
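As a generic illustration (deliberately not the actual 12NC rules), here is a minimal sketch of what such a coding practice captures: a meaningful identifier combined with a revision and a controlled lifecycle status.

```python
from dataclasses import dataclass

# Generic sketch of a CM-style identifier: a meaningful number plus revision
# and lifecycle status. This is NOT the actual Philips 12NC scheme.

ALLOWED_TRANSITIONS = {
    "In Work": {"Under Review"},
    "Under Review": {"Released", "In Work"},
    "Released": {"Obsolete"},
}

@dataclass
class DocumentVersion:
    number: str          # meaningful identifier, e.g. a drawing number
    revision: str        # e.g. "A", "B", "C"
    status: str = "In Work"

    def change_status(self, new_status: str) -> None:
        # Enforce the agreed lifecycle, just as the written CM procedures did.
        if new_status not in ALLOWED_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"{self.status} -> {new_status} is not allowed")
        self.status = new_status

doc = DocumentVersion(number="DRW-4712-001", revision="B")
doc.change_status("Under Review")
doc.change_status("Released")
print(doc)
```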

Storing these documents in a PDM/PLM system to provide centralized access was not a big problem; however, companies also expected the PLM system to provide automation and functionality supporting their configuration management procedures.

A challenge for many implementers for several reasons:

  • PLM systems do not offer a standard way of working (if they did, they could only serve a small niche market), so they need to be configured/customized.
  • Company configuration management rules sometimes cannot be mapped to the provided PLM data model and its internal business logic. This has led to costly customizations where, in the best case, implementer and company agreed somewhere in the middle. In the worst case, as the writer of the CM blog mentions, it becomes an expensive, painful project.
  • Companies do not have a consistent configuration management framework, as Time to Market is leading. “We will fix CM later” is the idea, and they let their PLM implementer configure the PLM system as well as possible. Still, at the management level, the value of CM is not recognized.
    (see also: PLM-CM-ALM – not sexy?)

In the companies I worked with, those who were interested in a standardized configuration management approach were trained in CMII. CMII (or CM2) is a framework supported by most PLM systems, sometimes even as a pre-configured template to speed up the implementation. Still, as PLM systems serve multiple industries, I would not expect any generic PLM vendor to offer Commercial Off-The-Shelf (COTS) CM capabilities; there are too many legacy approaches. You can find a good and more in-depth article related to CMII here: Towards Integrated Configuration Change Management (CMII) from Lionel Grealou.

 

What’s next?

Current configuration management practices are very much based on the concept of managing documents. However, products are more and more described in a data-driven, model-based approach. You can find all the reasons why we are moving to a model-based approach in last year’s blog post. It is important to realize that current CM practices in PLM were designed with mechanical products and their lifecycles as a base. With the combination of hardware and software, integrated yet with different lifecycles, CM has to be reconsidered with a new, holistic concept. The Institute of Process Excellence provides CM2 training but is also active in developing concepts for the digital enterprise.

Martijn Dullaart, Lead Architect Configuration Management @ ASML & Chair @ IPE/CM2 Global Congress, has published several posts related to CM and a model-based approach; you can find them here via his LinkedIn profile. As you can read from his articles, organizations are trying to find a new, consistent approach.

Perhaps CM as a service to a Product Innovation Platform, as the CMstat blog post suggests? (quote from the post below)

In Part 2 of this CMsights series on the future of CM software we will examine the emerging strategy of “Platform PLM” where functional services like CM are delivered via an open, federated architecture comprised of rapidly-deployable industry-configured applications.

I am looking forward to Part 2 of CMsights. It is an approach that makes sense to me, as system boundaries will disappear in a digital enterprise. It will become more critical in the future to create consistent data flows, in the right context and based on data of the right quality.

Conclusion

Simple tools and complexity need to be addressed in the right order. Aligning people and processes efficiently to support a profitable business remains the primary challenge for every enterprise. Complex products, more and more dependent on software rather than hardware alone, require new ways of working to stay competitive. Digitization can help to implement these new ways of working. Experienced PLM/CM experts know the document-driven past. Now it is time for a new generation of PLM and CM experts to start from a digital concept and build consistent, workable frameworks. Then the simple tools can follow.

 

I was happy to take part in the PI PLMx London event last week. It was here, in the same hotel, that this conference first saw the light in 2011; you can see my blog post from that event here: PLM and Innovation @ PLMINNOVATION 2011.

At that time it was the first vendor-independent PLM conference in a long time, and it brought a lot of new people together to discuss their experience with PLM. Looking at the audience then, many of the companies that were there came back over the years, confirming the value this conference has brought to their PLM journey.

Similar to the PDT conference(s) (this year’s edition was just announced last week, here), the number of participants is diminishing.

Main hypotheses:

  1. The PLM definition has become too vague. Going to a PLM conference no longer guarantees you will find the type of PLM discussions you expect?
  2. The average person is now much better informed about PLM thanks to the internet and social media (blogs, webinars, etc.). Therefore, the value retrieved from a PLM conference is not big enough anymore?
  3. Digital Transformation is absorbing all the budget and attention downstream in the organization, no longer bringing the need for and awareness of modern PLM to the attention of management. E.g., a digital twin is sexier to discuss than PLM?

What do you think about the above three hypotheses: 1, 2 and/or 3?

Back to the conference. The discussion related to PLM has changed over the past nine years. As I presented at PI from the beginning in 2011, here are the nine titles from my sessions:

2011       PLM – The missing link
2012       Making the case for PLM
2013       PLM loves Innovation
2014       PLM is changing
2015       The challenge of PLM upgrades
2016       The PLM identity crisis
2017       Digital Transformation affects PLM
2018       PLM transformation alongside Digitization
2019       The challenges of a connected Ecosystem for PLM

Where the focus started with justifying PLM, and a supporting infrastructure, to bring innovation to the market, the first changes became visible in 2014. PLM was changing as more data-driven vendors appeared with new and modern (metadata) concepts and the cloud, creating the discussion about what the next upgrade challenge would be.

The identity crisis reflected the introduction of software development / management combined with traditional (mechanical) PLM – how to deal with systems? Where are the best practices?

Then from 2017 on until now Digital Transformation and the impact on PLM and an organization became the themes to discuss – and we are not ready yet!

Now some of the highlights from the conference. As there were parallel sessions, I had to divide my attention – you can see the full agenda here:

How to Build Critical Architecture Models for the New Digital Economy

The conference started with a refreshing presentation from David Sherburne (Carestream), explaining their journey towards a digital economy. According to David, the main reason behind digitization is to save time, as he quoted Harvey Mackay, an American businessman and journalist:

Time is free, but it is priceless. You cannot own it, but you can use it. You can’t keep it, but you can spend it. Once you have lost it, you never can get it back

I tend to agree with this simplification, as it makes the story easy to explain to everyone in your company. I would probably add that saving time also means less money spent on intermediate resources in a company, therefore creating a two-sided competitive advantage.

David stated that today’s digital transformation is more about business change than technology, and here I wholeheartedly agree. Once you master the flow of data in your company, you can change and adapt your company’s business processes to be better connected to the customer and therefore deliver the value they expect (increasing your competitive advantage).

Having new technology in place does not help you unless you change the way you work.

David introduced a new acronym ILM (Integrated Lifecycle Management) and I am sure some people will jump on this acronym.

David’s presentation contained an interesting view from the business-architectural point of view. An excellent start for the conference where various dimensions of digital transformation and PLM were explored.

Integrated PLM in the Chemical industry

Another interesting session was from Susanna Mäentausta (Kemira Oy), with the title “Increased speed to market, decreased risk of non-compliance through integrated PLM in the Chemical industry.” I selected her session because, from my past involvement with the process industry, I noticed that PLM adoption is very low there. Understanding why and how they implemented PLM was interesting for me. Her PLM vision slide says it all:

There were two points that I liked a lot from her presentation, as I can confirm they are crucial.

  • Although there was a justification for the implementation of PLM, no ROI calculation was done upfront. I think this is crucial: as a company, you know you need to invest in PLM to stay competitive. Making an ROI story just consoles people with artificial numbers; success and numbers depend on the implementation, and Susanna confirmed that step 1 delivered enough value to be confident.
  • There was end-to-end governance and a communication plan in place. Compared to PLM projects I know, this was done very extensively: full engagement of key users and ongoing feedback. Communicate, communicate, communicate. How often do we forget this in PLM projects?

Extracting More Value of PLM in an Engineer-to-Order Business

Sami Grönstrand & Helena Gutierrez, an experienced duo (they were active at PI PLMx Hamburg/Berlin before), presented their current status and mission for PLM @ Outotec. As the title suggests, it was about how to extract more value from PLM in an Engineer-to-Order business.

What I liked is how they simplified their PLM targets from a complex landscape into three story-lines.

If you jump into all the details where PLM is contributing to your business, it might get too complicated for the audience involved. Therefore, they aligned their work around three value messages:

  • Boosting sales by focusing on modularization and encouraging the use of a product configurator, instead of developing a customer-specific solution every time.
  • Accelerating project deliverables, again reaping the benefits of modularization, creating libraries and training the workforce in using this new environment (otherwise the new capabilities go unused). The reduction in engineering hours was quite significant.
  • Creating new business models by connecting all data using a joint plant structure with related equipment. By linking these data elements, end-to-end digital continuity was established to support advanced service and support business models.

My conclusion from this session was, again, that if you want to motivate people on a PLM journey, it is not about the technical details; it is about the business benefits that drive these new ways of working.

Managing Product Variation in a Configure-To-Order Business

In the context of the previous session from Outotec, Björn Wilhelmsson’s session addressed much the same topic: how to create as much variation as possible in your customer offering, while internally keeping the number of variants and parts manageable.

Björn, Alfa Laval’s OnePLM Programme Director, explained in detail the strategy they implemented to address these challenges. His presentation was very educational and could serve as a lesson for many of us related to product portfolio management and modularization.

Björn explained in detail the six measures to control variation, starting from a model strategy / roadmap (thinking first), followed by building a modularized product architecture and controlling and limiting the number of variants during the New Product Development process. Next, as Alfa Laval is in a Configure-to-Order business, Björn described the implementation of order-based and automated addition of pre-approved variants (not every variant needs to exist in detail before selling it), followed by the controlled introduction of additional variants and continuous analysis of quoted and sold variants (the power of a digital portfolio), as his summary slide shows below:

Day 1 closed with an inspirational keynote by Jonathan Gupta: Lessons Learnt from the Mountaineering Experience, 8848 Meters Above Sea Level. His mission: to climb the highest mountain on each of the continents in 107 days and 9 hours, setting a new world record.

There are some analogies to discover between his mission and a PLM implementation. It is all about having the total picture in mind. Plan and plan again, prepare step by step in detail, and rely on teamwork (it is not a solo journey); it is about reaching a summit (a deliverable phase) in the most efficient way.

The differences: PLM does not need world records; you need to go at a pace the organization can digest and understand. And although the initial PLM climate during implementation might be chilling too, I do not believe you have to suffer temperatures below minus 50 degrees Celsius.

During the morning I was involved in several meetings and was therefore, unfortunately, unable to see some of the interesting sessions at that time. Hopefully they will later be available on PI.TV for review, as slides alone do not tell the full story, although there are experts who can conclude and comment after seeing a single slide. You can read an example in my blog buddy Oleg Shilovitsky’s post: PLM Buzzword Detox. I think oversimplification is exactly what creates the current problem we have in this world: people without knowledge become louder and surer of their opinion than knowledgeable people who have spent time understanding the matter.

Have a look at the Dunning-Kruger effect here (if you take the time to understand).

 

PLM: Enabling the Future of a Smart and Connected Ecosystem

Peter Bilello from CIMdata shared his observations and guidance related to the ongoing digital business revolution taking place thanks to internet and IoT technologies. It will fundamentally transform how people work and interact with each other and with machines. Survival in business will depend on how companies create smart and connected ecosystems. Peter showed a slide from the 2015 World Economic Forum (below), which is still relevant:

Depending on your business, some of these waves might have touched your organization already. What is clear is that the market leaders will benefit the most: the ones owning a smart and connected ecosystem will be the winners before long.

Next, Peter explained why PLM, and in particular the Product Innovation Platform, is crucial for a smart and connected enterprise. Shiny capabilities like a digital twin, the link between virtual and real, or virtual & augmented reality can only be achieved affordably and competitively if you invest in making the source digital and connected. The scope of a product innovation platform is much broader than traditional PLM. Also, the way information is stored differs, moving from documents (files) towards data (elements in a database). I fully agree with Peter’s opinion here that PLM is conceptually the killer app for a Smart & Connected Ecosystem, and this notion is spreading.

A recent article from Forbes in the Leadership category, Is Your Company Ready For Digital Product Life Cycle Management?, shows there is awareness. Still, it is very basic, and people remain confused about the difference between an electronic file (digital too?) and a digital definition of information.

The main point to remember here: Digital information can be accessed directly through a programming interface (API/Service) without the need to open a container (document) and search for this piece of information.
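As a simple illustration of that difference, here is a minimal sketch of a connected application asking for one property of one item directly; the endpoint and field names are hypothetical and do not represent any specific vendor’s API.

```python
import requests

# Hypothetical REST call to a PLM platform: fetch one attribute of one item
# directly, instead of opening a document and searching inside it.
# The URL and field names are invented for illustration only.

BASE_URL = "https://plm.example.com/api/v1"

def get_released_mass(item_number: str) -> float:
    response = requests.get(
        f"{BASE_URL}/items/{item_number}",
        params={"fields": "mass_kg", "revision": "released"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["mass_kg"]

# The consumer gets the single value it needs: no file conversion,
# no manual search through a drawing or specification document.
print(get_released_mass("PUMP-100"))
```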

Peter then zoomed in on some topics that companies need to investigate to reach a smart & connected ecosystem: security (still a question hardly addressed in IoT/Digital Twin demos) and standards and interoperability (you cannot connect all proprietary formats economically and sustainably). There are a lot of points to consider, and I want to close with Peter’s slide illustrating where most companies are in reality:

The Challenges of a Connected Ecosystem for PLM

I was happy to present after Peter Bilello and, on day 1, David Sherburne, as they both gave a perspective on digital transformation complementary to what I presented. My presentation focused on the incompatibility of current coordinated business systems with the concept of a connected ecosystem.

You can already download my slides from SlideShare here: The Challenges of a Connected Ecosystem for PLM. I will explain my presentation in an upcoming blog post, as slides without a story might lead to the wrong interpretation, and we have already reached 2000 words. A few more words to come.

How to Run a PLM Project Using the Agile Manifesto

Andrew Lodge, head of Engineering Systems at JCB, explained how applying an agile mindset to a PLM project can lead to the faster and more accurate results needed by the business. I fully support this approach: having worked in long, waterfall-type PLM implementations, there was always the big crash and user dissatisfaction at the final delivery. Keeping the business involved at every step seems to be the solution. The issue I discovered here is that an agile implementation requires a lot of people, in particular from the business, to be heavily involved. Some companies do not understand this need and drop or reduce the business contribution to a minimum, killing the value of an agile approach.

 

Concluding

For me, coming back to London for the PI PLMx event was very motivational. Where the past two or three conferences in Germany might have shown little progress per year, this year, thanks to new attendees and inspiration, it became a vivid event for me, hopefully growing again soon. Networking and listening to your peers in business remain crucial to digest it all.

 
