After a short summer break with almost no mention of the word PLM, it is time to continue this series of posts exploring the future of "connected" PLM. For those who also come back with a cleaned-up memory, here is a short recap:
In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development: the item-centric approach.
In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified are the ERP and CRM platforms (the oldest domains), the MES and PIP platforms (modern domains supporting manufacturing and product innovation in more detail) and the IoT platform (needed to support connected products and customers).
In part 3, I explained what data-driven means and how it is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. I ended that post with seven topics related to data-driven, which I will discuss in upcoming posts.
Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.
Topic 1 for this post:
Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it will be about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
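To make this tangible, below is a minimal sketch of the federated idea in Python – all platform endpoints and dataset names are hypothetical, invented for illustration only. Each dataset stays in the platform that owns it; a consumer resolves the data on demand instead of copying everything into one central database.

```python
import json
import urllib.request

# Hypothetical registry: each dataset stays in the platform that owns it.
# Only the endpoints are shared; nothing is copied into a central database.
DATASET_ENDPOINTS = {
    "ebom": "https://pip.example.com/api/items/PN-1234/ebom",
    "stock": "https://erp.example.com/api/materials/PN-1234/stock",
    "field_usage": "https://iot.example.com/api/assets/PN-1234/usage",
}

def fetch_dataset(name: str) -> dict:
    """Resolve one connected dataset from its owning platform."""
    with urllib.request.urlopen(DATASET_ENDPOINTS[name]) as response:
        return json.load(response)

# Access to reliable data, not ownership of the data:
# the answer is assembled on demand from federated sources.
change_context = {name: fetch_dataset(name) for name in DATASET_ENDPOINTS}
```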
Platform or a collection of systems?
One of the first (marketing) hurdles to take is understanding the difference between a data platform and a collection of systems that work together, sold as a platform.
In 2017, CIMdata published an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata's definition is extensive and covers the full scope of product innovation. Of course, you can also find platforms that start from a more focused process.
For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).
The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid these capabilities being limited, a platform should be open and able to connect with other data sources, whether available locally or centrally.
From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, still allowing local data to be connected and shielded for performance or IP reasons.
The picture below describes the business benefits of a Product Innovation Platform as imagined by Accenture in 2014.
Link to CIMdata's 2014 commentary on Digital PLM HERE
Sometimes vendors sell their suite of systems as a platform. This is a marketing trick: when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems – not really a scalable environment.
In addition, the collaboration between systems in such a marketing platform is sometimes managed through proprietary exchange (file) formats – a practice we saw in the construction industry before cloud connectivity became available. A so-called end-to-end solution that works on PowerPoint requires, when implemented in real life, a lot of human intervention.
Not a single environment
There has always been the debate:
“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”
In the system approach, the focus was most of the time on best-in-class tools, with PLM systems providing the data governance. A typical example is the item-centric approach. It reflects the current working culture: people working in their optimized siloes, exchanging information between disciplines through (neutral) files.
The platform approach makes it possible to deliver an optimized user interface to the end-user through a dedicated app, assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.
As a platform provider, it might be tempting to add as many imaginable data elements as possible to the platform infrastructure. The challenge with this approach is whether all data should be stored in a central data environment (preferably cloud) or federated. And what about filtering IP?
In my post PLM and Supply Chain Collaboration, I described the concept of an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, while carefully filtering which data is shared in the hub.
It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant to its part of the business. This does not imply there will be no other platforms in the future. Platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.
Will we ever reach a perfectly connected enterprise or society? Probably not. Not because of technology but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all of humanity? Predicting the future is impossible, as Yuval Harari described in his book 21 Lessons for the 21st Century. Worth reading, although it remains a collection of ideas.
Proprietary data model or standards?
So far, as a software vendor developing a system, there are no restrictions on how you manage your data internally. In the PLM domain, this means that every vendor has its own proprietary data model and behavior.
I have learned from my 25+ years of experience with systems that the original design of a product, combined with the vendor's culture, defines the future roadmap. So even if a PLM vendor were to rewrite all their software to become data-driven, the ways of working and the assumptions would still be based on past experiences.
This makes it hard to arrive at unified data models and a methodology valid for the whole PLM domain. However, large enterprises like Airbus and Boeing and the major automotive suppliers have always pushed for standards, as they benefit the most from standardization.
The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.
You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.
It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible with existing standards. It probably won't happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected ecosystems.
I do not believe in the toolkit approach, where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced that did not match the past.
In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.
The benefit of using standards is that they contain the learnings from many people involved.
Conclusion
I did not enjoy writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time, a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy!
Matthias Ahrens shared: Integrated Product Lifecycle Management (Google translated from German)
Oleg Shilovitsky wrote numerous articles related to technology – in this context: 3 Challenges of Unified Platforms and System Locking and SaaS PLM Acceleration Trends
So far, I have been discussing PLM experiences and best practices that changed due to the introduction of electronic drawings and affordable 3D CAD systems for the mainstream: from vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.
Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.
Now, thanks to digitalization, we can connect and combine information in real time. We can give every stakeholder in the company's business almost real-time visibility of what is happening (if allowed). We have seen the power of platformization, where the benefits come from real-time connectivity within an ecosystem.
Apple, Amazon, Uber and Airbnb are the non-manufacturing examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), and with suppliers and customers. All connected through information, managed in data elements instead of documents – I call it connected PLM.
Vendors have already shared their PowerPoints, movies, and demos showing how the future would be in an ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.
The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.
The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.
In the upcoming posts, I will share with you what I see, what I deduce from that, and what I think would be the next step to learn.
I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.
I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the max size per post (avg 1500 words) and the ideas and contributions coming from you and me.
What is platformization?
In our day-to-day life, we are more and more used to direct interaction between resellers and service providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.
We don’t have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.
The platform is responsible for direct communication between all stakeholders.
The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.
The 5 potential platforms
The ERP platform
is mainly dedicated to the company's execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect all stakeholders as much as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it would be better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).
Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform. A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.
The CRM platform
a platform that mainly focuses on customer-related activities; as you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, providing a platform to further extend capabilities related to core CRM.
The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.
For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (from non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.
The IoT platform
these platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.
However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (from non-PLM/ERP-related vendors) are Hitachi and Microsoft.
The Product Innovation Platform (PIP)
All the above platforms would not have a reason to exist if there were not an environment where products are invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio, and more.
The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all previous platforms, the PIP cannot exist on its own and requires connectivity with the other platforms where applicable.
Look below at the CIMdata definition of a Product Innovation Platform.
You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavor): Aras, Dassault Systèmes, PTC and Siemens.
Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.
Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.
Standard interfaces or ad-hoc connections?
Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.
However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.
This would be a PIP/ERP platform collaboration scenario.
Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.
The company might decide to build a custom app or dashboard to support such a role, combining real-time data from the relevant platforms using standard interfaces (preferred) or APIs, web services, REST services, microservices (for specialists) and the currently fashionable Low-Code development platforms, which allow users to combine data services from different platforms without being experts in coding.
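To illustrate what such a role-specific app could look like, here is a minimal sketch in Python (using Flask and requests) – the endpoints, the response shapes and the composition are all hypothetical, and a real implementation would rather use the standard interfaces mentioned above.

```python
from flask import Flask, jsonify
import requests

app = Flask(__name__)

# Hypothetical platform endpoints - placeholders for the REST services
# a PIP and an ERP platform might expose.
PIP_URL = "https://pip.example.com/api/changes/{change_id}/affected-items"
ERP_URL = "https://erp.example.com/api/stock/{item_id}"

@app.route("/dashboard/change/<change_id>")
def change_dashboard(change_id):
    """Role-specific view: change data (PIP) enriched with live stock (ERP)."""
    # Assumed response shape: a list of items, each with an "id" field.
    items = requests.get(PIP_URL.format(change_id=change_id)).json()
    for item in items:
        item["stock"] = requests.get(ERP_URL.format(item_id=item["id"])).json()
    return jsonify({"change": change_id, "items": items})

if __name__ == "__main__":
    app.run()
```

A low-code platform would, roughly speaking, let you compose the same flow without the hand-coding.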
Without going too much into technology, the topics in this paragraph require an enterprise architecture and vision. It is wishful thinking that your existing environment will evolve smoothly into a digital highway for the future by "fixing" demands per user. Your infrastructure is much more likely to end up congested like spaghetti.
In that context, I read an interesting post last week: Low code: A promising trend or Pandora's box. Have a look and decide for yourself.
I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.
In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.
It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, while still primarily relying on drawings. It will surely take us one generation to switch from document-based engineering to model-based engineering.
Conclusion
In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.
In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.
Your opinion, Your contribution?
Another episode of “The PLM Doctor is IN“. This time a question from Ilan Madjar, partner and co-founder of XLM Solutions. Ilan is my co-moderator at the PLM Global Green Alliance for sustainability topics.
All these activities resulted in the following question(s) related to the Digital Twin. Now sit back and enjoy.
PLM and the Digital Twin
Is it a new concept? How to implement and certify the result?
Relevant topics discussed in this video
- An article about the introduction of the Digital Twin concept by Michael Grieves: The history and creation of the digital twin concept
- A presentation from Don Farr explaining the digital value chain with the “famous” Boeing diamond model – PDF HERE
- My Slideshare presentation discussing the difference between Coordinated and Connected PLM
- My Slideshare presentation from the Digital Twin conference Eindhoven Nov 2020: Digital Twins do not run on documents
Conclusion
I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.
The main rule: A (single) open question that is puzzling you related to PLM.
After the series about "Learning from the past," it is time to start looking toward the future. I learned from several discussions that I probably work most of the time with advanced companies. I hope this will motivate companies that lag behind to look into the future even more.
If you look into the future for your company, you need new or better business outcomes. That should be the driver for your company. A company does not need PLM or a Digital Twin. A company might want to reduce its time to market and improve collaboration between all stakeholders. These objectives can be realized by different ways of working and an IT infrastructure to allow these processes to become digital and connected.
That is the "game." Coming back to the future of PLM: we do not need a discussion about definitions; I leave this to the academics and vendors. We will see that the same applies to the concept of a Digital Twin.
My statement: the digital twin is not new. Everybody can have their own digital twin, as long as they interpret the definition differently. Does this sound like the PLM definition?
The definition
I like to follow the Gartner definition:
A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person, or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.
As you see, not a narrow definition. Now we will look at the different types of interpretations.
Single-purpose siloed Digital Twins
- Simple – data only
One of the most straightforward applications of a digital twin is, for example, my Garmin Connect environment. When cycling, my device registers performance parameters (speed, cadence, power, heartbeat, location). Then, after every trip, I can analyze my performance. I can see changes in my overall performance and compare my performance with others in my category (weight, age, sex).
Based on that, I can decide if I want to improve my performance. My personal business goal is to maintain and improve my overall performance, knowing I cannot stop aging by upgrading my body.
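To show how simple this type of digital twin can be, here is a toy sketch with invented numbers – the "twin" is nothing more than recorded performance data plus a trend analysis.

```python
from statistics import mean

# Invented ride log: one average power figure (watts) per weekly ride,
# the kind of data a Garmin/Strava-style platform collects per trip.
rides_watts = [182, 185, 181, 190, 188, 193, 195, 197]

# The "digital twin" here is just data plus analysis:
# compare recent form against the earlier baseline.
baseline = mean(rides_watts[:4])
recent = mean(rides_watts[-4:])
change_pct = 100 * (recent - baseline) / baseline
print(f"Baseline {baseline:.0f} W, recent {recent:.0f} W, change {change_pct:+.1f}%")
```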
On November 4th, 2020, I am participating in the (almost virtual) Digital Twin conference organized by Bits&Chips in the Netherlands. In the context of human performance, I look forward to Natal van Riel’s presentation: Towards the metabolic digital twin – for sure, this direction is not simple. Natal is a full professor at the Technical University in Eindhoven, the “smart city” in the Netherlands.
- Medium – data and operating models
Many connected devices in the world use the same principle. An airplane engine, an industrial robot, a wind turbine, a medical device, a train carriage: they all track their performance based on this connection between physical and virtual, through some sort of digital connectivity.
The business case here is also monitoring performance, predicting maintenance, and upgrading the product when needed.
This is the domain of Asset Lifecycle Management, a practice that has existed for decades. Based on financial and performance models, the optimal balance between maintaining and overhauling has to be found. Repairs are disruptive and can be extremely costly. A manufacturing site that cannot produce can cost millions per day. Connecting data between the physical and the virtual model allows us to have real-time insights and be proactive. It becomes a digital twin.
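As a back-of-the-envelope sketch of this balance – all figures below are invented – the optimal maintenance interval follows from a simple cost model; connecting real sensor data would replace the assumed wear rate with measured behavior.

```python
# Toy trade-off behind "maintain vs. overhaul"; every figure is invented.
PREVENTIVE_COST = 50_000    # cost per planned maintenance stop
FAILURE_COST = 2_000_000    # lost production plus repair per unplanned failure

def expected_failures(interval_months: int, wear_rate: float = 0.002) -> float:
    # Wear-out assumption: failure risk grows with months since the last stop.
    return sum(wear_rate * month for month in range(1, interval_months + 1))

def cost_per_month(interval_months: int) -> float:
    total = PREVENTIVE_COST + FAILURE_COST * expected_failures(interval_months)
    return total / interval_months

best = min(range(1, 37), key=cost_per_month)
print(f"Cheapest policy: maintain every {best} months "
      f"(~{cost_per_month(best):,.0f} per month)")  # every 5 months here
```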
- Advanced – data and connected 3D model
The digital twin we see the most in marketing videos is a virtual twin, using a 3D representation for understanding and navigation. The 3D representation provides a Virtual Reality (VR) environment with connected data. When pointing at the virtual components, information might appear, or some animation might take place.
Building such a virtual representation is a significant effort; therefore, there needs to be a serious business case.
The simplest business case is to use the virtual twin for training purposes. A flight simulator provides a virtual environment and behavior as if you were flying a physical airplane – the behavior model behind the simulator should match the real behavior as closely as possible. However, as it is a model, it will never be 100% reality and requires updates when new findings or product changes appear.
A virtual model of a platform or plant can be used for training on Standard Operating Procedures (SOPs), as in the physical world there is no place or time to conduct such training. Here the complexity might be lower: there is a 3D model, and serious updates can only be expected after a major maintenance or overhaul activity.
These practices are not new either and are used in places where physical training cannot be done.
More challenging is the Augmented Reality (AR) use case. Here the virtual model, most of the time a lightweight 3D model, connects to real-time data coming from other sources. For example, AR can be used when an engineer has to service a machine. The AR environment might project actual data from the machine and indicate service points and service procedures.
The positive side of the business case is clear for such an opportunity, ensuring service engineers always work with the right information in a real-time context. The main obstacle to implementing AR, in reality, is the access to data, the presentation of the data and keeping the data in the AR environment matching the reality.
And although there are 3D Models in use, they are, to my knowledge, always created in siloes, not yet connected to their design sources. Have a look at the Digital Twin conference from Bits&Chips, as mentioned before.
Several of the cases mentioned above will be discussed there. The conference's target is to share real cases, concluded with Q&A sessions – crucial for a virtual event.
Connected Virtual Twins along the product lifecycle
So far, we have been discussing the virtual twin concept, where we connect a product/system/person in the physical world to a virtual model. Now let us zoom in on the virtual twins relevant for the early parts of the product lifecycle: the manufacturing twin and the development twin. This image from Siemens illustrates the concept:
On slides, Siemens imagines a completely integrated framework, which is the future vision. Let us first zoom in on the individual connected twins.
The digital production twin
This is the area of virtual manufacturing: creating a virtual model of the manufacturing plant. Virtual manufacturing planning is not a new topic; DELMIA (Dassault Systèmes) and Tecnomatix (Siemens) have been offering virtual manufacturing planning solutions for a long time.
At that time, the business case was based on the fact that defining a manufacturing plant and its processes virtually allows you to optimize the plant before investing in physical assets. This saves money, as there is no costly prototype phase needed to optimize production. In a virtual world, you can perform many trade-off studies without extra costs. That was the past (and, for many companies, still the current situation).
With the need to be more flexible in manufacturing to address individual customer orders without increasing the overhead of delivering these customer-specific solutions, there is a need for a configurable plant that can produce these individual products (batch size 1).
This is where the virtual plant model comes into the picture again. Instead of having a virtual model to define the ultimate physical plant, now the virtual model remains an active model to propose and configure the production process for each of these individual products in the physical plant.
This is partly what Industry 4.0 is about. Using a model-based approach to configure the plant and its assets in a connected manner. The digital production twin drives the execution of the physical plant. The factory has to change from a static factory to a dynamic “smart” factory.
In the domain of Industry 4.0, companies are reporting progress. However, in my experience, the main challenge is still that the product source data is not yet built in a model-based, configurable manner and therefore requires manual rework. This is the area of Model-Based Definition, and I have written about this aspect several times. Latest post: Model-Based: Connecting Engineering and Manufacturing
The business case for this type of digital twin, of course, is to be able to deliver customer-specific products with extremely competitive speed and at reduced cost compared to standard approaches. It could be your company's survival strategy. Even though the future is hard to predict, as we see from COVID-19, it is still crucial to anticipate instead of wait.
The digital development twin
Before a product gets manufactured, there is a product development process. In the past, this was purely mechanical, with some electronic components. Nowadays, many companies are actually manufacturing systems, as the software controlling the product plays a significant role. In this context, model-based systems engineering is the upcoming approach to define and test a system virtually before committing to the physical world.
Model-Based Systems Engineering can define a single complex product and perform all kinds of analyses on the system even before there is a physical system in place. I will explain more about model-based systems engineering in future posts. In this context, I want to stress that having a model-based systems engineering environment combined with modularity (do not confuse modularity with model-based) is a solid foundation for dealing with unique custom products: solutions can be configured and validated against their requirements already during the engineering phase.
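Here is a minimal sketch of that configure-and-validate idea – the modules, parameters and requirements are invented and far simpler than any real MBSE environment, but the principle is the same: a candidate configuration is checked against its requirements before anything physical exists.

```python
# Invented module library: each module contributes system-level parameters.
MODULES = {
    "motor_a": {"power_w": 750, "mass_kg": 4.2},
    "motor_b": {"power_w": 1100, "mass_kg": 6.8},
    "frame_s": {"mass_kg": 12.0},
    "frame_l": {"mass_kg": 17.5},
}

# Invented requirements, expressed as checks on the configured system.
REQUIREMENTS = {
    "min_power_w": lambda sys_params: sys_params["power_w"] >= 1000,
    "max_mass_kg": lambda sys_params: sys_params["mass_kg"] <= 22.0,
}

def configure(*module_names: str) -> dict:
    """Aggregate the system-level parameters of a chosen configuration."""
    system = {"power_w": 0.0, "mass_kg": 0.0}
    for name in module_names:
        for key, value in MODULES[name].items():
            system[key] += value
    return system

candidate = configure("motor_b", "frame_s")
failed = [name for name, check in REQUIREMENTS.items() if not check(candidate)]
print("valid configuration" if not failed else f"violates: {failed}")
```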
The business case for the digital development twin is easy to make. Shorter time to market, improved and validated quality, and reduced engineering hours and costs compared to traditional ways of working. To achieve these results, for sure, you need to change your ways of working and the tools you are using. So it won’t be that easy!
For those interested in Industry 4.0 and the Model-Based Systems Engineering approach, join me at the upcoming PLM Road Map 2020 and PDT 2020 conference on 17-18-19 November. As you can see from the agenda, a lot of attention is given to the Digital Twin and Model-Based approaches.
Three digital half-days with hopefully a lot to learn while keeping our feet on the ground. In particular, I am looking forward to Marc Halpern's keynote speech: Digital Thread: Be Careful What you Wish For, It Just Might Come True
Conclusion
It has been very noisy on the internet related to product features and technologies, probably due to COVID-19 and the disrupted interactions between all of us – vendors, implementers and companies trying to adjust their future. The Digital Twin is an excellent framing for a concept that everyone can relate to. Choose your business case and then look for the best matching twin.
Before software geeks get curious about the title: in this context, ALM means Asset Lifecycle Management. In 2008, I was active for SmarTeam, promoting PLM concepts relevant for Asset Lifecycle Management. The focus was on PLM being complementary to asset operation management (EAM – Enterprise Asset Management and MRO – Maintenance, Repair and Overhaul).
This topic has become topical for me again in the past two months, having discussed and seen (at PDT) the concepts of a model-based approach for assets and constructions. PLM, ALM, and BIM converge conceptually. Every year I give a one-day update from the field for students doing a master's in PLM & BIM on top of their engineering/architectural background. Five years ago, there was no mention of BIM; now the ratio of BIM-oriented students has become significant. For me, it is always great to see young students willing to learn PLM or BIM on top of their own skillset. Read more about this particular master class (in French) by clicking on the logo to the left.
In 2012, I started to explain PLM benefits to EPC companies (Engineering Procurement Construction), targeting a more profitable and efficient delivery of their constructions (oil platforms, plants, buildings, infrastructure). The simplified reasoning behind using PLM was related to more efficient, higher-quality multidisciplinary collaboration, reducing costly fixes during construction, and smoothing the intensive process of data handover.
In the process industry, standards like ISO 15926 (process industry) and ISO 19650 (BIM – mainly in the UK) became more and more crucial. At that time, it was difficult to convince companies to focus on the horizontally integrated process instead of dedicated, disconnected tools. Meanwhile, this has changed, thanks to the Digital Twin hype. Let's have a look.
PLM and ALM
The initial value of using PLM concepts complementary to MRO systems came from the fact that MRO systems mainly focus on plant operations. You could compare these systems with ERP systems for manufacturing companies, focusing on execution and continuous operation. Scheduled maintenance and inspections are also driven by the MRO system. Typical MRO systems are Maximo and SAP PM. PLM could deliver configuration management, linking the design intent to the physical implementation, and therefore provide higher data quality, visibility, and traceability of the asset history.
In 2010, I shared these concepts in two posts: Asset Lifecycle Management using a PLM-system and PLM for Asset Lifecycle Management and Asset Development, based on lessons learned with some (nuclear) plant owner/operators. They started to discover the need for configuration management to ensure data quality for operations. In 2010-2014, the business case for using PLM complementary to MRO was data quality and therefore reduced downtime when executing large maintenance programs (dependencies between the individual projects were not visible without PLM).
In MRO systems, as in ERP systems, the data for execution is based on information coming from various engineering sources – specifications, PFDs, P&IDs. Questions owner/operators ask themselves are:
- What are the designed operational settings?
- Are the asset parameters currently running as designed?
- What is the optimized maintenance period?
- Can we stretch maintenance intervals?
- Can we reduce inspections?
- Can we reduce downtime for maintenance and overhaul?
- What about predictive maintenance?
Most of these questions are answered by experts who use their tacit knowledge and experience to give the best answers so far. And when the answers are wrong, they are accepted as new learning points: next time we won't make this mistake, and the experts become even more knowledgeable.
Now, these questions could be answered if you can model your asset in a virtual environment. In the virtual world, you would use simulation models, logical models, and 3D models to describe the asset. This is where Model-Based Systems Engineering practices are used. However, these models need to be calibrated based on reality. And that is where IoT and Asset Operation Monitoring come in, connecting physical behavior with virtually predicted behavior. You can read more about this relationship in my post: Will MBSE the new PLM instead of IoT?
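A tiny sketch of such a calibration – the pump model and all measurements below are invented – where observed IoT data re-fits the parameters of the virtual model:

```python
import numpy as np

# Invented example: the virtual model predicts pump outlet pressure as a
# linear function of flow; IoT measurements recalibrate its parameters.
flow = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # m3/h, from sensors
pressure = np.array([4.9, 4.5, 4.2, 3.7, 3.3])    # bar, from sensors

# Ordinary least-squares fit of p = a + b * q to the observed behavior.
b, a = np.polyfit(flow, pressure, 1)
print(f"calibrated model: p = {a:.2f} + ({b:.4f}) * q")

# The calibrated parameters replace the as-designed ones in the virtual
# model, so simulations predict the asset's actual behavior, not theory.
```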
PLM and BIM
In 2014, I started to discuss PLM concepts with EPC companies (Engineering, Procurement, and Construction), mainly in the Oil & Gas industry. Here, excellent asset development tools (AVEVA, Intergraph, Bentley) are the standard, and as the purpose of an EPC company is to deliver a plant or platform, each software tool has its purpose, but there is no lifecycle strategy. The value PLM could bring was providing a program overview (complementary to Primavera), standardization, multidisciplinary coordination, and visibility across projects to capture knowledge.
Most of the time, the EPC companies did not see the value of optimizing themselves, as this was accepted in the process, even while their productivity losses and costs due to poor quality (fixing during construction/commissioning) were absurd (10-20% of the project budget). Cultural change – think longer instead of fix later – was hard to explain. In the end, the EPC was not responsible for operations, so why bother that much?
My blog posts PLM for all Industries and 2014 – the year that the construction industry did not discover PLM illustrate the challenge at that time. None of the EPCs and construction companies had the insight that improving collaboration based on information continuity (not data-driven yet) could bring significant benefits, despite their relatively low profit margins (1-3% is considered excellent). Breaking the silos was too hard.
Two recent trends, however, changed the status quo that existed.
First of all, more and more, the owner/operator does not want to be responsible for the maintenance and operations of the asset. The typical EPC companies have now become DBO companies (Design, Build and Operate); this requires lifecycle thinking, as most of the costs of an asset occur during its maintenance and operation phase.
Advanced thinking (read: (Model-Based) Systems Engineering) can help these companies shift their focus to a more sustainable design of the asset for the future and get rewarded for that. In the old EPC model, the target was "just" to deliver as specified.
A second significant trend is the availability of cloud infrastructure for the construction world. A cloud infrastructure does not require considerable investment from the stakeholders in a construction project. By introducing BIM in a common data environment (CDE), an infrastructure comparable to PLM is created, and most likely the Maintenance-and-Operate stakeholder is eager to have the full virtual definition available there for the future.
Read more about BIM and CDE for example, here: CDE – strategic BIM process tool.
Of course, technology and standards are there to enable collaboration. Now it is up to the stakeholders involved to develop new skills for collaboration (learn or hire) and implement them through new ways of working. A learning process can never be pushed by a big bang, so make sure your company operates in two modes while learning.
As I mentioned, the Maintenance-and-Operate stakeholders, or in traditional cases the Owner/Operators, are incredibly interested in a well-defined virtual model of the asset. This allows them to analyze and simulate the implementation of fixes and enhancements for the future with an optimal result. Again, we are talking about a digital twin of the asset here.
Conclusion
Even though the digital twin is at the top of the Gartner hype cycle, it has already become a vital principle to implement, in particular for substantial, critical assets. For these precious assets, minor inefficiencies in data continuity can still be afforded while learning. From the moment companies have established digital continuity between their virtual and physical assets, the Digital Twin concept can also become profitable (and required) for other industries, in particular when these companies want to deliver their products as a service.
Note: I have been talking a lot this year about the challenges of digital transformation applied to PLM in particular. During PI PLMx London 2020 on February 3 and 4, I will lead a Think Tank session related to the challenge of connecting your PLM transformation to your executives' vision (and budget). See you there?
This is the moment of the year to switch off from the details. No more talking and writing about digital transformation or model-based approaches. It is time to sit back and relax. Two years ago, I shared the PLM Songbook; now it is time to see one or more movies. Here are my top five PLM movies:
Bruce Almighty
Bruce Nolan, an engineer in Buffalo, N.Y., is discontented with almost everything in the company despite his popularity and the love of his draftswoman Grace. At the end of the worst day of his life, Bruce angrily ridicules and rages against PLM, and PLM responds. PLM appears in human form and, endowing Bruce with divine powers of collaboration, challenges Bruce to take on the big job to see if he can do it any better.
A movie that makes you modest, realizing there is more than your own small ecosystem.
The good, the bad and the ugly
Blondie (The Good PLM consultant) is a professional who is out trying to earn a few dollars. Angel Eyes (The Bad PLM Vendor) is a PLM salesman who always commits to a task and sees it through, as long as he is paid to do so. And Tuco (The Ugly PLM Implementer) is a wanted outlaw trying to take care of his own hide. Tuco and Blondie share a partnership together making money off Tuco’s bounty, but when Blondie unties the partnership, Tuco tries to hunt down Blondie. When Blondie and Tuco come across a PLM implementation loaded with dead bodies, they soon learn from the only survivor (Bill Carson – the PLM admin) that he and a few other men have buried a stash of value on a file server. Unfortunately, Carson dies, and Tuco only finds out the name of the file server, while Blondie finds out the name on the hard disk. Now the two must keep each other alive in order to find the value. Angel Eyes (who had been looking for Bill Carson) discovers that Tuco and Blondie met with Carson and knows they know the location of the value. All he needs is for the two to ..
A movie that makes you realize that it is a challenging journey to get the value out of PLM. It is not only about execution – it is also about all the politics of the people involved – and there are good, bad and ugly people on a PLM journey.
The Grump
The Grump is a draftsman in Finland from the past, a man who knows that everything used to be so much better in the old days. Pretty much everything that's been done after 1953 has always managed to ruin The Grump's day. Our story unfolds as The Grump opens a 3D model on his computer, hurting his brain. He has to spend a weekend in Helsinki to attend model-based therapy. Then the drama unfolds…….
A movie that makes you realize that progress and innovation do not come from grumps. In every environment where you want to change the status quo, grumps will appear. With the exciting Finnish atmosphere, a perfect film for Christmas.
Deliverance
The Cahulawassee River Valley company in Northern Georgia is one of the last analog companies in the state, which will soon change with the imminent implementation of a PLM system in the company, breaking down silos everywhere. As such, four Atlanta city slickers – alpha male Lewis Medlock, generally even-keeled Ed Gentry, slightly condescending Bobby Trippe, and wide-eyed Drew Ballinger – decide to implement PLM in one trip, with only Lewis and Ed having experience in CAD. They know going in that the area is ethnoculturally homogeneous and isolated, but don't understand the full extent of it until they arrive and see what they believe is the result of generations of inbreeding. Their relatively peaceful trip takes a turn for the worse when, halfway through, they encounter a couple of hillbilly moonshiners. That encounter not only makes the four battle their way out of the PLM project intact and alive but also threatens the relationships of the four as they do.
This movie, from 1972, makes you realize that in the early days of PLM starting a big-bang implementation journey into an area that is not ready for it, can be deadly, for your career and friendship. Not suitable for small children!
Diamonds Are Forever or Tron (legacy)
James Bond’s mission is to find out who has been drawing diamonds, which are appearing on blogs. He adopts another identity in the form of Don Farr. He joins up with CIMdata and acts as if he is developing diamonds, but everyone is hungry for these diamonds. He also has to avoid Mr. Brouwer and Mr. Kidd, the dangerous couple who do not leave anyone in their way when it comes to model-based. And Ernst Stavro Blofeld isn’t out of the question. He may have changed his looks, but is he linked with the V-shape? And if he is, can Bond finally defeat his ultimate enemy?
Sam Flynn, the tech-savvy 27-year-old son of Kevin Flynn, looks into his father’s disappearance and finds himself pulled into the same world of virtual twins and augmented reality where his father has been living for 20 years. Along with Kevin’s loyal confidant Quorra, father and son embark on a life-and-death journey across a visually-stunning cyber universe that has become far more advanced and exceedingly dangerous. Meanwhile, the malevolent program IoT, who dominates the digital world, plans to invade the real world and will stop at nothing to prevent their escape
I could not decide on number five. The future is bright with Boeing's new representation of Systems Engineering – see my post on CIMdata's PLM Europe roadmap event where Don Farr presented his diamond(s). However, the future is also becoming a mix of real and virtual, and here Tron (legacy) will help my readers understand the beauty of a mixed virtual and real world. You can decide – or send me your favorite PLM movies.
Note: All movie reviews are based on IMDb.com storylines, and I thank the authors of these storylines for their contribution and hope they agree with the PLM-related twist. Click on the image to find the full details and original review.
Conclusion
2018 has been an exciting year with a lot of buzzwords, combined with the reality that the current PLM approach is incompatible with the future. We can address this issue more in 2019 – first at PI PLMx 2019 in London (be there – the last chance to meet people in the UK while they are still Europeans – and share/discuss plans for the upcoming year).
Wishing you all the best during the break and a happy and prosperous 2019
According to LinkedIn, there are over 7500 PLM consultants in my network. That makes them quite an elite group of people, as I have over 100,000 CEOs in my network according to LinkedIn. Being a CEO is a commodity.
PLM consultants share a common definition, the words Product Lifecycle Management. However, what we all mean by PLM is one of the topics that has evolved significantly over the past 19 years.
PLM or cPDM (collaborative PDM)?
In the early days, PLM was considered an engineering tool for collaboration, either between global subsidiaries or with suppliers. The main focus of PLM was to bring engineering information to manufacturing in a controlled way. PLM and cPDM were often seen as solving the same business needs, as the implementation of a PLM system most of the time got stuck at the cPDM level.
The main players at that time were Dassault Systemes, UGS (later Siemens PLM) and PTC – their solutions were MCAD-driven with a limited scope: bringing engineering information towards manufacturing in a coordinated way.
PLM was not really an approach that created visibility at the management level of a company. How do you value and measure collaboration? Because connectivity was expensive in the early days of PLM, combined with the idea that PLM systems needed to be customized, PLM was framed as costly and as hard to deliver value from.
Systems Engineering and New Product Introduction
Then, from 2005 onward, thanks to better connectivity and newcomers in the PLM market, the PLM solution landscape became broader. According to these newcomers, CAD integrations were not a necessary part of the PLM scope, as they focused on governance (New Product Introduction), the Bill of Materials, or the front end of the product design cycle, connecting systems engineering by adding requirements management to their PLM suites.
New players in this domain were SAP and Aras, followed by Autodesk – their focus was more metadata-driven, connecting disciplines and creating an end-to-end data flow for the product. Autodesk started the PLM-and-cloud path.
These new capabilities indeed brought a broader scope for PLM. However, they also strengthened the idea that PLM is there for engineers; for management, it was too complicated, unless they understood the value of coordinated collaboration. Large enterprises saw the benefit of having common processes for PLM as an essential reason to invest in PLM. The graph below shows the potential of PLM, where the shaded area indicates the potential revenue benefits.
Still, this graph does not provide "hard numbers," and it requires visionaries to get a PLM implementation explained and justified across the board. PLM is framed as expensive, even though the budgets spent on PLM are twenty percent or less compared to ERP implementations. As PLM is not about transactional data, the effects of PLM are hard to benchmark. Success has many fathers, and in case of difficulties, the newcomer is to blame.
PLM = IoT?
With the new possibilities of connectivity at the machine level (IoT or IIoT), a new paradigm related to PLM was created by PTC: PLM equals IoT – read more here.
Through IoT, it became possible to connect to products/assets in the field, and the simplified message from PTC was that, thanks to IoT (read: ThingWorx), PLM was now really possible, releasing traditional PLM from its engineering boundaries. The connected sensors created the possibility to build and implement more advanced and flexible manufacturing processes, often called Smart Manufacturing or Industrie 4.0.
None of the traditional PLM vendors is talking about PLM solely anymore. Digital transformation is a topic discussed at the board level, where GE played a visionary role with their strong message for change, driven by their CEO Jeff Immelt at that time – have a look at one of his energizing talks here.
However, is PLM part of this discussion?
Digital Transformation opened a new world for everyone. Existing product lifecycle concepts could be changed; products are becoming systems, interacting with the environment through software features. Systems can be updated/upgraded relatively fast, in particular when you are able to watch and analyze the performance of your assets in almost real time.
All consultants (me included) like to talk about digital transformation as it creates a positive mood towards the future, imagining everything that is possible. And with the elite of PLM consultants we are discovering the new roles of PLM – see picture below:
Is PLM equal to IoT or Digital Transformation?
I firmly believe the whole Digital Transformation and IoT hypes are unfortunately obfuscating the actual needs of a digital enterprise. The IoT focus only exposes the last part of the lifecycle, disconnected from the concept and engineering cycles – yes, on PowerPoint slides there might be a link. Re-framing PLM as Digital Transformation makes it even vaguer, as we discussed during the CIMdata / PDT Europe conference last October. My main argument: companies fail to create a link between their digital operations and their dreams because current engineering processes and data, hardware (mechanical and electronics) combined with software, are still operating in an analog, document-driven mode.
PLM = MBSE?
However, what we also discussed during this conference was that there is actually a need for an end-to-end model-based systems engineering infrastructure to support the full product lifecycle. Don Farr's (Boeing) new way to depict the classical systems engineering "V" also hinted in that direction. See the image below – a connected environment between the virtually modeled world and the physical world at any time of the product lifecycle.
So could MBSE be the new naming for PLM?
The problem, as Peter Bilello also mentioned during the CIMdata/PDT conference, is that the word "ENGINEERING" is in Model-Based Systems Engineering, thereby keeping the work the PLM "elite" is doing in the engineering box again.
So perhaps Model-Based Enterprise as the new name?
Unfortunately, MBE already has two current definitions – look here and here. There is already too much confusion, and there are a lot of people who like confusion. See Model-Based – The confusion. So any abbreviation with Model-Based terminology in it will not get attention at the board level. Even if it is crucial, the words Model-Based create less excitement compared to Digital Twin, although the Digital Twin depends on a model-based approach.
Conclusion
Creating and maintaining unique products and experiences for their customers is the primary target of almost every company. However, there is no easy acronym that frames these aspects as value at the board level. Perhaps PID – the Product Innovation Diamond approach – will be noticed? Your say….
This is almost my last planned post related to the concepts of model-based. After having discussed Model-Based Systems Engineering (needed to develop complex products/systems including hardware and software) and Model-Based Definition (creating an efficient connection between Engineering and Manufacturing), my last post will be related to the most over-hyped topic: The Digital Twin
There are several reasons why the Digital Twin is over-hyped. One reason is that the Digital Twin is not necessarily considered a PLM-related topic. Other vendors, like SAP (the network of digital twins), Oracle (digital twins for IoT applications) and GE with their Predix platform, also contributed to the hype around the digital twin. The other reason is that the concept of the Digital Twin is a great idea for marketers to shine above the clouds. A recent comment from Monica Schnitger says it all in her post 5 quick takeaways from Siemens Automation summit. Monica's takeaway related to the Digital Twin:
The whole digital twin concept is just starting to gain traction with automation users. In many cases, they don’t have a digital representation of the equipment on their lines; they may have some data from the equipment OEM or their automation contractors but it’s inconsistent and probably incomplete. The consensus seemed to be that this is a great idea but out of many attendees’ immediate reach. [But it is important to start down this path: model something critical, gather all the data you can, prove benefit then move on to a bigger project.]
Monica is aiming at the same point I have mentioned several times: there is no digital representation, and the existing data is inconsistent. Don't wait: The importance of accurate data – act now!
What is a digital twin?
I think there are various definitions of the digital twin, and I do not want to go into a definition debate like we had before with the acronyms MBD/MBE (Model-Based Definition/Enterprise – the confusion) or even the acronym PLM (classical PLM or digital PLM?). Let's agree on the following high-level statements:
- A digital twin is a virtual representation of a physical product
- The virtual part of the digital twin is defined by what you want to analyze, simulate, predict related to the physical product
- One physical product can have multiple digital twins; only in the ideal world is there potentially a unique digital twin for every physical product in the world
- When a product interacts with the environment, based on inputs and outputs, we normally call it a system. When I use the word Product, it will most of the time be a System, in particular in the context of a digital twin
Given the above statements, I will give some examples of digital twin concepts:
As a cyclist, I am active on platforms like Garmin and Strava, using a tracking device, a heart monitor and a power meter. During every ride, my device plus the sensors measure my performance, and all the data is uploaded to the platform, providing me with a report of where I rode, how fast, and my heartbeat, cadence and power during the ride. On Strava, I can see the Flybys (other digital twins that crossed my path and their performances), I can see per segment how I performed compared to others, and I can filter by age, by level, etc.
This is the easiest part of a digital twin. Every individual can monitor and analyze their personal behavior and discover trends. Additionally, the platform owner has all the intelligence about all cyclists around the world: how they perform and what would be the best performance per location. And based on their Premium offering (where you pay), they can give you advanced advice on how to improve. This is the Strava business model, bringing value to the individual while learning from the behavior of thousands. Note: in this scenario, there is no 3D involved.
Another well-known digital twin story is related to plants in operation. In the past 10 years, I have been advocating for Plant Lifecycle Management (PLM for owner/operators), describing the value of a virtual plant model using PLM capabilities combined with Maintenance, Repair and Overhaul (MRO) in order to reduce downtime. In a nuclear environment, the use of 3D verification, simulation and even control software in a virtual environment can bring great benefit, because the physical twin is not always accessible and downtime can cost several million per week.
The above examples illustrate two types of digital twins. I will discuss some of their characteristics in the next paragraphs.
Digital Twin – performance focus
Companies like GE and SAP focus a lot on the digital twin in relation to asset performance. By measuring the performance of assets and comparing it with that of similar assets, the collector of the data can sell predictive maintenance analysis, performance optimization guidance and potentially other value offerings to their customers.
Small improvements in the range of a few percent can have a big impact on the overall net results. The digital twin is crucial in this business model: build up knowledge, collect and analyze it, and sell the knowledge again. This type of scenario is the easiest one. You need products with sensors, an infrastructure to collect the data, and a way to extract and process the information so it can be linked to a behavior model with parameters that influence the model.
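As a sketch of that last step (with hypothetical names and a deliberately simple behavior model), the flow is: take a sensor reading, compare it against what the parameterized model predicts, and flag assets that drift away from expected behavior:

```python
# Hypothetical sketch: compare the measured output of an asset against a
# simple parameterized behavior model and flag deviations worth a closer look.

EXPECTED_EFFICIENCY = 0.92   # model parameter, calibrated from fleet data
ALERT_THRESHOLD = 0.05       # relative deviation that triggers an alert

def expected_output(power_input_kw):
    """Behavior model: predicted output for a given input."""
    return power_input_kw * EXPECTED_EFFICIENCY

def check_asset(asset_id, power_input_kw, measured_output_kw):
    predicted = expected_output(power_input_kw)
    deviation = (predicted - measured_output_kw) / predicted
    if deviation > ALERT_THRESHOLD:
        return f"{asset_id}: output {deviation:.1%} below model - schedule inspection"
    return f"{asset_id}: within expected range"

print(check_asset("pump-17", power_input_kw=120.0, measured_output_kw=104.0))
```

In reality the behavior model is far richer than a single efficiency constant, but the business logic (measure, compare, advise) stays the same.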

(Image: SAP blogs)
This is the model-based part of the digital twin. For a single product, there can be different models related to the parameters driving your business: performance parameters for output, parameters for optimal uptime (preventive maintenance and usage optimization), parameters related to environmental impact, and so on (see the small sketch below). Building and selling the results of such a model is an add-on business, creating more value for your customer combined with creating more loyalty. Using the digital twin with a performance focus does not require a company to totally change the way it works. Yes, you need new skills, data collection and analysis, and more sensor technology, but a lot of the product development activities can remain the same (for the moment).
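The sketch below (all parameters invented) illustrates the point that one physical product can carry several twin models, each answering a different business question:

```python
# Hypothetical sketch: one physical product, several digital-twin models,
# each with parameters for a different business driver.

twin_models = {
    "performance": {"efficiency": 0.92},            # output optimization
    "uptime":      {"wear_rate_per_hour": 0.0002},  # preventive maintenance
    "environment": {"co2_kg_per_kwh": 0.4},         # environmental impact
}

def remaining_hours(operating_hours, params, wear_budget=1.0):
    """Uptime model: hours left before the wear budget is consumed."""
    used = operating_hours * params["wear_rate_per_hour"]
    return max(0.0, (wear_budget - used) / params["wear_rate_per_hour"])

# After 3200 operating hours, how long until preventive maintenance is due?
print(remaining_hours(3200, twin_models["uptime"]))  # -> 1800.0 hours left
```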
As a conclusion for this type of digital twin, I would state: yes, there is some PLM involved, but the main focus is on business execution.
As I have already reached more than 1000 words, I will focus in my next post on the most relevant digital twin for PLM. There, all disciplines come together: the 3D mechanical model, the behavior models, the embedded and control software, (manufacturing) simulation and more, all to create an almost perfect virtual copy of a real product or system in the physical world. And there we will see that this is not as easy, as these concepts depend on accurate data and reliable models, which is currently not the case in the engineering environments of most companies.
Conclusion
The Digital Twin is marketing hype; however, when you focus only on performance monitoring and tuning, it becomes a reality, as it does not require a company to align in a digital manner across the whole lifecycle. Still, this is just the beginning of a real digital twin.
Where are you in your company with the digital twin journey?
Last week I posted my first review of the PDT Europe conference. You can read the details here: The weekend after PDT Europe (part 1). There were some questions related to the abbreviation PDT. Looking into the history of PDT, you will discover it stands for Product Data Technology. Yes, there are many TLAs in this world.
Microsoft’s view on the digital twin
Now back to the conference. Day 2 started with a remote session from Simon Floyd. Simon is Microsoft's Managing Director for Manufacturing Industry Architecture Enterprise Services and a frequent speaker at PDT. Simon shared with us Microsoft's viewpoint on the Digital Twin, the strategy to implement a Digital Twin, the maturity status of several of their reference customers and the areas these companies are focusing on. From these customers it was clear that most companies focus on retrieving data related to maintenance, providing analytics and historical data. Futuristic scenarios, like using the digital twin for augmented reality or design validation, are still further out. As I discussed in the earlier post, this matches my observations: creating a digital thread between products in operation is considered a quick win, whereas establishing an end-to-end relationship between products in operation and their design requires many steps to fix. Read my post: Why PLM is the forgotten domain in digital transformation.
When discussing the digital twin architecture, Simon made a particular point about the standards required to connect the results from products in the field. Connecting a digital twin in a vendor-specific framework will create legacy, vendor lock-in, and a less open environment for using digital twins. A point that I also raised in my presentation later that day.
Simon concluded with a great example of potential future Artificial Intelligence, where an asset, based on its measurements, predicts a failure before the scheduled maintenance stop and therefore requests to run at lower performance so it can reach the maintenance stop without disruption.
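The decision logic behind such a scenario can be pictured with a small sketch (all numbers, names and the load/lifetime relation are invented assumptions): if the predicted time to failure at the current load ends before the next maintenance stop, step down the operating level until the asset can reach the stop.

```python
# Hypothetical sketch of the derating decision: find the highest operating
# level at which the asset is still predicted to reach the maintenance stop.

def choose_operating_level(hours_to_maintenance, predicted_hours_to_failure,
                           levels=(1.0, 0.8, 0.6)):
    """Assume time-to-failure scales inversely with load (a strong simplification)."""
    for level in levels:  # try the highest output level first
        if predicted_hours_to_failure / level >= hours_to_maintenance:
            return level
    return levels[-1]  # even the lowest level may not suffice; escalate

# Failure predicted in 90 h at full load, maintenance stop planned in 120 h:
print(choose_operating_level(120, 90))  # -> 0.6 (run at 60% to reach the stop)
```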
Closing the lifecycle loop
Sustainability and the circular economy have been themes at PDT for some years now, too. In his keynote speech, Torbjörn Holm from Eurostep took us through the global megatrends (Hay Group 2030) and the technology trends (Gartner 2018) and showed that technology would be a good enabler for addressing several of the global trends.
Next, Torbjörn took us through the reasons for, and possibilities (methodologies and tools) of, product lifecycle circularity as developed in the ResCoM project, in which Eurostep participated.
The ResCoM project (Resource Conservative Manufacturing) was co-funded by the European Commission and recently concluded. More info at www.rescom.eu
Torbjörn concluded by discussing the necessary framework for the Digital Twin and Digital Thread(s), which should be based on a Model-Based Definition, with ISO 10303 as the best candidate.
Later in the afternoon, there were three sessions in a separate track related to design optimization for value, circularity and re-use, followed by a panel discussion. Unfortunately, I participated in another track, so I still have to digest the provided materials. Speakers in that track were Ola Isaksson (Chalmers University), Ingrid de Pauw & Bram van der Grinten (IDEAL&CO) and Michael Lieder (KTH Sweden).
Connecting many stakeholders
Rebecca Ihrfors, CIO of the Swedish Defense Materiel Administration (FMV), shared her plans for transforming the IT landscape to harmonize the currently existing environments and to become a broker between industry and the armed forces (FM). As many of the assets now come with their own data sets and PDM/PLM environments, the overhead of keeping up all these proprietary environments is too expensive and fragmented. FMV wants to harmonize the data it retrieves from industry and the way it offers that data to the armed forces in a secure way. There is a need for standards and interoperability.
The positive point from this presentation was that several companies in the audience that deliver products to the Swedish Defense could start to share and adapt their viewpoints on how they could contribute.
Later in the afternoon, there were three sessions in a separate track related to standards for MBE interoperability and openness, which would fit very well in this context. Brian King (Koneksys), Adrian Murton (Airbus UK) and Magnus Färneland (Eurostep) provided various inputs, and as I did not attend these parallel sessions, I will dive deeper into their presentations at a later time.
PLM something has to change – bimodal and more
In my presentation, which you can download from SlideShare here: PLM – something has to change, my main points were related to the fact that companies apparently understand that something needs to happen to really benefit from a digital enterprise. The rigidness of large enterprises and their inhibitors to transform are more related to human factors and incompatibility with the future.
How to deal with this incompatibility was also the theme for Martin Eigner’s presentation (System Lifecycle Management as a bimodal IT approach) and Marc Halpern’s closing presentation (Navigating the Journey to Next Generation PLM).
Martin Eigner’s consistent story was about creating an extra layer on top of the existing (Mode 1) systems and infrastructure, which he illustrated by a concept developed based on Aras.
By providing a new digital layer on top of the existing enterprise, companies can start evolving to a modern environment, where, in the long-term, old Mode 1 systems will be replaced by new digital platforms (Mode 2). Oleg Shilovitsky wrote an excellent summary of this approach. Read it here: Aras PLM platform “overlay” strategy explained.
Marc Halpern closed the conference describing his view on how companies could navigate to the Next Generation PLM by explaining in more detail what the Gartner bimodal approach implies. Marc’s story was woven around four principles.
Principle 1 was the bimodal strategy itself.
Principle 2 was about Mode 1 thinking in an evolutionary model. Every company has to go through maturity states in its organization, starting from ad hoc, via departmental and enterprise-based, to harmonized and fully digitally integrated. These maturity steps also have to be taken into account when planning future steps.
Principle 3 was about organizational change management, a topic often neglected or underestimated by product vendors and service providers, as it relates to a company's culture, which is not easy to change or navigate in a particular direction.
Finally, Principle 4 was about Mode 2 activities. Here an organization should pilot (in a separate environment), certify (make sure it is a realistic future), adopt (integrate it in your business) and scale (enable this new approach to exist and grow for the future).
Conclusions
This post concludes my overview of PDT Europe 2017. Looking back, there was quite an aligned view of where we are all heading with PLM and related topics. There is the hype and there is reality, and I believe this conference was about reality, giving all attendees good feedback on what is really happening and understood in the field. And of course, there is the human factor, which is hard to influence.
Share your experiences and best practices related to moving to the next generation of PLM (digital PLM?)!
PDT Europe is over, and it was a surprisingly aligned conference this year, showing that ideas and concepts for modern PLM align more and more. Håkan Kårdén opened the conference mentioning the event was fully booked: about 160 attendees from over 19 countries. With a typical attendance of approx. 120 participants, this showed that the theme of the conference, Continuous Transformation of PLM to support the Lifecycle Model-Based Enterprise, was very attractive and real. You can find a history of tweets by following the hashtag #pdte17
Setting the scene
Peter Bilello from CIMdata kicked off by bringing some structure to the various Model-Based areas and the Digital Thread. Peter started by mentioning that technology is the least important issue, as organizational culture, changing processes and adapting people's skills are more critical factors for a successful adoption of modern PLM. Something that would repeatedly be confirmed by other speakers during the conference.
Peter presented a nice slide bringing the Model-Based terminology together on one page. Next, Peter took us through the various digital threads in the different stages of the product lifecycle. He concluded with the message that we are still in a learning process, redefining optimal processes for PLM using Model-Based approaches and Digital Threads, and that thanks (or due) to digitalization these changes will be rapid. He ended with an overall conclusion that we should keep in mind:
It isn’t about what we call digitalization; It is about delivering value to customers and all other stakeholders of the enterprise
Next, Marc Halpern busted the Myth of Digital Twins (according to his session title) and looked into planning them realistically. I am not sure if Marc smashed any of the myths, although it is clear the Digital Twin is at the top of the hype cycle and we are all starting to look for practical implementations. A digital twin can have many appearances, depending on its usage. For sure, it is not just a virtual 3D model.
There are still many areas to consider when implementing a digital twin for your products. Depending on what you connect between the virtual and the physical model, and how, you have to consider where your vendor really is in maturity and avoid lock-in to their approach. In particular in these early stages, you cannot be sure which technology will last, and data ownership and confidentiality will play an important role. And as opposed to chasing quick wins, make sure your digital twin is open and uses open standards as much as possible to stay open for the future, which also means keep aiming to work with multiple vendors.
Industry sessions
Next, we had industry-focused sessions related to the lifecycle Model-Based enterprise, and later in the afternoon a session from Outotec with the title: Managing Installed Base to Unlock Service Opportunities.
The first presentation, from Väino Tarandi, professor in IT in Construction at KTH Sweden, covered his findings related to BIM and GIS in the context of the lifecycle: a test bed where PLCS meets IFC. Interesting, as I have been involved in BIM Level 3 discussions in the UK, which was already an operational challenge for stakeholders in the construction industry and is now extended with the concept of the lifecycle. So far these projects are at the academic level, and I am still waiting for companies to push and discover the full benefits of an integrated approach.
Concepts for the industrial approach could be learned from Outotec, as you will understand later in this post. Of course, the difference is that Outotec is aiming for data ownership along the lifecycle, whereas in the construction industry each silo is often handled by a different contractor.
Fredrik Ekström from the Swedish Transport Administration shared his challenges of managing assets for both road and railway transport. I have worked around this domain in the Netherlands, where asset management for road infrastructure and asset management for rail infrastructure are handled by two different organizations. I believe Fredrik (and similar organizations) could learn from the concepts in other industries. Outotec's example, again, is about having relevant information to increase service capabilities, where the Swedish Transport Administration is aiming to have the right data for its services. Looking at the challenges Fredrik reported, I assume he can find the answers in other industries' concepts.
Outotec’s presentation related to managing installed base and unlock service opportunities explained by Sami Grönstrand and Helena Guiterrez was besides entertaining easy to digest content and well-paced. Without being academic, they explained somehow the challenges of a company with existing systems in place moving towards concepts of a digital twin and the related data management and quality issues. Their practical example illustrated that if you have a clear target, understanding better a customer specific environment to sell better services, can be achieved by rational thinking and doing, a typical Finish approach. This all including the “bi-modal approach” and people change management.
Future Automotive
Ivar Hammarstadt, Senior Analyst Technology Intelligence for Volvo Cars Corporation, entertained us with a projection toward the future based on 160 years of automotive industry history. Interesting, as electric did not seem to be the only way to go for a sustainable future, depending on operational performance demands.
Next, Jeanette Nilsson and Daniel Adin from Volvo Group Trucks shared their findings from an evaluation project of more than a year, in which they evaluated the major PLM vendors (Dassault Systèmes, PTC, Siemens) on their out-of-the-box capabilities related to 3D product documentation and manufacturing.
They concluded that none of the vendors was able to support the full Volvo Trucks complexity in an out-of-the-box manner. It was also a good awareness project for the Volvo Trucks organization, showing that a common system for 3D geometry reduces the need for data transfers and manual data validation: cross-functional iterations can start earlier, and more iterations can be performed, which supports shorter lead times and improves product quality. Personally, I believe this was a rather expensive approach to create awareness for such a conclusion, pushing PLM vendors into a competitive pre-sales position for so much detail.
Future Aerospace
Kenny Swope from Boeing talked us through Boeing's potential journey towards a Model-Based Enterprise. Boeing has always been challenging itself and its partners to deliver environments close to what is possible. Look at the Boeing journey and you can see that already in 2005 they were aiming for an approach that most current manufacturing enterprises cannot meet. And now they are planning their future state.
To approach the future state, Boeing aims to align its business with a single architecture for all aspects of the company. Starting with collecting capabilities (over 400, in 6 levels) and defining value streams (strategic/operational), the next step is mapping the capabilities to the value streams. Part of the process is to look at the components of a value stream and determine whether they could be fulfilled by a service. In this way, you design your business for a service-oriented architecture, still independent of any system constraints (see the sketch below). As Kenny stated, the aerospace and defense industry has a long history and is therefore slow to change, as its culture is rooted in the organization. It will be interesting to learn from Kenny next year how much progress towards a model-based enterprise has been achieved and which values have been confirmed.
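As a toy sketch of this mapping exercise (capability and value-stream names are invented; Boeing's real model is far larger), the core of the method is a many-to-many mapping plus a per-capability question: could this be delivered as a service?

```python
# Toy sketch of capability-to-value-stream mapping; all names invented.
capabilities = {
    "change_management": {"level": 2, "service_candidate": True},
    "3d_visualization":  {"level": 3, "service_candidate": True},
    "export_control":    {"level": 1, "service_candidate": False},
}

value_streams = {
    "design_to_release":    ["change_management", "3d_visualization"],
    "comply_to_regulation": ["export_control", "change_management"],
}

# For each value stream, count how many of its capabilities could be services.
for stream, caps in value_streams.items():
    services = [c for c in caps if capabilities[c]["service_candidate"]]
    print(f"{stream}: {len(services)}/{len(caps)} capabilities could be services")
```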
Gearing up for day 2
Martin Eigner took us at high speed through his vision and experience of working in a bimodal approach with Aras to support legacy environments, with a modern federated layer to support the complexity of a digital enterprise where the system architecture is leading. I will share more details on these concepts in my next post, as during day 2 of PDT Europe both Marc Halpern and I talked about this topic, and I will combine it into a more extended story.
The last formal presentation of day one was from Nigel Shaw of Eurostep Ltd, who took us through the journey of challenges for a model-based enterprise. As there will not be a single model that defines everything, it is clear that various models and derived models will exist for a product/system. Interesting was Nigel's slide showing the multiple models different disciplines can have of an airplane (from 1948), similar to the famous "swing" cartoon, used to illustrate that every single view can be entirely different from the purpose of the product.
The next challenge is whether these models are consistent and still describe the same initially specified system. On top of that, even the use of various modeling techniques and tools will lead to differences in the system. And the final challenge on top is managing change over the system's lifecycle. From here, Nigel stepped into the need for digital threads to govern the relations between the various views per discipline and lifecycle stage, not only between the physical and the virtual twin. Comparing the needs of a model-based enterprise through its lifecycle, Nigel concluded that using PLCS as a framework provides an excellent fit to manage such complexity.
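One way to picture such a digital thread is as a graph of typed links between the various models that describe the same system. The sketch below is my own minimal illustration (all identifiers and relation names invented), not PLCS itself:

```python
# Minimal, hypothetical digital-thread sketch: typed links between the
# various models (views) describing the same system.
thread = [
    # (source element, relation, target element)
    ("requirement:R-104",  "verified_by",     "simulation:thermal-v3"),
    ("system:engine-A",    "described_by",    "cad:engine-A-rev4"),
    ("system:engine-A",    "described_by",    "simulation:thermal-v3"),
    ("cad:engine-A-rev4",  "manufactured_as", "physical:engine-SN0017"),
]

def views_of(element):
    """All models linked to an element - the thread to follow after a change."""
    return [(rel, tgt) for src, rel, tgt in thread if src == element]

print(views_of("system:engine-A"))
```

A standard like PLCS aims to provide a neutral information model for exactly this kind of lifecycle data, so the thread can survive individual tools and vendors.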
Finally, after a panel discussion (more a collection of opinions, as the target was not necessarily to align in such a short time), it was time for the PDT dinner, always an excellent way to share thoughts and verify them with your peers.
Conclusion
Day 1 was over before you knew it, without a moment of boredom, and I hope the same goes for this post. Next week I will close my review of the PDT conference with some more details about my favorite topics.