After a short summer break with hardly a mention of the word PLM, it is time to continue this series of posts exploring the future of “connected” PLM. For those who, like me, return from the break with a refreshed memory, here is a short recap:
In part 1, I rushed through more than 60 years of product development, starting from vellum drawings and ending with the current PLM best practice for product development: the item-centric approach.
In part 2, I painted a high-level picture of the future, introducing the concept of digital platforms, which, if connected wisely, could support the digital enterprise in all its aspects. The five platforms I identified are the ERP and CRM platforms (the oldest domains), the MES and PIP platforms (modern domains that support manufacturing and product innovation in more detail), and the IoT platform (needed to support connected products and customers).
In part 3, I explained what data-driven means and how it is closely connected to a model-based approach. Here we abandon documents (electronic files) as active information carriers. Documents will remain, however, as reports, baselines, or information containers. That post ended with seven topics related to data-driven, which I will discuss in upcoming posts.
Hopefully, by describing these topics – and for sure, there are more related topics – we will better understand the connected future and make decisions to enable the future instead of freezing the past.
Topic 1 for this post:
Data-driven does not imply that there needs to be a single environment, a single database that contains all information. As I mentioned in my previous post, it is about managing connected datasets in a federated manner. It is no longer about owning the data; it is about access to reliable data.
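To make the idea of connected, federated datasets a bit more tangible, here is a minimal Python sketch. All platform names, keys and values are hypothetical; the point is only that a record holds links to datasets in their owning platforms and resolves them on demand, instead of copying everything into one big database.

```python
from dataclasses import dataclass, field

# Hypothetical in-memory stand-ins for the owning platforms (PIP, ERP, MES).
PLATFORMS = {
    "PIP": {"EBOM-4711": {"revision": "B", "parts": ["P-100", "P-200"]}},
    "ERP": {"ITEM-P-100": {"stock": 42, "supplier": "ACME"}},
    "MES": {"SN-0001": {"produced": "2021-05-20", "status": "released"}},
}

@dataclass
class DataLink:
    """Reference to a dataset that stays in its owning platform."""
    platform: str
    key: str

@dataclass
class ConnectedRecord:
    """A record that connects datasets instead of copying them."""
    name: str
    links: list = field(default_factory=list)

def resolve(link: DataLink) -> dict:
    """Fetch the current data from the owning platform (single source of truth)."""
    return PLATFORMS[link.platform][link.key]

pump = ConnectedRecord("Pump assembly", [
    DataLink("PIP", "EBOM-4711"),
    DataLink("ERP", "ITEM-P-100"),
    DataLink("MES", "SN-0001"),
])

for link in pump.links:
    print(link.platform, resolve(link))   # always the owner's current values
```

Everyone with access reads the owner's latest values; nobody works on a private copy.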
Platform or a collection of systems?
One of the first (marketing) hurdles to take is understanding the difference between a genuine data platform and a collection of systems that work together, sold as a platform.
In 2017, CIMdata published an excellent whitepaper positioning the PIP (Product Innovation Platform): Product Innovation Platforms: Definition, Their Role in the Enterprise, and Their Long-Term Viability. CIMdata's definition is extensive and covers the full scope of product innovation. Of course, you can also find platforms that start from a more focused process.
For example, look at OpenBOM (focus on BOM collaboration), OnShape (focus on CAD collaboration) or even Microsoft 365 (historical, document-based collaboration).
The idea behind a platform is that it provides basic capabilities connected to all stakeholders, inside and outside your company. In addition, to avoid these capabilities being limited, a platform should be open and able to connect with other data sources, whether they are available locally or centrally.
From these characteristics, it is clear that the underlying infrastructure of a platform must be based on a multitenant SaaS infrastructure, still allowing local data to be connected and shielded for performance or IP reasons.
The picture below describes the business benefits of a Product Innovation Platform as imagined by Accenture in 2014.
Link to CIMdata's 2014 commentary on Digital PLM HERE.
Sometimes vendors sell their suite of systems as a platform. This is a marketing trick: when you want to add functionality to your PLM infrastructure, you need to install a new system and create or use interfaces with the existing systems, which is not really a scalable environment.
In addition, the collaboration between systems in such a marketing platform is sometimes managed through proprietary exchange (file) formats, a practice we saw in the construction industry before cloud connectivity became available. A so-called end-to-end solution that only exists in PowerPoint requires a lot of human intervention when implemented in real life.
Not a single environment
There has always been the debate:
“Do I use best-in-class tools, supporting the end-user of the software, or do I provide an end-to-end infrastructure with more generic tools on top of that, focusing on ease of collaboration?”
In the system approach, the focus was most of the time on best-in-class tools, with the PLM system providing the data governance. A typical example is the item-centric approach. It reflects the current working culture: people working in their optimized siloes, exchanging information between disciplines through (neutral) files.
The platform approach makes it possible to deliver an optimized user interface for the end-user through a dedicated app, assuming the data needed for such an app is accessible from the current platform or through other systems and platforms.
It might be tempting for a platform provider to add as many imaginable data elements as possible to their platform infrastructure. The challenge with this approach is whether all data should be stored in a central (preferably cloud) data environment or federated. And what about filtering IP?
In my post PLM and Supply Chain Collaboration, I described the concept of having an intermediate hub (ShareAspace) between enterprises to facilitate real-time data sharing, while carefully filtering which data is shared in the hub.
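A minimal sketch of the filtering idea behind such a hub (field names and values are hypothetical): only a whitelisted subset of an internal item is published to the shared environment, so the IP stays inside the company.

```python
# Fields the company is willing to expose to the supply-chain hub.
SHARED_FIELDS = {"part_number", "revision", "quantity", "need_date"}

internal_item = {
    "part_number": "P-100",
    "revision": "B",
    "quantity": 25,
    "need_date": "2021-10-01",
    "cost_price": 12.80,          # internal only
    "design_rationale": "...",    # internal only (IP)
}

def publish_to_hub(item: dict, shared_fields=SHARED_FIELDS) -> dict:
    """Return only the whitelisted attributes for the shared hub."""
    return {k: v for k, v in item.items() if k in shared_fields}

print(publish_to_hub(internal_item))
```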
It may be clear that storing everything in one big platform is not the future. As I described in part 2, in the end, a company might implement a maximum of five connected platforms (CRM, ERP, PIP, IoT and MES). Each of the individual platforms could contain a core data model relevant for its part of the business. This does not imply there will be no other platforms in the future. Platforms focusing on supply chain collaboration, like ShareAspace or OpenBOM, will have a value proposition too. In the end, the long-term future is all about realizing a digital thread of information within the organization.
Will we ever reach a perfectly connected enterprise or society? Probably not. Not because of technology but because of politics and human behavior. The connected enterprise might be the most efficient architecture, but will it be social, supporting all of humanity? Predicting the future is impossible, as Yuval Harari described in his book 21 Lessons for the 21st Century. Worth reading, even though it remains a collection of ideas.
Proprietary data model or standards?
So far, as a software vendor developing a system, you face no restrictions on how you manage your data internally. In the PLM domain, this has meant that every vendor has its own proprietary data model and behavior.
I have learned from my 25+ years of experience with systems that the original design of a product combined with the vendor's culture defines the future roadmap. So even if a PLM vendor were to rewrite all their software to become data-driven, the ways of working and the assumptions would still be based on past experiences.
This makes it hard to arrive at unified data models and a methodology valid for our PLM domain. However, large enterprises like Airbus and Boeing and the major automotive suppliers have always pushed for standards, as they benefit the most from standardization.
The recent PDT conferences were an example of this, mainly the 2020 Fall conference. Several Aerospace & Defense PLM Action groups reported their progress.
You can read my impression of this event in The weekend after PLM Roadmap / PDT 2020 – part 1 and The next weekend after PLM Roadmap PDT 2020 – part 2.
It would be interesting to see a Product Innovation Platform built upon a data model aligned as much as possible to existing standards. It probably won't happen, as a software vendor does not make money from being open and complying with standards. Still, companies should push their software vendors to support standards, as this is the only way to get larger connected ecosystems.
I do not believe in the toolkit approach where every company can build its own data model based on its current needs. I have seen this flexibility with SmarTeam in the early days. However, it became an upgrade risk when new, overlapping capabilities were introduced that did not match the past.
In addition, a flexible toolkit still requires a robust data model design done by experienced people who have learned from their mistakes.
The benefit of using standards is that they contain the learnings from many people involved.
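To illustrate why a shared, standardized vocabulary helps, here is a small hypothetical sketch: two vendor-specific records are mapped onto one neutral set of attribute names so that downstream consumers do not need to know each vendor's proprietary model. The mappings and field names are invented for illustration only and do not represent any real vendor's schema.

```python
# Hypothetical vendor-specific attribute names mapped to a neutral vocabulary.
VENDOR_TO_NEUTRAL = {
    "vendor_a": {"ItemNo": "part_number", "Rev": "revision", "Descr": "description"},
    "vendor_b": {"number": "part_number", "version": "revision", "name": "description"},
}

def to_neutral(record: dict, vendor: str) -> dict:
    """Translate a proprietary record into the neutral (standard-like) form."""
    mapping = VENDOR_TO_NEUTRAL[vendor]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = to_neutral({"ItemNo": "P-100", "Rev": "B", "Descr": "Pump"}, "vendor_a")
b = to_neutral({"number": "P-100", "version": "B", "name": "Pump"}, "vendor_b")
assert a == b   # both systems now speak the same neutral language
```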
Conclusion
I did not enjoy writing this post so much, as my primary PLM focus lies on people and methodology. Still, understanding future technologies is an important point to consider. Therefore, this time, a not-so-exciting post. There is enough to read on the internet related to PLM technology; see some of the recent articles below. Enjoy!
Matthias Ahrens shared: Integrated Product Lifecycle Management (Google translated from German)
Oleg Shilovitsky wrote numerous articles related to technology; in this context:
- 3 Challenges of Unified Platforms and System Locking
- SaaS PLM Acceleration Trends
So far, I have been discussing PLM experiences and best practices that have changed due to the introduction of electronic drawings and affordable 3D CAD systems for the mainstream: from vellum to PDM to item-centric PLM to manage product designs and manufacturing specifications.
Although the technology has improved, the overall processes haven’t changed so much. As a result, disciplines could continue to work in their own comfort zone, most of the time hidden and disconnected from the outside world.
Now, thanks to digitalization, we can connect and share information in real time. We can give every stakeholder in the company's business almost real-time visibility into what is happening (if allowed). We have seen the benefits of platformization, where the value comes from real-time connectivity within an ecosystem.
Apple, Amazon, Uber and Airbnb are the non-manufacturing examples. Companies are trying to replicate these models for other businesses, connecting the concept owner (OEM?) with design and manufacturing (services), with suppliers and with customers. All are connected through information managed in data elements instead of documents – I call it connected PLM.
Vendors have already shared their PowerPoints, movies, and demos of how the future would look in the ideal world using their software. The reality, however, is that implementing such solutions requires new business models, a new type of organization and probably new skills.
The last point is vital, as in schools and organizations, we tend to teach what we know from the past as this gives some (fake) feeling of security.
The reality is that most of us will have to go through a learning path, where skills from the past might become obsolete; however, knowledge of the past might be fundamental.
In the upcoming posts, I will share with you what I see, what I deduce from that and what I think would be the next step to learn.
I firmly believe connected PLM requires the usage of various models. Not only the 3D CAD model, as there are so many other models needed to describe and analyze the behavior of a product.
I hope that some of my readers can help us all further on the path of connected PLM (with a model-based approach). This series of posts will be based on the maximum size per post (avg. 1500 words) and the ideas and contributions coming from you and me.
What is platformization?
In our day-to-day life, we are more and more used to direct interaction between resellers and service providers on one side and consumers on the other side. We have a question, and within 24 hours, there is an answer. We want to purchase something, and potentially the next day the goods are delivered. These are examples of a society where all stakeholders are connected in a data-driven manner.
We don't have to create documents or specialized forms. An app or a digital interface allows us to connect. To enable this type of connectivity, there is a need for an underlying platform that connects all stakeholders. Amazon and Salesforce are examples of platforms for commercial activities, Facebook for social activities and, in theory, LinkedIn for professional job activities.
The platform is responsible for direct communication between all stakeholders.
The same applies to businesses. Depending on the products or services they deliver, they could benefit from one or more platforms. The image below shows five potential platforms that I identified in my customer engagements. Of course, they have a PLM focus (in the middle), and the grouping can be made differently.
The 5 potential platforms
The ERP platform
is mainly dedicated to the company's execution processes – Human Resources, Purchasing, Finance, Production scheduling, and potentially many more services. As platforms try to connect all stakeholders as much as possible, the ERP platform might contain CRM capabilities, which might be sufficient for several companies. However, when the CRM activities become more advanced, it is better to connect the ERP platform to a CRM platform. The same logic is valid for a Product Innovation Platform and an ERP platform. Examples of ERP platforms are SAP and Oracle (and they will claim they are more than ERP).
Note: Historically, most companies started with an ERP system, which is not the same as an ERP platform. A platform is scalable; you can add more apps without having to install a new system. In a platform, all stored data is connected and has a shared data model.
The CRM platform
a platform that mainly focuses on customer-related activities; as you can see from the diagram, there is an overlap with capabilities from the other platforms. So again, depending on your core business and products, you might use these capabilities or connect to other platforms. Examples of CRM platforms are Salesforce and Pega, which provide a platform to further extend capabilities beyond core CRM.
The MES platform
In the past, we had PDM and ERP and what happened in detail on the shop floor was a black box for these systems. MES platforms have become more and more important as companies need to trace and guide individual production orders in a data-driven manner. Manufacturing Execution Systems (and platforms) have their own data model. However, they require input from other platforms and will provide specific information to other platforms.
For example, if we want to know the serial number of a product and the exact production details of this product (used parts, quality status), we would use an MES platform. Examples of MES platforms (non-PLM/ERP-related vendors) are Parsec and Critical Manufacturing.
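A tiny sketch of this traceability idea (serial numbers, lots and statuses are made up): given a serial number, the MES-side record returns the exact parts and quality status used for that individual product.

```python
# Hypothetical as-built records keyed by serial number.
AS_BUILT = {
    "SN-0001": {
        "product": "Pump model X",
        "used_parts": [("P-100", "LOT-17"), ("P-200", "LOT-09")],
        "quality_status": "passed",
    },
}

def production_details(serial_number: str) -> dict:
    """Return the exact parts and quality status for one serial number."""
    return AS_BUILT[serial_number]

print(production_details("SN-0001"))
```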
The IoT platform
these platforms are new and are used to monitor and manage connected products. For example, if you want to trace the individual behavior of a product or a process, you need an IoT platform. The IoT platform provides the product user with performance insights and alerts.
However, it also provides the product manufacturer with the same insights for all their products. This allows the manufacturer to offer predictive maintenance or optimization services based on the experience of a large number of similar products. Examples of IoT platforms (non-PLM/ERP-related vendors) are Hitachi and Microsoft.
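As a hypothetical illustration of such a service, the sketch below flags the units in a fleet whose telemetry trends above an assumed threshold. Real IoT platforms obviously do this at scale, with streaming data and far better analytics; the numbers and threshold here are invented.

```python
from statistics import mean

# Hypothetical telemetry: vibration readings per installed product.
FLEET_TELEMETRY = {
    "SN-0001": [0.8, 0.9, 1.1],
    "SN-0002": [0.7, 0.8, 0.8],
    "SN-0003": [1.9, 2.2, 2.5],   # trending badly
}

THRESHOLD = 1.5  # assumed alert level for this illustration

def maintenance_alerts(telemetry: dict, threshold: float = THRESHOLD) -> list:
    """Flag units whose average reading exceeds the assumed threshold."""
    return [sn for sn, values in telemetry.items() if mean(values) > threshold]

print(maintenance_alerts(FLEET_TELEMETRY))   # -> ['SN-0003']
```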
The Product Innovation Platform (PIP)
All the above platforms would not have a reason to exist if there were no environment where products are invented, developed, and managed. The Product Innovation Platform (PIP) – as described by CIMdata – is the place where Intellectual Property (IP) is created, where companies decide on their portfolio, and more.
The PIP contains the traditional PLM domain. It is also a logical place to manage product quality and technical portfolio decisions, like what kind of product platforms and modules a company will develop. Like all previous platforms, the PIP cannot exist in isolation and requires connectivity with the other platforms where applicable.
Look below at the CIMdata definition of a Product Innovation Platform.
You will see that most of the historical PLM vendors aim to be a PIP (each with their different flavor): Aras, Dassault Systèmes, PTC and Siemens.
Of course, several vendors sell more than one platform or even create the impression that everything is connected as a single platform. Usually, this is not the case, as each platform has its specific data model and combining them in a single platform would hurt the overall performance.
Therefore, the interaction between these platforms will be based on standardized interfaces or ad-hoc connections.
Standard interfaces or ad-hoc connections?
Suppose your role and information needs can be satisfied within a single platform. In that case, most likely, the platform will provide you with the right environment to see and manipulate the information.
However, it might be different if your role requires access to information from other platforms. For example, it could be as simple as an engineer analyzing a product change who needs to know the actual stock of materials to decide how and when to implement a change.
This would be a PIP/ERP platform collaboration scenario.
Or even more complex, it might be a product manager wanting to know how individual products behave in the field to decide on enhancements and new features. This could be a PIP, CRM, IoT and MES collaboration scenario if traceability of serial numbers is needed.
To support such a role, the company might decide to build a custom app or dashboard that combines, in real time, data from the relevant platforms. This can be done using standard interfaces (preferred), APIs, web services, REST services, microservices (for specialists), or the currently fashionable Low-Code development platforms, which allow users to combine data services from different platforms without being coding experts.
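A simplified sketch of such a cross-platform dashboard for the PIP/ERP scenario above. The two service functions are hypothetical stand-ins for real PIP and ERP interfaces (which would normally be REST or similar endpoints); the app only combines their answers for the engineer analyzing a change.

```python
# Hypothetical read services of the individual platforms.
def pip_change_impact(change_id: str) -> dict:
    """Stand-in for a PIP service returning the parts affected by a change."""
    return {"change": change_id, "affected_parts": ["P-100", "P-200"]}

def erp_stock(part_number: str) -> int:
    """Stand-in for an ERP service returning the current stock of a part."""
    return {"P-100": 42, "P-200": 0}[part_number]

def change_dashboard(change_id: str) -> list:
    """Combine PIP and ERP data for the engineer analyzing a change."""
    impact = pip_change_impact(change_id)
    rows = []
    for part in impact["affected_parts"]:
        stock = erp_stock(part)
        advice = "use up stock first" if stock else "switch immediately"
        rows.append({"part": part, "stock": stock, "advice": advice})
    return rows

for row in change_dashboard("ECO-123"):
    print(row)
```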
Without going too deep into technology, the topics in this paragraph require an enterprise architecture and vision. It is optimistic to think that your existing environment will evolve smoothly into a digital highway for the future by “fixing” demands per user. Your infrastructure is much more likely to end up as congested spaghetti.
In that context, I read an interesting post last week: Low code: A promising trend or Pandora's box. Have a look and decide for yourself.
I am less focused on technology, more on methodology. Therefore, I want to come back to the theme of my series: The road to model-based and connected PLM. For sure, in the ideal world, the platforms I mentioned, or other platforms that run across these five platforms, are cloud-based and open to connect to other data sources. So, this is the infrastructure discussion.
In my upcoming blog post, I will explain why platforms require a model-based approach and, therefore, cause a challenge, particularly in the PLM domain.
It took us more than fifty years to get rid of vellum drawings. It took us more than twenty years to introduce 3D CAD for design and engineering, and we still rely primarily on drawings. It will surely take us a generation to switch from document-based engineering to model-based engineering.
Conclusion
In this post, I tried to paint a picture of the ideal future based on connected platforms. Such an environment is needed if we want to be highly efficient in designing, delivering, and maintaining future complex products based on hardware and software. Concepts like Digital Twin and Industry 4.0 require a model-based foundation.
In addition, we will need Digital Twins to reach our future sustainability goals efficiently. So, there is work to do.
Your opinion, Your contribution?
After the first article discussing “The Future of PLM,” here is again a post in the category of PLM and complementary practices/domains, on a topic that has been on the radar for a long time: Model-Based Definition. I am glad to catch up with Jennifer Herron, founder of Action Engineering, who is one of the thought leaders related to Model-Based Definition (MBD) and Model-Based Enterprise (MBE).
In 2016 I spoke with Jennifer after reading her book: “Re-Use Your CAD – The Model-Based CAD Handbook”. At that time, the discussion was initiated through two articles on Engineering.com. Action Engineering introduced OSCAR seven years later as the next step towards learning and understanding the benefits of Model-Based Definition.
Therefore, it is a perfect moment to catch up with Jennifer. Let’s start.
Model-Based Definition
Jennifer, first of all, can you bring some clarity to the terminology? When I discussed the various model-based approaches, the first response I got was that model-based is all about 3D models and that a lot of the TLAs are just marketing terminology.
Can you clarify which parts of the model-based enterprise you focus on, with the proper TLAs?
Model-Based means many things to many different viewpoints and systems of interest. All these perspectives lead us down many rabbit holes, and we are often left confused when first exposed to the big concepts of model-based.
At Action Engineering, we focus on Model-Based Definition (MBD), which uses and re-uses 3D data (CAD models) in design, fabrication, and inspection.
There are other model-based approaches, and the use of the word “model” is always a challenge to define within the proper context.
For MBD, a model is 3D CAD data that comes in both native and neutral formats.
Another model-based approach is Model-Based Systems Engineering (MBSE). The term “model” in this context is a formalized application of modeling to support system requirements, design, analysis, verification and validation activities beginning in the conceptual design phase and continuing throughout development and later lifecycle phases.
<Jos> I will come back on Model-Based Systems Engineering in future posts
Sometimes MBSE is about designing widgets, and often it is about representing the entire system and the business operations. For MBD, we often focus our education on the ASME Y14.47 definition that MBD is an annotated model and associated data elements that define the product without a drawing.
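To picture what “an annotated model and associated data elements” could look like as data, here is a small hypothetical sketch. It is not ASME's or any vendor's schema, just the idea of PMI annotations attached to the model instead of living on a drawing; all names and values are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A PMI element attached to the 3D model (tolerance, note, datum, ...)."""
    feature: str
    kind: str
    value: str

@dataclass
class AnnotatedModel:
    """MBD in miniature: the model plus associated data elements, no drawing."""
    part_number: str
    revision: str
    cad_file: str                      # native or neutral (e.g. STEP) format
    annotations: list = field(default_factory=list)

model = AnnotatedModel("P-100", "B", "p-100_revB.stp", [
    Annotation("hole_1", "position", "0.2 (M) | A | B"),
    Annotation("face_3", "flatness", "0.05"),
])
print([a.kind for a in model.annotations])
```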
Model-Based Definition for Everybody?
I believe it took many years till 3D CAD design became a commodity; however, I still see the disconnected 2D drawing used to specify a product or part for manufacturing or suppliers. What are the benefits of model-based definition?
Are there companies that will not benefit from the model-based definition?
There’s no question that the manufacturing industry is addicted to their drawings. There are many reasons why, and yet mostly the problem is lack of awareness of how 3D CAD data can make design, fabrication, and inspection work easier.
For most, the person doing an inspection in the shipping and receiving department doesn’t have exposure to 3D data, and the only thing they have is a tabulated ERP database and maybe a drawing to read. If you plop down a 3D viewable that they can spin and zoom, they may not know how that relates to their job or what you want them to do differently.
Today’s approach of engineering championing MBD alone doesn’t work. To evolve information from the 2D drawing onto the 3D CAD model without engaging the stakeholders (machinists, assembly technicians, and inspectors) never yields a return on investment.
Organizations that succeed in transitioning to MBD are considering and incorporating all departments that touch the drawing today.
Incorporating all departments requires a vision from the management. Can you give some examples of companies that have transitioned to MBD, and what were the benefits they noticed?
I’ll give you an example of a small company with no First Article Inspection (FAI) regulatory requirements and a huge company with very rigorous FAI requirements.
Note: click on the images below to enjoy the details.
The small company instituted a system of CAD modeling discipline that allowed them to push 3D viewable information directly to the factory floor. The assembly technicians instantly understood engineering’s requirements faster and better.
The positive MBD messages for these use cases are 3D navigation, CAD Re-Use, and better control of their revisions on the factory floor.
The large company has added inspection requirements directly onto their engineering and created a Bill of Characteristics (BOC) for the suppliers and internal manufacturers. They are removing engineering ambiguity, resulting in direct digital information exchange between engineering, manufacturing, and quality siloes.
These practices have reduced error and reduced time to market.
The positive MBD messages for these use cases are unambiguous requirements capture by Engineering, Quality Traceability, and Model-Based PMI (Product and Manufacturing Information).
Model-Based Definition and PLM?
How do you see the relation between Model-Based Definition and PLM? Is a PLM system a complication or aid to implement a Model-Based Definition? And do you see a difference between the old and new PLM Vendors?
Model-Based Definition data is complex and rich in connected information, and we want it to be. With that amount of connected data, a data management system (beyond upload/download of documents) must keep all that data straight.
Depending on the size and function of an organization, a PLM may not be needed. However, a way to manage changes and collaboration amongst those using 3D data is necessary. Sometimes that results in a less sophisticated Product Data Management (PDM) system. Large organizations often require PLM.
There is significant resistance to doing MBD and PLM implementations simultaneously because PLM is always over budget and behind schedule. However, doing just MBD or just PLM without the other doesn’t work either. I think you should be brave and do both at once.
I think we can debate why PLM is always over budget and behind schedule. I hear the same about ERP implementations. Perhaps it has to deal with the fact that enterprise applications have to satisfy many users?
I believe that working with model versions and file versions can get mixed in larger organizations, so there is a need for PDM or PLM. Have you seen successful implementations of both interacting together?
Yes, the only successful MBD implementations are those that already have a matured PDM/PLM (scaled best to the individual business).
Model-Based Definition and Digital Transformation
In the previous question, we already touched on the challenge of old and modern PLM. How do you see the introduction of Model-Based Definition addressing the dreams of Industry 4.0, the Digital Twin and other digital concepts?
I just gave a presentation at the ASME Digital Twin Summit discussing the importance of MBD for the Digital Twin. MBD is a foundational element that allows engineering to compare their design requirements to the quality inspection results of digital twin data.
The feedback loop between Engineering and Quality is fraught with labor-intensive efforts in most businesses today.
Leveraging the combination of MBD and Digital Twin allows automation possibilities to speed up and increase the accuracy of the engineering to inspection feedback loop. That capability helps organizations realize the vision of Industry 4.0.
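As a hypothetical sketch of that automated feedback loop: nominal characteristics from the annotated model are compared with measured values coming back from inspection, producing a pass/fail report per characteristic. All numbers and names are invented for illustration.

```python
# Hypothetical nominal characteristics (target, tolerance) from the annotated
# model, and measured values from inspection of one serial number.
NOMINAL = {"hole_1_position": (0.0, 0.2), "face_3_flatness": (0.0, 0.05)}
MEASURED = {"hole_1_position": 0.12, "face_3_flatness": 0.07}

def quality_report(nominal: dict, measured: dict) -> dict:
    """Automated engineering-to-inspection feedback: pass/fail per characteristic."""
    report = {}
    for name, (target, tolerance) in nominal.items():
        deviation = abs(measured[name] - target)
        report[name] = "pass" if deviation <= tolerance else "FAIL"
    return report

print(quality_report(NOMINAL, MEASURED))   # face_3_flatness -> FAIL
```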
And then there is OSCAR.
I noticed you announced OSCAR. First, I thought OSCAR was a virtual aid for model-based definition, and I liked the launching page HERE. Can you tell us more about what makes OSCAR unique?
One thing that is hard with MBD implementation is there is so much to know. Our MBDers at Action Engineering have been involved with MBD for many years and with many companies. We are embedded in real-life transitions from using drawings to using models.
Suppose you start down the model-based path for digital manufacturing. In that case, there are significant investments in time to learn how to get to the right set of capabilities and the right implementation plan guided by a strategic focus. OSCAR reduces that ramp-up time with educational resources and provides vetted and repeatable methods for an MBD implementation.
OSCAR combines decades of Action Engineering expertise and lessons learned into a multi-media textbook of sorts. To kickstart an individual or an organization’s MBD journey, it includes asynchronous learning, downloadable resources, and CAD examples available in Creo, NX, and SOLIDWORKS formats.
CAD users can access how-to training and downloadable resources such as the latest edition of Re-Use Your CAD (RUYC). OSCAR enables process improvement champions to make their case to start the MBD journey. We add content regularly and post what’s new. Free trials are available to check out the online platform.
Learn more about what OSCAR is here:
Want to learn more?
In this post, I believe we only touched the tip of the iceberg. There is so much to learn and understand. What would you recommend to a reader of this blog who got interested?
RUYC (Re-Use Your CAD) is an excellent place to start, but if you need more audio-visual, and want to see real-life examples of MBD in action, get a Training subscription of OSCAR to get rooted in the vocabulary and benefits of MBD with a Model-Based Enterprise. Watch the videos multiple times! That’s what they are for. We love to work with European companies and would love to support you with a kickstart coaching package to get started.
What I learned
First of all, I learned that Jennifer is a very pragmatic person. Her company (Action Engineering) and her experience are a perfect pivot point for those who want to learn and understand more about Model-Based Definition, particularly in the US, given her strong involvement in the American Society of Mechanical Engineers (ASME).
I am still curious if European or Asian counterparts exist to introduce and explain the benefits and usage of Model-Based Definition to their customers. Feel free to comment.
Next, and an important observation too, is the fact that Jennifer also describes the tension between Model-Based Definition and PLM. Current PLM systems might be too rigid to support end-to-end scenarios that take advantage of Model-Based Definition.
I have to agree here. PLM vendors mainly support their own MBD (model-based definition), whereas the ultimate purpose is to efficiently share all product-related information using various models as the main information carriers.
This is a topic we have to study and solve in the PLM domain, as I described in my technical highlights from the PLM Road Map & PDT Spring 2021 conference.
There is work to do!
Conclusion
Model-Based Definition is, for me, one of the must-do steps of a company to understand the model-based future. A model-based future sometimes incorporates Model-Based Systems Engineering, a real Digital Thread and one or more Digital Twins (depending on your company’s products).
It is a must-do activity because companies must transform themselves to depend on digital processes and digital continuity of data to remain competitive. Document-driven processes relying on the interpretation of a person are not sustainable.
Last week I wrote about the recent PLM Road Map & PDT Spring 2021 conference day 1, focusing mainly on technology. There were also interesting sessions related to exploring future methodologies for a digital enterprise. Now on Day 2, we started with two sessions related to people and methodology, indispensable when discussing PLM topics.
Designing and Keeping Great Teams
This keynote speech from Noshir Contractor, Professor of Behavioral Sciences in the McCormick School of Engineering & Applied Science, intrigued me as the subtitle states: Lessons from Preparing for Mars. What Can PLM Professionals Learn from This?
You might ask yourself, is a PLM implementation as difficult and as complex as a mission to Mars? I hoped so, and I followed Noshir's presentation with great interest.
Noshir started by mentioning that many disruptive technologies have emerged in recent years, like Teams, Slack, Yammer and many more.
He then asked an interesting question in the context of PLM. As the domain of PLM is all about trying to optimize effective collaboration, it was a fair question.
Noshir shared with us that the most crucial point is not to look at people's individual skills, but at who they know.
Measuring who they work with is more important than measuring who they are.
Based on this statement, Noshir showed some network patterns of different types of networks.
Click on the image to see the enlarged picture.
It is clear from these patterns how organizations communicate internally and/or externally. It would be an interesting exercise to perform in a company and to see if the analysis matches the perceived reality.
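For readers who want to try such an exercise, here is a small sketch, assuming the networkx library is installed; the names and edges are fictitious. It builds a graph from “who works with whom” data and looks at the network's density and at the people who bridge otherwise separate groups.

```python
import networkx as nx

# Hypothetical "who works with whom" data, e.g. extracted from project tools.
collaborations = [
    ("Anna", "Bram"), ("Anna", "Chen"), ("Bram", "Chen"),   # tight design team
    ("Chen", "Dara"),                                        # single bridge
    ("Dara", "Eli"), ("Dara", "Fay"),                        # manufacturing team
]

G = nx.Graph(collaborations)

# Simple network measures: how dense is the network, and who bridges the silos?
print("density:", round(nx.density(G), 2))
print("bridges:", sorted(nx.betweenness_centrality(G).items(),
                         key=lambda kv: -kv[1])[:2])
```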
Noshir’s research was used by NASA to analyze and predict the right teams for a mission to Mars.
Noshir went further by proposing what PLM can learn from teams that are going into space. And here, I was not sure about the parallel. Is a PLM project comparable to a mission to Mars? I hope not! I have always advocated that a PLM implementation is a journey. Still, I never imagined that it could be a journey into the remote unknown.
Noshir explained that they had built tools based on their scientific model to describe and predict how teams could evolve over time. He believes that society can also benefit from these learnings. Many inventions from the past were driven by innovations coming from space programs.
I believe Noshir’s approach related to team analysis is much more critical for organizations with a mission. How do you build multidisciplinary teams?
The proposed methodology is probably best suited for a holacracy-based organization. Holacracy is an interesting concept for getting employees committed; however, it also demands a type of involvement that not every person can deliver. For me, coming back to PLM as a strategy to enable collaboration, the effectiveness of collaboration depends very much on the organizational culture and the structure created.
DISRUPTION – EXTINCTION or still EVOLUTION?
We talk a lot about disruption because disruption is a painful process that you do not like to happen to yourself or your company. In the context of this conference’s theme, I discussed the awareness that disruptive technologies will be changing the PLM Value equation.
However, disruptive technologies alone are not sufficient. In PLM, we have to deal with legacy data, legacy processes, legacy organization structures, and often legacy people.
A disruption like the switch from mini-computers to PCs (killed DEC) or from Symbian to iOS (killed Nokia) is therefore not likely to happen that fast. Still, there is a need to take benefit from these new disruptive technologies.
My presentation was focusing on describing the path of evolution and focus areas for the PLM community. Doing nothing means extinction; experimenting and learning towards the future will provide an evolutionary way.

Starting from acknowledging that there is an incompatibility between the data mostly produced today and the data needed in the future, I explained my theme: From Coordinated to Connected. As a PLM community, we should spend more time together in focus groups and conferences, describing and verifying methodology and best practices.

Nigel Shaw (EuroStep) and Mark Williams (Boeing) hinted in this direction during this conference (see day 1). Erik Herzog (SAAB Aeronautics) brought this topic to last year’s conference (see day 3). Outside this conference, I have comparable touchpoints with Martijn Dullaert when discussing Configuration Management in the future in relation to PLM.
In addition, this decade will probably be the most disruptive decade humanity has known, due to external forces that push companies to change. Sustainability regulations from governments (the Paris agreement) and the implementation of circular economy concepts, combined with a positive and high Total Shareholder Return, will push companies to adapt themselves more radically than before.
What is clear is that disruptive technologies and concepts, like Industry 4.0, Digital Thread and Digital Twin, can serve a purpose when implemented efficiently, ensuring the business becomes sustainable.
Due to the lack of end-to-end experience, we need focus groups and conferences to share progress and lessons learned. And we do not need to hear the isolated vendor success stories here as a reference, as they are often siloed again, leading to proprietary environments.
You can see my full presentation on SlideShare: DISRUPTION – EXTINCTION or still EVOLUTION?
Building a profitable Digital T(win) business
Beatrice Gasser, Technical, Innovation, and Sustainable Development Director from the Egis group, gave an exciting presentation related to the vision and implementation of digital twins in the construction industry.
The Egis group serves both as a consultancy firm and as an asset management organization. You can see a wide variety of activities on their website or have a look at their perspectives.
Historically, the construction industry has been lagging behind, with low productivity due to fragmentation, risk aversion and, more and more recently, a lack of digital talent. In addition, some construction companies make their money from claims instead of from a smooth and profitable business model.
Without innovation in the construction industry, companies working the traditional way would lose market share and investor-focused attention, as we can see from the BCG diagram I discussed in my session.
The digital twin of construction is an ideal concept for the future. It can be built in the design phase to align all stakeholders, validate and integrate solutions and simulate the building's operational scenarios at almost zero materials cost. Egis estimates that by using a digital twin during construction, the engineering and construction costs of a building can be reduced by 15 to 25%.
More importantly, the digital twin can also be used to first simulate operations and optimize energy consumption. The connected digital twin of an existing building can serve as a new common data environment for future building stakeholders. This could be the asset owner, service companies, and even the regulatory authorities needing to validate the building’s safety and environmental impact.
Beatrice ended with five principles essential to establishing a digital twin.
I think the construction industry has vast potential to disrupt itself, faster than the traditional manufacturing industries, due to its current need to work in a well-connected manner.
Next, there is almost no legacy data to deal with for these companies. Every new construction or building is a unique project on its own. The key differentiators will be experience and efficient ways of working.
It is about the belief, the guts and the skilled people that can make it work – all for a more efficient and sustainable future.
Leveraging PLM and Cloud Technology for Market Success
Stan Przybylinski, Vice President of CIMdata, reported on their global survey related to the cloud, completed in early 2021. Stan also typified Industry 4.0 as a connected vision, with cloud and digital thread as enablers to implement this vision.
The companies interviewed showed a lot of goodwill to make progress – click on the image to see the details. CIMdata is also working with PLM vendors to learn and better describe the areas of benefit. I remain curious about who comes up with a realization and business case that is future-proof. This will define our new PLM Value Equation.
Conclusion
These were two exciting days with plenty of mention of disruptive technologies. Our challenge in the PLM domain will be to give them a purpose. A purpose is likely driven by external factors related to the need for a sustainable future. Efficiency and effectiveness must come from learning to work in connected environments (digital twin, digital thread, Industry 4.0, Model-Based (Systems) Engineering).
Note: You might have seen the image below already – a nice link between sustainability and the mission to Mars
One of my favorite conferences is the PLM Road Map & PDT conference. Probably because in the pre-COVID days, it was the best PLM conference to network with peers focusing on PLM practices, standards, and sustainability topics. Now the conference is virtual, and hopefully, after the pandemic, we will meet again in the conference space to elaborate on our experiences further.
Last year's fall conference was special because we had three days filled with a generic PLM update and several A&D (Aerospace & Defense) working group updates, reporting their progress and findings. There were sessions related to the Multiview BOM research, Global Collaboration, and several aspects of Model-Based practices: Model-Based Definition, Model-Based Engineering & Model-Based Systems Engineering.
All topics that I will elaborate on soon. You can refresh your memory through these two links:
- The weekend after PLM Roadmap / PDT 2020 – part 1
- The next weekend after PLM Roadmap / PDT 2020 – part 2
This year, it was a two-day conference with approximately 200 attendees discussing how emerging technologies can disrupt the current PLM landscape and reshape the PLM Value Equation. During the first day of the conference, we focused on technology.
On the second day, we looked in addition at the impact new technology has on people and organizations.
Today’s Emerging Trends & Disrupters
Peter Bilello, CIMdata's President & CEO, kicked off the conference by providing CIMdata's observations of the market: an increasing number of technology capabilities, like cloud, additive manufacturing, platforms, digital thread, and digital twin, all with the potential of realizing a connected vision. Meanwhile, companies evolve at their own pace, illustrating that the gap between the leaders and the followers becomes bigger and bigger.
Where is your company? Can you afford to be a follower? Is your PLM ready for the future? Probably not, Peter states.
Next, Peter walked us through some technology trends and their applicability for a future PLM, like topological data analytics (TDA), the Graph Database, Low-Code/No-Code platforms, Additive Manufacturing, DevOps, and Agile ways of working during product development. All capabilities should be related to new ways of working and updated individual skills.
I fully agreed with Peter’s final slide – we have to actively rethink and reshape PLM – not by calling it different but by learning, experimenting, and discussing in the field.
Digital Transformation Supporting Army Modernization
An interesting viewpoint related to modern PLM came from Dr. Raj Iyer, Chief Information Officer for IT Reform at the US Army. Raj walked us through some of the US Army's challenges, and he gave us some fantastic statements to think about. Although an army cannot be compared with a commercial business, its target remains to always be ahead of the competition and be aware of the competition.
Where we would say “data is the new oil”, Raj Iyer said: “Data is the ammunition of the future fight – as fights will more and more take place in cyberspace.”
The US Army is using a lot of modern technology – as the image below shows. The big difference here with regular businesses is that it is not about ROI but about winning fights.
Also, for the US Army, the cloud becomes the platform of the future. Due to the wide range of assets the US Army has to manage, the importance of product data standards is evident. Raj mentioned their contribution and adherence to the ISO 10303 STEP standard as crucial for interoperability. It was an exciting insight into the US Army's current and future challenges. Their primary mission remains to stay ahead of the competition.
Joining up Engineering Data without losing the M in PLM
Nigel Shaw's (Eurostep) presentation was somewhat philosophical but precisely to the point regarding the current dilemma in the PLM domain. Through an analogy with the internet, he explained that, living in a world of HTTP(S) linking, we create new ways of connecting information. The link becomes an essential artifact in our information model.
While it is apparent that links are crucial for managing engineering data, Nigel pointed out some of the significant challenges of this approach, as you can see from his (compiled) image below.
I will not discuss this topic further here as I am planning to come back to this topic when explaining the challenges of the future of PLM.
As Nigel said, they are debating with one of their customers whether to replace or enhance the existing PLM tools. The challenge of moving from coordinated information towards connected data is a topic that we as a community should study.
Integration is about more than Model Format.
This was the presentation I had been waiting for. Mark Williams from Boeing had built the story together with Adrian Burton from Airbus. Nigel Shaw, in the previous session, had already pointed to the challenge of managing linked information. Mark elaborated further on the model-based approach for system definition.
All content was related to the understanding that we need a model-based information infrastructure for the future because storing information in documents (the coordinated approach) is no longer viable for complex systems. Mark's slide below says it all.
Mark stressed the importance of managing model information in context, and it has become a challenge.
Mark mentioned that 20 years ago, the IDC (International Data Corporation) measured Boeing's performance and estimated that each employee spent 2½ hours per day looking for information. In 2018, the IDC estimated that this number had grown to 30% of the employee's time and could go up to 50% when adding the effort of reusing and duplicating data.
The consequence would be that a full-service enterprise, with engineering, manufacturing and services connected, probably loses 70% of its information because it cannot be found – an impressive number, calling for “clever” ways to find the correct information in context.
It is not about just a full indexed search of the data, as some technology geeks might think. It is also about describing and standardizing metadata that describes the models. In that context, Mark walked through a list of existing standards, all with their pros and cons, ending up with the recommendation to use the ISO 10303-243 – MoSSEC standard.
MoSSEC standing for Modelling and Simulation information in a collaborative Systems Engineering Context to manage and connect the relationships between models.
MoSSEC and its implication for future digital enterprises are interesting, considering the importance of a model-based future. I am curious how PLM Vendors and tools will support and enable the standard for future interoperability and collaboration.
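As an illustration of what “metadata that describes the models in context” could look like in practice, here is a simplified, hypothetical record of my own; the field names are invented for this sketch and are not taken from the ISO 10303-243 (MoSSEC) schema.

```python
# Hypothetical, simplified metadata record for a simulation model kept in its
# systems engineering context. Field names are illustrative only.
model_metadata = {
    "model_id": "thermal-analysis-wing-v7",
    "discipline": "thermal",
    "maturity": "released",
    "context": {
        "system_element": "wing-leading-edge",
        "requirements_verified": ["REQ-0042", "REQ-0108"],
        "assumptions": ["steady-state", "sea-level conditions"],
    },
    "relations": [
        {"type": "derived-from", "target": "geometry-model:WLE-3D v15"},
        {"type": "supersedes", "target": "thermal-analysis-wing-v6"},
    ],
}

# With such metadata, "which models are affected by a change to WLE-3D?"
# becomes a query over relations instead of a search through report documents.
affected = [r for r in model_metadata["relations"]
            if r["target"].startswith("geometry-model:WLE-3D")]
print(affected)
```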
Additive Manufacturing – not as simple as paper printing – yet
Andreas Graichen from Siemens Energy closed the day, coming back to the topic of new technologies: Additive Manufacturing, or in common language, 3D Printing. Andreas shared their Additive Manufacturing experiences, which match the famous Gartner Hype Cycle. His image shows that, once the first excitement of the hype is over, real work needs to be done to understand the technology and its use cases.
Material knowledge is one of the important topics to study when applying Additive Manufacturing. Understanding material behavior and properties in an Additive Manufacturing process is probably a new area for most companies.
The ultimate goal for Siemens Energy is to reach an “autonomous” workshop anywhere in the world where gas turbines could order their spare parts by themselves through digital warehouses. It is a grand vision, and Andreas confirmed that the scalability of Additive Manufacturing is still a challenge.
For rapid prototyping or small series of spare parts, Additive Manufacturing might be the right solution. The success of your Additive Manufacturing process depends a lot on whether your company’s management has realistic expectations and makes the budget available to explore this direction.
Conclusion
Day 1 was enjoyable and educational, starting and ending with a focus on disruptive technologies. In my opinion, the middle part, related to the data management concepts needed for a digital enterprise, contained the most exciting topics to follow up on.
Next week I will follow up with a review of day 2 and share my conclusions. The PLM Road Map & PDT Spring 2021 conference confirmed that there is work to do to understand the future (of PLM).
Last summer, I wrote a series of blog posts grouped by the theme “Learning from the past to understand the future”. These posts took you through the early days of drawings and numbering practices towards what we currently consider the best practice: PLM BOM-centric backbone for product lifecycle information.
You can find an overview and links to these posts on the Learning from the past page.
If you have read these posts, or if you have gone through this journey yourself, you will realize that all steps were more or less evolutionary. There were no disruptions. Affordable 3D CAD systems, new internet paradigms (the interactive internet), global connectivity and mobile devices all introduced new capabilities for the mainstream. As described in these posts, the new capabilities sometimes create friction with old practices. Probably the most popular topics are the whole Form-Fit-Function interpretation and the discussion related to meaningful part numbers.
What is changing?
In the last five to ten years, a lot of new technology has come into our lives. The majority of these technologies are related to dealing with data. Digital transformation in the PLM domain means moving from a file-based/document-centric approach to a data-driven approach.
A Bill of Material on the drawing has become an Excel-like table in a PLM system. However, an Excel file is still used to represent a Bill of Material in companies that have not implemented PLM.

Another example is the specification document which has become a collection of individual requirements in a system. Each requirement is a data object with its own status and content. The specification becomes a report combining all valid requirement objects.
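To make this concrete, below is a minimal sketch of requirements as data objects and the specification as a generated report. It is purely illustrative; the object names and statuses are my own assumptions, not taken from any particular PLM system.

```python
from dataclasses import dataclass

# Illustrative sketch: each requirement is a data object with its own status;
# the "specification document" is just a report generated from the valid ones.
@dataclass
class Requirement:
    req_id: str
    text: str
    status: str        # e.g. "draft", "approved", "obsolete"
    version: int

requirements = [
    Requirement("REQ-001", "The pump shall deliver 20 l/min.", "approved", 2),
    Requirement("REQ-002", "The housing shall be IP67 rated.", "draft", 1),
    Requirement("REQ-003", "Old noise requirement.", "obsolete", 3),
]

def specification_report(reqs):
    """Combine all approved requirement objects into a report (the 'document')."""
    lines = [f"{r.req_id} (v{r.version}): {r.text}"
             for r in reqs if r.status == "approved"]
    return "\n".join(lines)

print(specification_report(requirements))
```

The report is a disposable output; the requirement objects, with their own status and history, remain the single source of truth.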
Related to CAD, the 2D drawing is no longer the deliverable as a document; the 3D CAD model with its annotated views becomes the information carrier for engineering and manufacturing.
Most importantly, traditional PLM methodologies have been based on a mechanical design and release process. Meanwhile, modern products are systems in which the majority of the capabilities are defined by software. Software has an entirely different configuration and lifecycle approach, which conflicts with the mechanical approach, as the latter is too rigid for software.
The last two aspects, moving from 2D drawings to 3D models and from mechanical products towards systems (hardware and software), require new data management methods. In this environment, we need to learn to manage simulation models, behavior models, physics models and 3D models as connected as possible.
I wrote about these changes three years ago: Model-Based – an introduction, which led to a lot of misunderstanding (too advanced – too hypothetical).
I plan to revisit these topics in the upcoming months again to see what has changed over the past three years.
What will I discuss in the upcoming weeks?
My first focus is on participating in and contributing to the upcoming PLM Roadmap & PDT Spring 2021 conference. Here speakers will discuss the need for reshaping the PLM Value Equation due to new emerging technologies – a topic that fits perfectly in the Future of PLM series.
My contribution will focus on the fact that technology alone cannot disrupt the PLM domain. We also have to deal with legacy data and legacy ways of working.
Next, I will discuss with Jennifer Herron from Action Engineering the progress made in Model-Based Definition, which fits best practices for today – a better connection between engineering and manufacturing. We will also discuss why Model-Based Definition is a significant building block required for realizing the concepts of a digital enterprise, Industry 4.0 and digital twins.
Another post will focus on the difference between the digital thread and the digital thread. Yes, it looks like I am writing the same words twice. However, you will see that, depending on the interpretation, one definition is hanging on to the past while the other targets the future. Again, the differentiation is crucial when a maintainable Digital Twin is required.
Model-Based Systems Engineering (MBSE) in all its aspects needs to be discussed too. MBSE is crucial for defining complex products and is seen as the discipline to design such products. Understanding data management related to MBSE will be the foundation for understanding data management in a Model-Based Enterprise. For example, how do we deal with configuration management in the future?
Writing Learning from the past was an easy job, as explaining with hindsight is so much easier when you have lived through it. I am curious and excited about the outcome of “The Future of PLM”. Writing about the future means you have to digest the information coming to you, knowing that nobody has a clear blueprint for the future of PLM.
There are people and organizations working on this topic more academically; for example, read this post from Lionel Grealou related to the Place of PLM in the Digital Future. The challenge is that an academic future might be disrupted by unpredictable events, like COVID, or by disruptive technologies combined with an opportunity to succeed. Therefore, I believe it will be a learning journey for all of us, in which we need to learn to give technology a business purpose. Business first – then technology.
No Conclusion
Normally I close my posts with a conclusion. At this moment, there is no conclusion, as the journey has just started. I look forward to debating and learning with practitioners in the field, working together on methodologies and concepts that work in a digital enterprise. Join me on this journey; I will start sharing my thoughts in the upcoming months.
After “The PLM Doctor is IN #2,” now again a written post in the category of PLM and complementary practices/domains.
After PLM and Configuration Lifecycle Management (CLM) (January 2021) and PLM and Configuration Management (CM) (February 2021), now it is time to address the third interesting topic:
PLM and Supply Chain collaboration.
In this post, I am speaking with Magnus Färneland from Eurostep, a company well known in my PLM ecosystem, through their involvement in standards (STEP and PLCS), the PDT conferences, and their PLM collaboration hub, ShareAspace.
Supply Chain collaboration
The interaction between OEMs and their suppliers has been a topic of particular interest to me. As a warm-up, read my post after the CIMdata/PDT Roadmap 2020 conference: PLM and the Supply Chain. In that post, I briefly touched on the Eurostep approach – having a Supply Chain Collaboration Hub. Below is an image from that post; in this case, the Collaboration Hub is positioned between two OEMs.
Recently Eurostep shared a blog post in the same context: 3 Steps to remove data silos from your supply chain, addressing the dream of many companies: moving from disconnected information silos towards a logical flow of data. The topic is well suited for all companies working on digital transformation together with their supply chain. So, let us hear it from Eurostep.
Eurostep – the company / the mission
First of all, can you give a short introduction to Eurostep as a company and the unique value you are offering to your clients?
Eurostep was founded in 1994 by several world-class experts on product data and information management. In the year 2000, we started developing ShareAspace. We took all the experience we had from working with collaboration in the extended enterprise, mixed it with our standards knowledge, and selected Microsoft as the technology for our software platform.
We now offer ShareAspace as a solution for product information collaboration in all three industry verticals where we are active: Manufacturing, Defense and AEC & Plant.
ShareAspace is based on an information standard called PLCS (ISO 10303-239). This means we have a data model covering the complete lifecycle of a product, from requirements and conceptual design to the existing installed base. On top of that, we have added the capabilities needed, such as consolidation and security. Our partnership with Microsoft has also resulted in ShareAspace being available in Azure as a service (our Design to Manufacturing software).
Why a supply chain collaboration hub?
Currently, most suppliers work in a disconnected manner with their clients – sending files up and down or having to work inside the OEM environment. What are the reasons to consider a supply chain collaboration hub or, as you call it, a product information collaboration solution?
The hub concept is not new per se. There are plenty of examples of file-sharing hubs. Once you realize that sending files back and forth by email is a disaster for keeping control of the information being shared with suppliers, you will probably try out one of the available file-sharing alternatives.
However, after a while, you begin to realize that a file share can be quite time-consuming to keep up to date. Files are being changed. Files are being removed! Some files are enormous, and you realize that you only need a fraction of what is in the file. References from one file to another become invalid because the other file has a new version. Etc. Etc.
This is about the time when you realize that you need similar control of the data you share with suppliers as you have in your internal systems. If not better.
A hub allows all partners to continue to use their internal tools and processes. It is also a more secure way of collaborating, as suppliers and partners are not let into the internal systems of the OEM.
Another significant side effect is that you only expose, in the hub, the data intended for external sharing, avoiding sharing too much or exposing sensitive internal data.
A hub also supports business flexibility, as partners are not hardwired to the OEM. Partners can change, and IT systems in the value chain can change, without impacting more than the single system’s connection to the hub.
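To make the scaling argument tangible, here is a quick back-of-the-envelope sketch of my own (an illustration, not Eurostep’s material): point-to-point integrations grow roughly quadratically with the number of partners, while a hub keeps the number of connections linear.

```python
# Back-of-the-envelope illustration of why a hub scales better than
# hardwired point-to-point integrations between an OEM and its partners.
def point_to_point_interfaces(n_parties: int) -> int:
    # every party integrated directly with every other party
    return n_parties * (n_parties - 1) // 2

def hub_interfaces(n_parties: int) -> int:
    # every party maintains one connection to the shared hub
    return n_parties

for n in (5, 20, 100):
    print(n, point_to_point_interfaces(n), hub_interfaces(n))

# With 100 parties: 4950 direct interfaces versus 100 hub connections,
# and swapping one partner touches only that partner's single connection.
```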
Should every company implement a supply chain collaboration hub?
Based on your experience, what types of companies should implement a supply chain collaboration hub and what are the expected benefits?
The large OEMs and tier-1 suppliers certainly benefit from this, since they may need to incorporate hundreds, if not thousands, of suppliers. Sharing technical data across the supply chain from a dedicated hub removes confusion, improves control of the shared data, and builds trust with their partners.
With our cloud-based offering, we now also make it possible for mid-sized companies (around 200+ employees) to use ShareAspace. They may not have a well-adopted PLM system or the issue of communicating complex specifications originating from several internal sources. Still, they need to be professional in dealing with suppliers.
The smallest client we have is a manufacturer of pool cleaners, a complex product with many suppliers. The company, Weda [www.weda.se], has fewer than 10 employees, and they use ShareAspace as SaaS. With ShareAspace, they have improved their collaboration process with suppliers, cut costs and lowered inventory levels.
ShareAspace can really scale big. It serves as a collaboration solution for the two new Aircraft carriers in the UK, the QUEEN ELIZABETH class. The aircraft carriers were built by a consortium that was closed in early 2020.
ShareAspace is being used to hold the design data and other documentation from the consortium to be available to the multiple organizations (both inside and outside of the Ministry of Defence) that need controlled access.
What is the dependency on standards?
I always associate Eurostep with the PLCS (ISO 10303-239) standard, providing an information model for “hardware” products along the lifecycle. How important is this standard for you in the context of your ShareAspace offering?
Should everyone adapt to this standard?
We have used PLCS to define the internal data schema in ShareAspace. This is an excellent starting point for capturing information from different systems and domains and still getting it to fit together. Why invent something new?
However, we can import data in most formats, and it does not have to be according to a standard. When connecting to Teamcenter, Windchill, Enovia, SAP, Oracle, Maximo etc., it is more often in a proprietary format than according to any standards.
On the other hand, in some industries like Defense, standards-based data exchange is required and put into contracts. Sometimes it prescribes PLCS. For the plant industry, it could be CFIHOS or ISO 15926.
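To illustrate conceptually what importing proprietary data into a standards-based schema can look like, here is a small, hypothetical mapping sketch of my own; the field names are invented and do not represent the actual ShareAspace import format or the PLCS data model.

```python
# Illustrative mapping of a proprietary part export into a neutral,
# PLCS-inspired structure. All field names are invented for this sketch.
proprietary_export = {
    "PARTNO": "A-1234",
    "REV": "C",
    "DESC": "Bracket, machined",
    "WEIGHT_KG": 0.42,
}

def to_neutral(record: dict) -> dict:
    # Map vendor-specific keys onto one neutral structure so data from
    # different source systems can fit together in the hub.
    return {
        "product": {
            "identifier": record["PARTNO"],
            "version": record["REV"],
            "name": record["DESC"],
        },
        "properties": [
            {"name": "mass", "value": record["WEIGHT_KG"], "unit": "kg"},
        ],
    }

print(to_neutral(proprietary_export))
```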
Supply Chain Collaboration and digital transformation
As stated at the beginning of this post, digital transformation is about connecting the information siloes through a digital thread. How important is this related to the supply chain?
Many companies have come a long way in improving their internal management of product data. But the exchange and sharing of data with the external world still has considerable potential for improvement. Just look at the chaos everyone has experienced with email, still used a lot, when trying to find the latest Word document or PowerPoint file. Imagine if you collaborate on a ship, a truck, a power plant, or a piece of complex infrastructure. FTP is not the answer, and for product data, Dropbox does not do the trick.
A Digital Thread must support versions and changes in all directions, as changes are natural with reasonably advanced products. Much of the information created about or around a product is generated within the supply chain, like production parameters, test and inspection protocols, certifications, and more. Without an intelligent way of capturing this data, companies will continue to spend a fortune on administration trying to manage this manually.
As the Digital Thread extends across the value chain, a useful sharing tool is needed to allow for configuration management across the complete chain – ShareAspace is designed for this. The great thing with PLCS is that it provides a standard model for the Digital Thread covering several Digital Twins. PLCS adds the lifecycle dimension, which is essential, and for which there is no alternative. Therefore, with ShareAspace and PLCS we can add lifecycle capabilities to snapshot standards like IFC and others outside the STEP series of standards.
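As a conceptual illustration of a digital thread that supports versions and supplier-generated records, here is a minimal sketch of my own; the structure and names are assumptions for illustration only, not a product schema.

```python
# Minimal sketch (assumed structure, not a product schema) of how
# supplier-generated records can be attached to a specific part version in a
# collaboration hub, so changes stay traceable in both directions.
part_versions = {
    ("A-1234", "B"): {"status": "superseded"},
    ("A-1234", "C"): {"status": "released"},
}

supplier_records = []

def attach_supplier_record(part_id, version, record_type, payload):
    if (part_id, version) not in part_versions:
        raise ValueError("unknown part version - change not yet shared via the hub")
    supplier_records.append({
        "part": part_id, "version": version,
        "type": record_type, "payload": payload,
    })

attach_supplier_record("A-1234", "C", "inspection-protocol",
                       {"result": "pass", "measured_mass_kg": 0.41})

# The OEM can now query which released versions have inspection evidence,
# instead of hunting for the latest emailed spreadsheet.
print([r for r in supplier_records if r["type"] == "inspection-protocol"])
```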
Learning more
We discussed that a supply chain collaboration hub can have specific value to a company. Where can readers learn more?
There is a lot of information available. Of course, on our Eurostep website you will find information under the Resources tab, and on the ShareAspace website under the News tab.
Other sources are:
What I have learned
- I am surprised to see that the type of Supplier Collaboration Platform delivered by Eurostep is not a booming market. While Time to Market is significantly impacted by how companies work with their suppliers, most companies still rely on the exchange of data packages.
- The most advanced exchanges use a model-based definition where relevant. Traditional PLM vendors will not develop such platforms, as the platform needs to be agnostic in both directions.
- Such a collaboration hub can come with a recommended data model based on PLCS, or with a custom data model in the case of a large OEM. It is relatively easy to implement (as you do not change your own PLM) and relatively easy to scale (adding a new supplier is easy). For me, the supplier collaboration platform is a must in a modern, digitally connected enterprise.
Conclusion
A lot of marketing money is spent on “Digital Thread” or “Digital Continuity”. If you look at the full value chain of product development and operational support, there are still many manual hand-over processes with suppliers. A supplier collaboration hub might be the missing piece of the puzzle to realize a real digital thread or digital continuity.
After the first episode of “The PLM Doctor is IN“, this time a question from Helena Gutierrez. Helena is one of the founders of SharePLM, a young and dynamic company focusing on providing education services based on your company’s needs, instead of leaving it to function-feature training.
I might come back on this topic later this year in the context of PLM and complementary domains/services.
Now sit back and enjoy.
Note: Due to a technical mistake, Helena’s facial expressions might give you a “CNN-like” impression, as the recording of her doctor visit was too short to cover the full response.
PLM and Startups – is this a good match?
Relevant links discussed in this video
Marc Halpern (Gartner): The PLM maturity table
VirtualDutchman: Digital PLM requires a Model-Based Enterprise
Conclusion
I hope you enjoyed the answer and look forward to your questions and comments. Let me know if you want to be an actor in one of the episodes.
The main rule: A single open question that is puzzling you related to PLM.
Before going further on this topic, there is also the observation that many outspoken PLM experts are “old.” Meanwhile, all kinds of new disruptive technologies are coming up.
General PLM conferences have been an excellent way to listen to other people’s experiences in the past. Depending on the type of conference, you would be able to narrow your learning scope.
One of the best learnings comes from having honest discussions with other people who all have different backgrounds. To be part of such a discussion, you need to have at least some basic knowledge about the topic. This avoids social media-like discussions where millions of “experts” have an opinion behind the keyboard. (The 