
A week ago I attended the joint CIMdata Roadmap and PDT Europe conference in Stuttgart, as you can recall from last week's post: The weekend after CIMdata Roadmap / PDT Europe 2018. As there was so much information to share, I had to split the report into two posts. This time the focus is on PDT Europe. In general, the PDT conferences have always focused on sharing experiences and developments related to standards – a topic you will not see at PLM vendor conferences. Therefore, it is your chance to learn and take part if you believe in standards.

This year’s theme: Collaboration in the Engineering and Manufacturing Supply Chain – the Extended Digital Thread and Smart Manufacturing. Industry 4.0 plays a significant role here.

 

Model-based X: What is it and what is the status?

I have seen Peter Bilello presenting this topic several times now, and every time there is a little more progress. The fact that there is still an acronym war illustrates that the various aspects of a model-based approach are not yet defined. Some critics will state that is because we do not need model-based, and that it is only a vendor marketing trick again. Two comments here:

  • If you want to implement an end-to-end model-based approach including your customers and supply chain, you cannot avoid standards. More will become clear when you read the rest of this post. Vendors will not promote standards as it reduces their ability to deliver unique capabilities. So standards must come from the market, not from the marketing.
  • In 2007 Carl Bass, at that time CEO at Autodesk, made his statement: “There are only three customers in the world that have a PLM problem; Dassault, PTC, and Siemens. There are no other companies that say I have a PLM problem”. Have a look here. PLM is understood by now, even by Autodesk. The statement illustrates that in the beginning the PLM target was not clear, and people thought PLM was a system instead of a strategic approach. Model-based ways of working have to go through the same learning path, hopefully faster.

Peter's presentation was a good walk-through, pointing out what exists, where we focus, and that there is still work to be done – not by vendors but by companies. Therefore I wholeheartedly agree with Peter's closing remarks – no time to sit back and watch if you want to benefit from model-based approaches.

Smart Manufacturing

Kenny Swope, known from his presentations related to Boeing, now spoke to us as the Chair of the ISO/TC 184/SC 4 workgroup related to Industrial Data. To say it in decoded mode: Kenny is heading Sub-committee 4, with a focus on Industrial Data. SC4 is part of a more prominent theme, Automation Systems and Integration, identified by TC 184, all as part of the ISO framework. The scope:

Standardization of the content, meaning, structure, representation and quality management of the information required to define an engineered product and its characteristics at any required level of detail at any part of its lifecycle from conception through disposal, together with the interfaces required to deliver and collect the information necessary to support any business or technical process or service related to that engineered product during its lifecycle.

Perhaps boring to read if you think about all the demos you have seen at trade shows related to Smart Manufacturing. However, if you want these demos to become true in a vendor-independent environment, you will need to agree on a common framework of definitions to ensure future continuity beyond the demo. And here lies the business excitement: the real competitive advantage companies can gain by implementing Smart Manufacturing in a scalable, future-oriented way.

One of the often-heard statements is that standards are too slow or incomplete. Incompleteness is not a problem: when there is a need, the standard will follow. Compare it with language; we will always invent new words for new concepts.

Being slow might have been the case in the past. Kenny showed the relatively fast convergence from country-specific Smart Manufacturing standards into a joint ISO/IEC framework – all within three years. ISO and IEC have already been teaming up to build Smart Manufacturing reference models.

This is already a considerable effort, as the local reference models need to be studied and mapped to a common architecture. The target is to have a first Technical Specification for a joint standard by the end of 2020 – quite fast!

Meinolf Gröpper from the German VDMA presented what they are doing to support Smart Manufacturing / Industrie 4.0. The VDMA is a well-known engineering federation with 3200 member companies, 85 % of them Small and Medium Enterprises – the power of the German economy.

The VDMA provides networking capabilities and readiness assessments for its members, to be the enabler for companies to transform. As Meinolf stated, Industrie 4.0 is not about technology; it is about cross-border services and international cooperation – a strategy that every company has to develop and, if possible, implement at its own pace. Standards will accelerate the implementation of Industrie 4.0.

The Smart Manufacturing session was concluded by Gunilla Sivard, Professor at KTH in Stockholm and Hampus Wranér, Consultant at Eurostep. They presented the work done on the DIgln project, targeting an infrastructure for Smart Manufacturing.

The presentation showed the implementation of the testbed using twittering bus communication and the ISO 10303-239 PLCS information standard as the persistent layer. The results were promising for building further capabilities on top of the infrastructure below:

The conclusion from the Smart Manufacturing session was that emerging and available standards can accelerate the deployment.

 

Enabling digital continuity in the Factory of the Future

Alcibiades Gonzalez-Noval from Airbus shared the challenges and the strategy for Airbus's factory of the future, based on digital continuity from the virtual world towards the physical world, connecting with PLM, ERP, and MOM – concepts many companies are currently working on, at various maturity stages.

I agree with his lessons learned. We cannot think in silos anymore in a digital future – everything is connected. And please forget the PoC; to gain time, start piloting and fail or succeed fast. Companies have lost years by just doing PoCs and never going into action. The last point, network segregation, is for sure an issue relevant for plant operations. I also experienced this in the past when promoting PLM concepts for (nuclear) owners/operators of plants. Network security is for sure an issue to resolve.

 

Cross-Discipline Lifecycle Collaboration Forum
Setting up the digital thread across engineering and the value chain.

Peter Gerber, Chairman of the CDLC Forum and Data Exchange & Integration Leader at Schaeffler, and Pierre Bodin, Senior Manager at Mews Partners, presented their findings related to the challenge of managing complex products (mechanical, electrical, software – using systems engineering methodology) properly, at affordable cost, in real-time, multidisciplinary and coordinated across the whole value chain. Something you might expect is already possible when reviewing all PLM vendors' marketing materials; something you might expect to be hard when remembering Martin Eigner's statement that 95 % of companies have not solved mechatronics collaboration yet. (See: The weekend after CIMdata PLM Roadmap and PDT Europe)

A demonstrator was defined, and various vendors participated in building it based on their out-of-the-box capabilities. The result showed that for all participants there were still gaps to resolve for full collaboration. A new version of the demonstrator is now planned for the middle of next year – curious to learn the results at that time. Multidisciplinary collaboration is a (conceptual) pillar for future digital business – it needs to be possible.

 

A Digital Thread based on the PLCS standard.

Nigel Shaw, Eurostep's managing director in the UK, took us through the evolution of PLCS (Product Life Cycle Support), an extension of the ISO 10303 STEP standard (STEP: STandard for the Exchange of Product model data). Nigel mentioned how, over all these years, millions (and a lot of brain power) have been invested in PLCS to bring it to where it is now.

PLCS has been extremely useful as an interface standard for contracting, providing product data in a neutral way. As an example, last year the Swedish Defense organization (FMV) and France's DGA made PLCS DEXs part of their contractual conditions. It would be too costly to have all product data for all defense systems in proprietary vendor formats, and this over the whole product lifecycle.

Those following the standards in the process industry will rely on ISO 15926 / CFIHOS, as this standard's dictionary and data model are more geared to process data – and in particular the exchange of data from the various contractors with the owner/operator.

Coming back to PLCS and the Digital Twin – it is all about digital continuity of information. If we have to recreate information in every lifecycle stage of a product (design / manufacturing / operations), it will be too costly and not digitally connected. This illustrates the growing need for standards. I had nothing to add to Nigel's conclusions:

It is interesting to note that product management has moved a long way over the last 10-20 years; however, as we include more and more into PLM, there are new concepts to be solved all the time. The cases we discuss today in our PLM communities were most of the time visions 10 years ago. Nowadays we want to include Model-Based Systems Engineering, 3D modeling and simulation, electronics and software, and even aftermarket and product support in true PLM. This was not the case 20 years ago. The people involved in the development of PLCS were for sure visionaries, as product data connectivity along the whole lifecycle is needed and enabled by the standard.

 

Investing in Industry 4.0?
Hard Realities of the Grand Vision.

Marc Halpern from Gartner is one of the regular speakers at the PDT conference. Unfortunately, he could not be with us that day; however, through a labor-intensive connection (a mobile phone close to the speaker and Nigel Shaw trying to stay in sync with the presented slides) we could hear Marc speak about what we wanted to achieve too – digital continuity.

Marc restated the massive potential of Industrie 4.0 when it comes to scalability, agility, flexibility, and efficiency.

Although technologies are evolving rapidly, it is the existing legacy that inhibits fast adoption – a topic that was also central in my presentation. It is not just a change in technology; there is much more connected to it.

Marc recommends a changing role for IT, where they should focus more on business priorities and business leadership strategies. This as opposed to the classical role of the IT organization: where IT used to support the business, now they will be part of leading the business too.

To orchestrate such an IT evolution, Marc recommends "systems of systems" planning and execution across IT and business. One of my recent blog posts, Moving to a model-based enterprise: The business (information) model, can be seen in that context.

How to deal with the incompatible future?

I was happy to conclude the sessions with the topic that concerns me the most at this time. Companies in their current business are already struggling to get aligned and coordinated between disciplines and external stakeholders; the gap to become connected is vast, as it requires a master data management approach, an enterprise data model and model-based ways of working. Read my posts from the past half year, starting here, and you get the picture.

Note: This image is based on Marc Halpern’s (Gartner) Technology/Maturity diagram from PDT 2015

I concluded with explaining that companies need to learn to work in two modes: one mode will be the traditional way of working, which I call the coordinated approach, combined with a growing focus on operating in a connected mode. You can see my full presentation here on SlideShare: How to deal with the incompatible future.

Conclusion

The conference was closed with a panel discussion where we shared our concerns related to the challenges companies face changing their traditional ways of working while entering a digital era. The positive points are there – baby steps – PLM is becoming understood, and the significance of standards is becoming clearer. The need: a long-term vision.

This concludes my review of an excellent conference – I learned a lot again, and I hope to see you next year too. Thanks again to CIMdata and Eurostep for organizing this event.


Last week I attended the long-awaited joint conference from CIMdata and Eurostep in Stuttgart. As I mentioned in earlier blog posts, I like this conference because it is a relatively small conference with a focused audience related to a chosen theme.

Instead of parallel sessions, all attendees follow the same track, and after two days there is a common understanding for all. This time there were about 70 people discussing the themes: Digitalizing Reality – PLM's role in enabling the digital revolution (CIMdata) and Collaboration in the Engineering and Manufacturing Supply Chain – the Extended Digital Thread and Smart Manufacturing (Eurostep).

As you can see, it is all about Digital. Here are my comments:

The State of the PLM Industry:
The Digital Revolution

Peter Bilello kicked off by providing an overview of the PLM industry. The PLM market showed an overall growth of 7.3 % toward 43.6 billion dollars. Zooming into the details, cPDM grew by 2.9 %. The significant growth came from the PLM tools (7.7 %). The Digital Manufacturing sector grew by 6.2 %. These numbers show, in my opinion, that managing collaboration in particular remains the challenging part of PLM. It is easier to buy tools than to invest in cPDM.

Peter mentioned that at the board level you cannot sell PLM, as this acronym is too much framed as an engineering tool. Also, people at the board have been trained to interpret transactional data and build strategies on that. They might embrace Digital Transformation; however, the product innovation domain is hard to define in numbers. What is the value of collaboration? How do you measure and value innovation coming from R&D? Recently we have seen more simplified approaches to getting more value from PLM. I agree with Peter: we need to avoid the PLM framing and find better consumable value statements.

Nothing to add to Peter’s closing remarks:

 

An Alternative View of the Systems Engineering “V”

For me, the most interesting presentation of Day 1 was Don Farr's. Don and his Boeing team worked on depicting the Systems Engineering process for a model-based environment. The original "V" looks like a linear process and does not reflect the multi-dimensional iterations at various stages, the concept of a virtual twin, and the various business domains that need to be supported.

The result was the diamond symbol above. Don and his team have created a consistent story related to the depicted diamond, which goes too far for this blog post. Currently, the diamond concept is copyrighted by Boeing, but I expect we will see more of it in the future, as the classical systems engineering "V" was not designed for our model-based view of the virtual and physical products to design AND maintain.

 

Sponsor vignette sessions

The vignette sponsors of the conference – Aras, ESI Group, Granta Design, HCL, Oracle and TCS – all got a ten-minute slot to introduce themselves and the topics they believed were relevant for the audience. These slots served as a teaser to come to their booth during a break. Interesting for me was Granta Design, who are bringing a complementary data service related to materials along the product lifecycle, providing digital continuity for material information. See below.

 

The PLM – CLM Axis vital for Digitalization of Product Process

Mikko Jokela, Head of Engineering Applications CoE at ABB, completed the morning sessions and left me with a lot of questions. Mikko's mission is to provide the ABB companies with an information infrastructure providing end-to-end digital services for the future, based on apps and platform thinking.

Apparently, the digital continuity will be provided by all kinds of BOM structures, as you can see below. In my post Coordinated or Connected, related to a model-based enterprise, I call this a coordinated approach, which is a current best practice, not an approach for the future. There we want a model-based enterprise instead of a BOM-centric approach to ensure a digital thread. See also Don Farr's diamond. When I asked Mikko which data standard(s) ABB will use to implement their enterprise data model, it became clear there was no concept in place yet. Perhaps an excellent opportunity to look at PLCS for the product-related schema.

A general comment: many companies are thinking about building their own platform, though not all will build it from scratch. For those starting from scratch: have a look at existing standards for your industry. And to manage the quality of data, you will need to implement Master Data Management, where for the product part the PLM system can play a significant role. See Master Data Management and PLM.

 

Systems of Systems Approach to Product Design

Professor Martin Eigner's keynote presentation was about how new products and markets need a Systems of Systems approach, combined with Model-Based Systems Engineering (MBSE) and Product Line Engineering (PLE), where the PLM system can be the backbone to support the MBSE artifacts in context. All these concepts require new ways of working, as stated below:

And this is a challenge. A quick survey in the room (coherent with my observations from the field) showed that most companies (95 %) have not even achieved integrated ways of working for mechatronics products. You can imagine the challenge of also incorporating software, simulation, and other business disciplines. Martin's presentations are always an excellent conceptual framework for those who want to dive deeper – a starting point for discussion and learning.

Additive Manufacturing (Enabled Supply) at Moog

Moog Inc, a manufacturer of precision motion controls for various industries, has made a strategic move towards Additive Manufacturing. Peter Kerl, Moog's Engineering Systems Manager, gave a good introduction to what is meant by Additive Manufacturing and how Moog is introducing it in their organization to create more value for their customer base and attract new customers in a less commodity-driven domain. As you can imagine, delivering products through Additive Manufacturing requires new skills (design / materials), new processes and a new organizational structure. And of course a new PLM infrastructure.

Jim van Oss, Moog's PLM Architect and Strategist, explained how they have been involved in a technology solution for digitally-enabled parts leveraging blockchain technology. Have a look at their VeriPart trademark. It was interesting to learn from Peter and Jim that they are actively working in a space that, according to Gartner's hype curve, is in the early transform phase. Peter and Jim's presentation was very educational for the audience.

For me, it was also interesting to learn from Jim that at Moog they were really practicing the two modes for PLM in their company: two PLM implementations, one with the legacy data – the wrong data for the future – and one with the new data model for the future, both built on the same PLM vendor's release. A great illustration showing that the past and the future data for PLM are not compatible.

Value Creation through Synergies between PLM & Digital Transformation

Daniel Dubreuil, Safran's CDO for Products and Services, gave an entertaining lecture related to Safran's PLM journey and the introduction of new digital capabilities, moving from an inward-facing PLM system towards a digital infrastructure supporting internal (model-based systems engineering / multiple BOMs) and external collaboration with their customers and suppliers, introducing new business capabilities. Daniel gave a very precise walk-through with examples from the real world. The concluding slide, KEY SUCCESS FACTORS, was a slide that we have seen so many times at PLM events.

Apparently, the key success factors are known. However, most of the time one or more of these points cannot be addressed, for various reasons. Then the question is: how to mitigate this risk, as there will be issues ahead?

 

Bringing all the digital trends together. What’s next?

The day ended with a virtual fireside session between Peter Bilello and Martin Eigner. The audience did not see a fireplace; however, my augmented Twitter feed did it for me:

Some interesting observations from this dialogue:

Peter: “Having studied physics is a good base for understanding PLM as you have to model things you cannot see” – As I studied physics I can agree.

Martin: "Germany is the center of knowledge for mechanical, the US for electronics, and now China is becoming the center for electronics and software" – an interesting observation illustrating where the innovation will come from.

Both Peter and Martin spent serious time on the importance of multidisciplinary education. We are teaching people in silos, and faculties work in silos. We all believe these silos must be broken down. It is hard to learn and experiment with skills for the future. Where to start, and who will lead?

Conclusion:

The PLM Roadmap day had some exciting presentations and, combined with CIMdata's PLM update, was an excellent opportunity to learn and discuss reality, in particular for new methodologies and technologies beyond the hype. I want to thank CIMdata for the superb organization and for allowing me to take part. Next week I will follow up with a review of the PDT Europe conference part (Day 2).

 

 

The digital thread according to GE

In my earlier posts, I have explored the incompatibility between current PLM practices and future needs for digital PLM. Digital PLM is one of the terms I am using for future concepts. Actually, in a digital enterprise, system borders become vague; it is more about connected platforms and digital services. Current PLM practices can be considered Coordinated, where the future of PLM is aiming at Connected information. See also Coordinated or Connected.

Moving from current PLM practices towards modern ways of working is a transformation for several reasons.

  • First, because the scope of current PLM implementations is most of the time focused on engineering. Digital PLM aims to offer product information services along the whole product lifecycle.
  • Second, because the information in current PLM implementations is mainly stored in documents – drawings still being the leading information carrier. In advanced PLM implementations, the BOM structures, the EBOM and MBOM, are information structures, again relying on related specification documents, either CAD or Office files.

So let's review the transformation challenges related to moving from current PLM to digital PLM.

Current PLM – document management

The first PLM implementations were most of the time advanced cPDM implementations, targeting the sharing of CAD models and drawings. Deployments started in the engineering department with the aim to centralize product design information. Integrations with mechanical CAD systems had the highest priority, including engineering change processes. Multidisciplinary collaboration was enabled by introducing the concept of the Engineering Bill of Materials (EBOM): every discipline – mechanical, electrical and sometimes (embedded) software teams – linked their information to the EBOM. The product release process was driven by the EBOM: if the EBOM is released, the product is fully specified and can be manufactured.

Although people complain that implementing PLM is complex, this type of implementation is relatively simple. The only added mental effort you are demanding from the PLM user is to work in a structured way, with a more controlled (rigid) way of working compared to a directory-structure approach. For many people, this controlled way of working is already considered a limitation of their freedom. However, companies are not profitable because their employees are all artists working in full freedom. They become successful if they can deliver products with consistent quality in an efficient way. In a competitive, global market there is no room anymore for inefficient ways of working, as labor costs add to the price.

The way people work in this cPDM environment is coordinated, meaning that, based on business processes, the various stakeholders agree to offer complete sets of information (read: documents) that contribute to the full product definition. Whether all contributions are consistent depends on the time and effort people spend to verify and validate consistency. Often this is not done thoroughly, and errors are only discovered during manufacturing or later in the field. Costly, but accepted as it has always been the case.

Next Step PLM – coordinated document management / item-centric

When the awareness exists that data needs to flow through an organization in a consistent manner, the next step of PLM implementations comes into the picture. Here I would state we are really talking about PLM, as the target is to share product data outside the engineering department.

The first logical extension for PLM is moving information from an EBOM view (engineering) towards a Manufacturing Bill of Materials (MBOM) view. The MBOM aims to represent the manufacturing definition of the product and becomes a placeholder to link directly with the ERP system and suppliers. Having an integrated EBOM / MBOM process with your ERP system is already a big step forward, as it creates an efficient way of working to connect engineering and manufacturing.

As all the information is now related to the EBOM and MBOM, this approach is often called the item-centric approach. The Item (or Part) is the information carrier linked to its specification documents.
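To make the item-centric approach tangible, below is a minimal sketch in Python – not any vendor's actual data model; all names and numbers are invented – showing the Item as the information carrier, with linked documents, and an EBOM and MBOM view built from the same items.

```python
# A minimal sketch of the item-centric approach: the Item/Part carries the
# information, documents are linked to it, and the EBOM / MBOM are
# structures built from the same items.
from dataclasses import dataclass, field

@dataclass
class Document:
    number: str
    doc_type: str          # e.g. "CAD model", "drawing", "specification"
    revision: str = "A"

@dataclass
class Item:
    number: str
    description: str
    documents: list = field(default_factory=list)   # linked specifications
    children: list = field(default_factory=list)    # (child_item, quantity)

    def link(self, doc: Document):
        self.documents.append(doc)

    def add_child(self, child: "Item", qty: int = 1):
        self.children.append((child, qty))

# EBOM view: the engineering definition
housing = Item("P-100", "Pump housing")
housing.link(Document("D-100", "CAD model"))
pump = Item("P-001", "Pump assembly")
pump.add_child(housing, 1)

# MBOM view: the same items, restructured for manufacturing, e.g. adding
# process-related parts (packaging) that engineering never sees
packaging = Item("P-900", "Packaging set")
mbom_pump = Item("P-001-M", "Pump assembly - manufacturing view")
mbom_pump.add_child(housing, 1)
mbom_pump.add_child(packaging, 1)
```

The point of the sketch: the housing item exists once and appears in both views; only the structures around it differ per discipline.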

 

Managing the right version of the information in relation to a specific version of the product is called configuration management. The better you have your configuration management processes in place, the more efficiently and with higher confidence you can deliver and support your products. Configuration management is again a typical example of a coordinated approach to managing products and documents.

Implementing this type of PLM is already more complex, as it needs different disciplines to agree on a collective process across various (enterprise) systems. ERP integrations are technically not complicated; it is the agreement on a leading process that makes it difficult, as the holistic view is often lacking.

Next, next step PLM – the Digital Thread

Reading on might give you the impression that the next step in PLM evolution is the digital thread. And this can be the case, depending on your definition of the digital thread. Oleg Shilovitsky recently published an article, Digital Thread – A new catchy phrase to replace PLM?, related to his observations from ConX18, illustrating that there are many viewpoints on this concept. And of course, some vendors promote their perfect fit based on their unique definition. In general, I would classify the idea of the Digital Thread in two approaches:

The Digital Thread – coordinated

In the Digital Thread – coordinated approach, we are not revolutionizing the way of working in an enterprise. In the coordinated approach, the PLM environment is connected with another overlay, combining data from various disciplines into an environment where the dependencies are traceable. This can be the Aras overlay approach (here explained by Oleg Shilovitsky), the PTC Navigate approach or others, using a new extra layer to connect the various discipline data and create traceability in a more or less non-intrusive way. Similar concepts, but even less intrusive, can be achieved through Business Intelligence applications, although they are more read-only than a system approach.
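As an illustration of the overlay idea, here is a minimal sketch – invented names, not how Aras or PTC actually store their data: each discipline keeps its own records, and the thread itself is only a set of links between records it does not own.

```python
# Each discipline keeps its own data source (requirements management,
# PLM, test management); the overlay never copies their content.
sources = {
    "RM":  {"REQ-12": "Max operating temperature 85 C"},
    "PLM": {"P-100": "Pump housing"},
    "TST": {"T-77": "Thermal test - passed"},
}

# The overlay layer only stores links: (system, id) -> linked (system, id)
digital_thread = {
    ("RM", "REQ-12"): [("PLM", "P-100")],   # requirement realized by part
    ("PLM", "P-100"): [("TST", "T-77")],    # part verified by test
}

def trace(system, rid, depth=0):
    """Walk the thread downstream from one record, across systems."""
    print("  " * depth + f"{system}:{rid} - {sources[system][rid]}")
    for s, r in digital_thread.get((system, rid), []):
        trace(s, r, depth + 1)

trace("RM", "REQ-12")
# RM:REQ-12 - Max operating temperature 85 C
#   PLM:P-100 - Pump housing
#     TST:T-77 - Thermal test - passed
```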

The Digital Thread – connected

In the Digital Thread – connected approach, the idea is that information is stored in an extremely granular way and shared among disciplines. Instead of the coordinated way, where every discipline can have its own data sources, here the target is to be data-driven (neutral/standard formats). I described this approach in the various aspects of the model-based enterprise. The challenge of a connected enterprise is the standardized data definition that makes data available for all stakeholders.
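To contrast with the overlay sketch above, here is an equally minimal sketch of the connected idea: granular data elements in one shared, neutral structure that every discipline reads and writes. Again, all names and values are invented for illustration.

```python
# One shared, granular model: no discipline-specific copies to reconcile.
shared_model = {
    "P-100.max_temp":  {"value": 85, "unit": "C", "source": "requirements"},
    "P-100.material":  {"value": "AlSi10Mg", "source": "engineering"},
    "P-100.unit_cost": {"value": 12.40, "unit": "EUR", "source": "procurement"},
}

# Every discipline consumes the same element; there is nothing to convert
# or re-enter, which is where the efficiency of "connected" comes from.
for key, element in shared_model.items():
    unit = element.get("unit", "")
    print(f"{key} = {element['value']} {unit} ({element['source']})")
```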

Working in a connected enterprise is extremely difficult, in particular for people educated in the old-fashioned ways of working. If you have learned to work with shared documents, like Google Docs or Office documents in sharing mode, you will understand the mental change you have to go through: continuously sharing information instead of waiting until you feel your part is complete.

In the software domain, companies are used to working this way and integrating data in a continuous stream. We have to learn to apply these practices to the complete product lifecycle, where the product consists of hardware and software.

Still, the connected way of working is the vision digital enterprises should aim for, as it dramatically reduces information conversion, overhead, and ambiguity. How we will implement it in the context of PLM / product innovation is a learning process, where we should not be blocked by our echo chamber, as Jan Bosch states in his latest post: Don't Get Stuck In Your Company's Echo Chamber.

Jan Bosch comes from the software world, promoting the Software-Centric Systems Conference SC2 as a conference to open up your mind. I recommend you take part in upcoming PLM-related events: CIMdata's PLM Roadmap Europe combined with PDT Europe on October 24/25th in Stuttgart, or, if you live in the US, the upcoming PI PLMx CHICAGO 2018 on Nov 5/6th.

Conclusion

Learning and understanding are crucial and take time. A digital transformation has many aspects to learn – keep in mind the difference between coordinated (relatively easy) and connected (extraordinarily challenging but promising). Unfortunately, there is no populist way to become digital.

Note:
If you want to continue learning, please read this post – The True Impact of Industry 4.0 Revealed  -and its internal links to reference information from Martijn Dullaart – so relevant.

 

What I want to discuss this time is the challenging transformation related to product data that needs to take place.

The top image of this post illustrates the current PLM world on the left and, on the right, the potential future positioning of PLM in a digital enterprise. How the right side will behave is still vague – it can be a collection of platforms or a vast collection of small services, all contributing to the performance of the company. Some vendors might dream that all these capabilities are defined in one system of systems, like the human body: all functions are available and connected.

Coordinated or connected?

This is THE big question for a future digital enterprise. In the current PLM approach, there are governance structures that allow people to share data along the product lifecycle in a structured way.

These governance structures can be project breakdown structures, where a phase-gate approach guides the full delivery. Deliverables related to tasks and gates make sure information is stored and available for every stakeholder. For example, a well-known process in the automotive industry, the Advanced Product Quality Planning (APQP) process, is a standardized approach to make sure parts or products are introduced with the right quality for the customer.

Deliverables at any stage in the process can be reviewed or consumed by another stakeholder. The result is most of the time a collection of approved documents (Office-type, Design & Test files) stored centrally. This is what I would call a coordinated data approach.

In complex environments, besides the project governance, there will be product structures and Bills of Materials, where each object in such a structure is the placeholder for related information. In the case of a product structure, that can be the specifications per component; in the case of a Bill of Materials, the design specification (usually CAD models); and in the case of an MBOM, the manufacturing specifications.

An example of structures used in Enovia

Although these structures contain information about the product composition themselves, it is the related information that makes the content understandable/realizable.

Again it is a coordinated approach, and most PLM systems and implementations are focusing on providing these structures.

Sometimes this is done within their own system only – you need to follow the vendor portfolio to get the full benefit – or sometimes the system is positioned as an overlay to existing systems in the company, and is therefore less invasive.

Presentation from Martin Eigner – explaining the overlay concept based on Aras

Providing the single version of the truth is often associated with this approach. The question is: is the green bin on the left the single version of the truth?

The Coordinated – Single Version of the Truth – problem

The challenge of a coordinated approach is that there is no thorough consistency checking whether the delivered data represents the real truth. Through serious review procedures, we do our best to make sure every deliverable has the required content and quality. As the information inside these deliverables is not connected to the outside world, there will be discrepancies between reality and what has been stored. Still, we feel comfortable enough as an organization to pretend we know where the risks are. Until the costly impossible happens!

The connected enterprise

The ultimate dream of a digital enterprise is that everything relevant is connected in context. This means no more documents or files but a very granular information model for linking data and keeping it in context. We can apply algorithms and automation to connected data and use Artificial Intelligence to make sense of massive amounts of data.

Connected data allows us to share combined sets of information that are relevant to a particular role. Real-time dashboarding is one of the benefits of such an infrastructure. There are still a lot of challenges with this approach. How do we know which information is valid in the context of other information? What are the rules that describe a valid product or project baseline at a particular time?

Although all data is stored as unique information objects in a network of information, we cannot apply the old mechanisms of a coordinated approach all the time. Still, generated reports from a connected environment can serve as baselines or records related to a specific state; for example, when the design is approved for manufacturing, we can generate approved product baseline structures or Bill of Materials structures.
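A small sketch of what such a generated baseline could look like – the statuses and items are invented; the point is that the baseline is a derived snapshot of the network, not a stored document.

```python
# Deriving a coordinated-style record from connected data: the baseline
# is generated as a snapshot of the network for a given context.
from datetime import date

connected_data = [
    {"id": "P-100", "rev": "B", "status": "approved-for-manufacturing"},
    {"id": "P-200", "rev": "A", "status": "in-work"},
    {"id": "P-300", "rev": "C", "status": "approved-for-manufacturing"},
]

def baseline(data, status):
    """Generate a baseline: the subset of the network valid for a context."""
    return {
        "date": date.today().isoformat(),
        "context": status,
        "items": [(d["id"], d["rev"]) for d in data if d["status"] == status],
    }

print(baseline(connected_data, "approved-for-manufacturing"))
# {'date': '...', 'context': 'approved-for-manufacturing',
#  'items': [('P-100', 'B'), ('P-300', 'C')]}
```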

However, this linearity in the lifecycle for passing information through an enterprise will not exist anymore. There might be various design alternatives, and the delivery process is already part of the design phase. Through integrated virtual simulation and testing, we reach a state where the product satisfies the market for that moment, and the delivery process is known at the same time.

Almost immediately, and based on first experiences from the field, new features can be added, virtually tested and validated for the next stage. We need to design new PLM infrastructures that can support this granularity and therefore this complexity.

The connected – Single Version of the Truth – problem

The concepts I described related to the connected enterprise made me realize that this is analogous to how the brain works. Our brain is a giant network of connected information, dynamically maintaining associations, having different abstraction levels and always pretending there is one truth.

If you want to understand a potential model of the brain, please read On Intelligence by Jeff Hawkins. With the possible arrival of the quantum computer, we might be able to create performing brain models.

In my earlier post, Are we blocking our future, I referred to the book The Idiot Brain: What Your Head is Really Up To by Dean Burnett, where Dean states that, due to the complexity of stored information, our brain continuously adapts "non-compliant" information to make sure the owner of the brain feels comfortable.

What we think is the truth might be just a creation of the brain, combining the positive parts into a compelling story and suppressing or deleting information that does not fit. Although it sounds absurd, I believe that if we are able to create a connected digital enterprise, we will face the same symptoms. Due to the complexity of connected information, we will be looking for the best suitable version, and as everything has become so complex, ordinary human beings will no longer be able to distinguish this.

 

Conclusion:

As part of the preparation for the upcoming PDT Europe 2018, I was investigating the topics of the coordinated and connected enterprise to discover potential transformation steps. We all need to explore the future with an open mind, and the challenge is: WHERE and HOW FAST can we transform from coordinated to connected? I am curious if you have experiences or thoughts on this topic.

 

 

As I am preparing my presentation for the upcoming PDT Europe 2017 conference in Gothenburg, I was reading relevant experiences related to a data-driven approach. During the PDT Europe conference we will share and discuss the continuous transformation of PLM to support the Lifecycle Model-Based Enterprise.

One of the direct benefits is that a model-based enterprise allows information to be shared without the need for documents to be converted to a particular format, therefore saving costs for resources and bringing unprecedented speed of information availability, like what we are used to having in a modern digital society.

For me, a modern digital enterprise relies on data coming from different platforms/systems and the data needs to be managed in such a manner that it can serve as a foundation for any type of app based on federated data.

This statement implies some constraints. It means that data coming from various platforms or systems must be accessible through APIs / Microservices or interfaces in an almost real-time manner. See my post Microservices, APIs, Platforms and PLM Services. Also, the data needs to be reliable and understandable for machine interpretation. Understandable data can lead to insights and predictive analysis. Reliable and understandable data allows algorithms to execute on the data.
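As a sketch of what this could look like for an app consuming federated data – the endpoint and the JSON shape below are hypothetical, not an existing API; the point is that apps read live data through a service instead of working from exports.

```python
# A hypothetical microservice call returning part master data as JSON.
import json
from urllib.request import urlopen

def get_part(part_number: str) -> dict:
    # Invented endpoint for illustration; a real platform would publish
    # its own API contract for this.
    url = f"https://plm.example.com/api/v1/parts/{part_number}"
    with urlopen(url) as response:
        return json.load(response)

# Would work only against a real endpoint:
# part = get_part("P-100")
# print(part["status"], part["revision"])   # usable by any app, near real-time
```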

Classical ECO/ECR processes can become highly automated when the data is reliable and the company's strategy is captured in rules. In a data-driven environment, there will be much more granular data that requires some kind of approval status. We cannot do this manually anymore, as it would kill the company – too expensive and too slow. Therefore the need for algorithms.
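A minimal sketch of such rules – the thresholds and field names are invented for illustration; a real company strategy would of course be richer. Each rule is a predicate over the change data, and a change that passes all rules needs no human approval.

```python
# Strategy captured in rules: evaluate an engineering change (ECO) and
# decide whether it can be auto-approved or needs human review.
RULES = [
    ("cost impact below threshold", lambda eco: eco["cost_delta"] < 500),
    ("no safety-critical parts",    lambda eco: not eco["safety_critical"]),
    ("form-fit-function unchanged", lambda eco: not eco["fff_change"]),
]

def route(eco: dict) -> str:
    failed = [name for name, check in RULES if not check(eco)]
    return "auto-approved" if not failed else f"human review: {failed}"

print(route({"cost_delta": 120, "safety_critical": False, "fff_change": False}))
# auto-approved
print(route({"cost_delta": 900, "safety_critical": False, "fff_change": True}))
# human review: ['cost impact below threshold', 'form-fit-function unchanged']
```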

What is understandable data?

I tried to avoid academic language as long as possible, but now we have to be more precise, as we enter the domain of master data management. I was triggered by this recent post from Gartner: Gartner Reveals the 2017 Hype Cycle for Data Management. There are many topics in the hype cycle, and it was interesting to see that Master Data Management is starting to be taken seriously after going through inflated expectations and disillusionment.

This was interesting, as two years ago we had a one-day workshop preceding PDT Europe 2015, focusing on Master Data Management in the context of PLM. The attendees at that workshop, coming from various companies, agreed that there was no real MDM for the engineering/manufacturing side of the business. MDM was more or less hijacked by SAP and other ERP-driven organizations.

Looking back, it is clear to me why MDM was not a real topic in the PLM space at that time. We were still too much focused – and are again too much focused – on information stored in files and documents. The only area touched by MDM was the BOM and Part definitions, as these objects also touch the ERP and After Sales domains.

Actually, there are various MDM concepts, and I found an excellent presentation from Christopher Bradley explaining the different architectures on SlideShare: How to identify the correct Master Data subject areas & tooling for your MDM initiative. In particular, I liked the slide below, as it comes close to my experience in the process industry.

Here we see two MDM architectures, the one on the left driven from ERP. The one on the right could be based on the ISO 15926 standard, as the process industry has worked for over 25 years to define a global exchange standard and data dictionary. The process industry was able to reach such a maturity level due to the need to support assets for many years across the lifecycle and its relatively stable environment. Other sectors are less standardized, or depend so much on new concepts that it would be hard to have an industry-specific master.

PLM as an Application Specific Master?

If you were to start an MDM initiative in your company today and look for providers of MDM solutions, you would discover that their value propositions are based on technology capabilities: bringing data together from different enterprise systems in the way the customer thinks it should be organized. More a toolkit approach than an industry approach. And in the cases where there is an industry approach, it is rare that this approach is related to manufacturing companies. Remember my observation from 2015: manufacturing companies do not have MDM activities related to engineering/manufacturing because it is too complicated – too diverse, too many documents instead of data.

Now, with modern digital PLM, there is a need for MDM to support the full digital enterprise. Therefore, when you combine the previous observations with a recent post on Engineering.com from Tom Gill, PLM Initiatives Take On Master Data Transformation, I come to a new hypothesis:

For companies with a model-based approach that have no MDM in place, the implementation of their Product Innovation Platform (modern PLM) should be based on the industry-specific data definition for their industry.

Tom Gill explains in his post the business benefits and value of using PLM as the source for an MDM approach. In particular, in modern PLM environments, the PLM data model is not only based on the BOM. PLM now encompasses the full lifecycle of a product instead of the initial, more engineering-centric view. Modern PLM systems, or as CIMdata calls them, Product Innovation Platforms, manage a complex data model based on a model-driven approach. These entities are used across the whole lifecycle and could therefore be the best start for an industry-specific MDM approach. Now only the industries have to follow….
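A small sketch of the hypothesis: the PLM data model enforcing an industry-specific master data definition. The attribute dictionary below is invented for illustration; in practice it would come from a standard such as PLCS or an industry dictionary.

```python
# An industry-specific master definition used to validate part master data
# before it is shared with other enterprise systems.
MASTER_DEFINITION = {
    "part": {
        "required": ["part_number", "description", "lifecycle_state"],
        "states": ["concept", "released", "obsolete"],
    }
}

def validate(record: dict, entity: str = "part") -> list:
    spec = MASTER_DEFINITION[entity]
    issues = [f"missing: {a}" for a in spec["required"] if a not in record]
    if record.get("lifecycle_state") not in spec["states"]:
        issues.append("invalid lifecycle_state")
    return issues

print(validate({"part_number": "P-100", "lifecycle_state": "released"}))
# ['missing: description']
```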

Once data is able to flow, there will be another discussion: who is responsible for which attributes? Bjørn Fidjeland from plmPartner recently wrote: Who owns what data when …? The content of his post is relevant; I would only change the title to Who is responsible for what data when, as I believe that in a modern digital enterprise there is no ownership anymore – it is about sharing and responsibilities.

 

Conclusion

Where MDM in the past did not really focus on engineering data due to the classical document-driven approach, in modern PLM implementations the master data model might be based on the industry-specific data elements, managed and controlled from the PLM data model.

 

Do you follow my thoughts / agree?

 

 

In my previous post describing the various facets of the EBOM, I mentioned classification several times as an important topic related to the PLM data model. Classification is crucial to support people in reusing information and, in addition, there are business processes that are only relevant for a particular class of information, so it is not only related to search/reuse support.

In 2008, I wrote a post about classification, which you can read here. Meanwhile, the world has moved on, and I believe more modern classification methods exist.

Why classification?

First of all, classification is used to structure information and to support retrieval of the information at a later moment, either for reuse or for reference later in the product lifecycle. Related to reuse, companies can save significant money when parts are reused. It is not only the design time or sourcing time that is reduced. Additional benefits are lower risks for errors (fewer discoveries), reduced process and approval time (human overhead), reduced stock (if applicable), more volume discount (if applicable) and reduced End-Of-Life handling.

An interesting discussion about reuse, started by Joe Barkai, can also be found on LinkedIn here, including interesting comments.

Classification can also be used to control access to certain information (mainly document classification), or classification can be used to make sure certain processes are followed, e.g. export control, hazardous materials, budget approvals, etc. Although I will speak mainly about part classification in this post, classification can be used for any type of information in the PLM data model.

Classification standards

Depending on the industry you are working in, there are various classification standards for parts. When I worked in the German-speaking countries (the DACH-Länder), the most discussed classification at that time was DIN 4000 (Sachmerkmal-Liste), a must-have standard for many small and medium-sized manufacturing companies. The DIN 4000 standard has a predefined part hierarchy and describes the necessary properties per class. I had not met a similar standard in other countries at that time.

Another very generic classification I have seen is the UNSPSC standard, again a hierarchical classification supporting everything in the universe, but with no definition of attributes.

Other classification standards, like ISO 13399, RosettaNet, ISO 15926 and IFC, exist to support collaboration and/or the supply chain, when you want to exchange data with other disciplines or partners. The advantage of a standard definition (with attributes) is that you can exchange data with less human processing (saving labor costs and time – the benefit of a digital enterprise).

I will not go deeper into the various standards here, as I am not the expert for all of them. Every industry has its own classification standards: a hierarchical standard and, if more advanced, a hierarchy also supported by attributes related to each class. But let's go into the data model part.

Classification and data model

The first lesson I learned when implementing PLM was that you should not build your classification hard-coded into the PLM data model. When working with SmarTeam, it was very easy to define part classes and attributes to inherit. Some customers had more than 300 classes represented in their data model just for parts. You can imagine that it looks nice in a demo. However, when it comes to reality, a hard-coded classification becomes a pain in the model. (left image, one of the bad examples from the past)

1 – First of all, classification should be dynamic and easy to extend.

2 – The second problem with a hard-coded classification is that once a part is defined for the first time, the information object has a fixed class. Later changes need a lot of work (relinking of information / approval processes for the new information).

3 – Finally, the third point against a hard-coded classification is that parts will likely be classified according to different classifications at the same time. The image below shows such a multiple classification.


So the best approach is to have a generic part definition in your data model and perhaps a few subtypes. Companies still tend to differentiate between hardware (mechanical / electrical) parts and software parts.

Next, a part should be assigned to at least one class, and the assignment to this class would bring more attributes to the part. Most PLM systems that support classification have the ability to navigate through a class hierarchy and find similar parts.

When parts are relevant for ERP, they might belong to a manufacturing parts class, which adds the particular attributes required for a smooth PLM – ERP link. Manufacturing part types can be used as templates to be completed for ERP.
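A minimal sketch of this flexible approach – classes as data instead of hard-coded subtypes, class assignment bringing attribute groups, and multiple classification of the same part (all names invented for illustration):

```python
# Classification as data: a generic Part, with classes defined as
# attribute dictionaries rather than as subclasses in the data model.
classes = {
    "bearing":            {"inner_diameter": None, "load_rating": None},
    "manufacturing-part": {"make_or_buy": None, "preferred_supplier": None},
}

class Part:
    def __init__(self, number: str, description: str):
        self.number = number
        self.description = description
        self.classifications = {}   # class name -> attribute values

    def classify(self, class_name: str, **attrs):
        # Assigning a class adds its attribute group; reclassifying later
        # touches only this dictionary, not the part object itself.
        values = dict(classes[class_name])
        values.update(attrs)
        self.classifications[class_name] = values

p = Part("P-100", "Deep groove ball bearing")
p.classify("bearing", inner_diameter=20, load_rating=5000)
p.classify("manufacturing-part", make_or_buy="buy")  # multiple classification
print(p.classifications)
```

Note how extending the classification means adding an entry to the `classes` dictionary, not changing the data model – which addresses all three points above.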

This concept is also shared by Ed Lopategui in a comment on my earlier post about EBOM part types. Ed states:

Think part of the challenge moving forward is we’ve always handled these as parts under different methodologies, which requires specific data structures for each, etc. The next gen take on all this needs to be more malleable perhaps. So there are just parts. Be they service or make/buy or some combination – say a long lead functional standard part and they would acquire the properties, synchronizations, and behaviors accordingly. People have trouble picking the right bucket, and sometimes the buckets change. Let the infrastructure do the work. That would help the burden of multiple transitions, where CAD BOM to EBOM to MBOM to SBOM eventually ends up in a chain of confusion.

I fully agree with his statement and consider this the future trend of modern PLM: shared data that will be enriched by different usage through the lifecycle.

Why don’t we classify all data in PLM?

There are two challenges for classification in general.

  • The first one is that the value of classification only becomes visible in the long term, and I have seen several young companies that were only focusing on engineering. No metadata in the file properties, no part-centric data management structure – and several years later they face a lack of visibility of what has been done in the past. Only if one of the engineers remembers a similar situation is there a chance of reuse.
  • The second challenge is that, through a merger or acquisition, the company suddenly has to manage two classifications. If the data model was clean (no hard-coded subclasses), there is hope of merging the information. Otherwise, it might become a painful activity to discover similarities.

SO THINK AHEAD EVEN IF YOU DO NOT SEE THE NEED NOW!

Modern search based applications

There are ways to improve classification and reuse by using search-based applications, which can index archives and try to find similarity in properties/attributes. Again, if the engineers never filled in the properties in the CAD model, there is little to nothing to recover, as I experienced in a customer situation. My US PLM peer, Dick Bourke, wrote several articles about search-based applications and classification for engineering.com, which are interesting to read if you want to learn more: Useful Search Applications for Finding Engineering Data.

So much to discuss on this topic; however, I have reached my 1000 words again.

Conclusion

Classification brings benefits for reuse and discovery of information, although the benefits are long-term. Think long-term too when you define classifications. Keep the data model simple and add attribute groups to parts based on functional classifications. This enables a data-driven PLM implementation where the power is in the attributes, no longer in the part number. In the future, search-based applications will offer a quick start to classify and structure data.

 

Someone notified me that not everyone subscribed to my blog will necessarily read my posts on LinkedIn. Therefore, in the upcoming weeks, I will repost here some of my more business-oriented posts from LinkedIn. This post is from July 3rd and is an introduction to all the methodology posts I am currently publishing.


The importance of a (PLM) data model

What makes it so hard to implement PLM in a correct manner, and why is this often a mission impossible? I have been asking myself this question again and again for the past ten years. For sure, a lot has to do with the culture and legacy every organization has. Imagine if a company could start from scratch with PLM. How would they implement PLM nowadays?

My conclusion for both situations is that it all leads to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you make compromises here, it has an effect on your implementation, on the way processes are supported out-of-the-box by a PLM system, and on how information can be shared with other enterprise systems, in particular ERP. PLM is written between parentheses as I believe in the future we will no longer talk about PLM or ERP separately – we will talk business.

Let me illustrate this academic statement.

A mid-market example

When I worked with SmarTeam in the nineties, the system was designed more as a PDM system than a PLM system. The principal objects were Projects, Documents, and Items. The Documents had a sub-grouping in Office documents and CAD documents. And the system had a single lifecycle, which was very basic and designed for documents. Thanks to the flexibility of the system, you could quickly implement a satisfactory environment for the engineering department. Problems (and customizations) came when you wanted to connect the data to the other departments in the company.

The sales and marketing department defines and sells products. Products were not part of the initial data model, so people misused the Project object for that. To connect to manufacturing, a BOM (Bill of Materials) was needed. As the connected 3D CAD system generated a structure while saving the assemblies, people started to consider this structure as the EBOM. This might work if your projects are mechanical only.

However, a Document is not the same as a Part. A Document has a completely different behavior than a Part. Documents have continuous iterations, with a check-in/check-out mechanism, whereas the Part definition remains unchanged and meanwhile reaches a higher maturity.

The correct approach is to have an EBOM Part structure, where Parts connect to the Documents. And yes, Documents can also have a structure, but it is not a BOM. SmarTeam implemented this around 2004. Meanwhile, a lot of companies had implemented their own custom solution for the EBOM through customization, not matching this approach. This created a first level of legacy.
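A minimal sketch of these different behaviors – not SmarTeam's actual implementation, all names invented – with Documents iterating through check-in/check-out while Parts keep a stable identity and only increase in maturity:

```python
# Document: iterates continuously via check-in/check-out.
class Document:
    def __init__(self, number):
        self.number, self.iteration, self.checked_out = number, 1, False

    def check_out(self):
        self.checked_out = True

    def check_in(self):
        self.checked_out = False
        self.iteration += 1          # documents iterate continuously

# Part: stable identity, only the maturity state changes.
class Part:
    MATURITY = ["in-work", "released", "obsolete"]

    def __init__(self, number):
        self.number, self.state = number, "in-work"
        self.documents, self.children = [], []   # EBOM: Parts link Documents

    def promote(self):
        self.state = self.MATURITY[self.MATURITY.index(self.state) + 1]

part = Part("P-100")
drawing = Document("D-100")
part.documents.append(drawing)       # the Part connects to its Documents
drawing.check_out(); drawing.check_in()
part.promote()
print(part.state, drawing.iteration)  # released 2
```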

When SmarTeam implemented Part behavior, it became possible to create a multidisciplinary EBOM, and the next logical step was, of course, to connect the data to the ERP system. At that time, most implementations pushed the EBOM to the ERP system and let it live there further. ERP was the enterprise tool, SmarTeam the engineering tool. The information became disconnected in an IT manner. Applying changes and defining a manufacturing BOM was done manually in the ERP system and could be done by (experienced) people that do not make mistakes.

The next challenge comes when you want to automate the connection to ERP. In that case, it becomes apparent that the EBOM and MBOM should reside in the same system (see this old and still actual post with comments: Where is the MBOM?) – in one system to manage changes and to be able to implement these changes quickly without too much human intervention. And as the EBOM is usually created in the PLM system, the (commercial/emotional) PLM-ERP battle started. "Who owns the part definition?" and "Who owns the MBOM definition?" became the topics of many PLM implementations. The real questions should be: "Who is responsible for which attributes of the Part?" and "Who is responsible for which part of the MBOM definition?" – as data should be shared, not owned.
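A small sketch of that last point – responsibility assigned per attribute instead of one system owning the whole Part record; the assignments below are only an example of how such a split could look:

```python
# Shared part definition with per-attribute responsibility instead of
# whole-record ownership by one system.
PART_ATTRIBUTE_RESPONSIBILITY = {
    "part_number":   "PLM",   # created once, shared everywhere
    "description":   "PLM",
    "material":      "PLM",
    "standard_cost": "ERP",
    "safety_stock":  "ERP",
}

def responsible_system(attribute: str) -> str:
    return PART_ATTRIBUTE_RESPONSIBILITY[attribute]

print(responsible_system("material"))       # PLM
print(responsible_system("standard_cost"))  # ERP
```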

The SmarTeam evolution shows how a changing scope and an incomplete/incorrect data model lead to costly rework when aligning to the mainstream. And this is happening with many implementations and other PLM systems, in particular when the path is to grow from PDM to PLM. An important question remains: what is going to be mainstream in the future? More on that in my conclusion.

A complex enterprise example

In recent years, I have been involved in several PLM discussions with large enterprises. These enterprises suffer from their legacy. Often the original data management was not defined in an object-oriented manner, and the implementation has been expanding with connected and disconnected systems like a big spaghetti bowl.

The main message most of the time is:

“Don’t touch the system as it works for us”.

The underlying message is:

“We would love to change to a modern approach, but we understand it will be a painful exercise, and how will it impact the profitability and execution of our company?”

The challenge these companies have is that it is extremely hard to imagine the potential to-be situation and how it is affected by the legacy. In a project I participated in several years ago, the company was migrating from a mainframe database towards a standard object-oriented (PLM) data model. The biggest pain was in mapping data towards the object-oriented data model. As the original mainframe database had all kinds of tables with flags and mixed Part & Document data, it was almost impossible to make a 100 % conversion. The other challenge was that knowledge of the old system had evaporated. The end result was a customized PLM data model, closer to current reality, still containing legacy “tricks” to assure compatibility.

All these enterprises have to go through such a painful exercise at a particular time. When is the best moment? When business is booming, nobody wants to slow down. When business is in a lower gear, costs and investments are minimized to keep the old engine running efficiently. I believe the latter is the best moment to invest in making the transition, if you believe your business will still exist 10 years from now.

Back to the data model.

Today, businesses should have a high-level object-oriented data model, describing the main information objects and their behavior in the organization. The term Master Data Management is related to this. How many companies have the time and skills to implement a future-oriented data model? And the data model must stay flexible for the future.

Compare it to your brain, which also stores information by its behavior; by learning, the brain understands what is logically related. The internal data model gets enriched while we learn.

Once you have a business data model, you are able to implement processes on top of it. Processes can change over time; therefore, avoid hard-coding specific processes in your enterprise systems. Like the brain, we can change our behavior (applying new processes) while it remains based on the data model stored inside our brain.
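
A minimal sketch of this idea, assuming a simple state-based lifecycle: the business objects stay stable while the process (the allowed transitions) lives in configuration that can change over time. All states and type names are illustrative:

```python
# Minimal sketch: the process (lifecycle transitions) is configuration on top
# of a stable data model, instead of being hard-coded per object type.
# States and transitions below are illustrative assumptions.

LIFECYCLES = {
    "Part":     {"Concept": ["In Work"], "In Work": ["Released"],
                 "Released": ["Obsolete"]},
    "Document": {"In Work": ["Under Review"],
                 "Under Review": ["Released", "In Work"],
                 "Released": ["Obsolete"]},
}

class BusinessObject:
    def __init__(self, obj_type: str, state: str):
        self.obj_type, self.state = obj_type, state

    def promote(self, new_state: str):
        allowed = LIFECYCLES[self.obj_type].get(self.state, [])
        if new_state not in allowed:
            raise ValueError(f"{self.obj_type}: {self.state} -> {new_state} not allowed")
        self.state = new_state

doc = BusinessObject("Document", "In Work")
doc.promote("Under Review")  # fine
# Changing the process later means editing LIFECYCLES, not the classes.
```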

Conclusion:

A lot of enterprise PLM implementations are in a challenging situation due to legacy or an incomplete understanding and availability of an enterprise data model. Therefore, cross-department implementations and connections to other systems are considered a battle between systems and their proprietary capabilities.


The future will be based on business platforms, and realizing this takes years; imagine the openness and usage of data standards needed. An interesting conference to attend in the near future for this purpose is the PDT2015 conference in Stockholm.

Meanwhile, I also learned that a one-day Master Data Management workshop will be held before the PDT2015 conference starts on the 12th of October. A good opportunity to deep-dive for three days!

In my series of blog posts related to the (PLM) data model, I talked about Product, BOMs and Parts. This time I want to focus on the relation between the EBOM and (CAD) Documents. This topic became relevant with the introduction of 3D CAD.

Before companies were using 3D CAD systems, there was no discussion about EBOM or MBOM (to my knowledge). Engineering was producing drawings for manufacturing, and not every company was using the mono-system (a specifying drawing for each individual part). Drawings were mainly made to assist production, and making a drawing for every individual part was considered a waste of engineering time. Parametric drawings were used to specify similar parts. But now we are in the world of 3D!

With the introduction of mainstream 3D CAD systems in the nineties (SolidWorks, Solid Edge, Inventor), there came a need for PDM systems managing the individual files from a CAD assembly. The PDM system was necessary to manage all the file versions. Companies that were designing simple products sometimes remained working file-based, introducing the complexity of how to name a file and how to deal with revisions. Ten years ago, I was investigating data management for the lower tiers of the automotive supply chain. At that time, still 60 % of the suppliers using CATIA were working file-based. Data management was considered an extra complexity, even though file version control was a big pain.

This has changed for several reasons:

  • More and more OEMs were pushing for more quality control of the design data (read: PDM)
  • Products became more modular, which means assemblies can be used as subassemblies in other products, pushing the need for where-used control
  • Products are becoming more complex, and managing only mechanical CAD files is not enough anymore; Electronics & Software (mechatronics) became part of the product

Most PDM systems at that time (I worked with SmarTeam) were saving the 3D CAD structure as a quantity-based document structure, closely resembling a structure called the EBOM.

[Figure: CAD DOC structure]

This is one of the most common mistakes made in PLM implementations.

The CAD structure does not represent the EBOM!!!

Implementers started to build all kinds of customizations to automatically create a Part structure, the EBOM, from the CAD structure. Usually, these customizations ended up as a mission impossible, in particular when customers started to ask for bidirectional synchronization. They expected that when a Part was removed from the EBOM, it would be deleted in the CAD assembly too.
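
A minimal sketch of why only the one-directional part works: deriving a quantity-based part list from a CAD assembly tree is straightforward, while the reverse direction (EBOM edits driving the CAD assembly) is where these customizations failed. The data structure below is an illustrative assumption, not a real CAD API:

```python
# Minimal sketch: deriving a quantity-based part list from a CAD assembly
# tree. One direction only; there is no way back from the list to the CAD.
from collections import Counter

# A tiny CAD assembly: (name, [children]) - an illustrative structure.
cad_assembly = ("housing-assy", [
    ("base-plate", []),
    ("cover", []),
    ("screw-M4", []), ("screw-M4", []), ("screw-M4", []), ("screw-M4", []),
])

def single_level_bom(assembly):
    """Quantity-based part list of one assembly's direct children."""
    _name, children = assembly
    return Counter(child[0] for child in children)

print(single_level_bom(cad_assembly))
# Counter({'screw-M4': 4, 'base-plate': 1, 'cover': 1})
```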

And then there was the issue that companies believed the CAD Part ID should be equal to the Part ID. This might be possible for a particular type of designed parts, but it does not work anymore with flexible parts, such as a tube or a spring. When such a Part is modeled in a different position, it creates a different CAD Document, breaking the one-to-one relation.

Finally, another common mistake that I have seen in many PDM implementations is the addition of glue, paint and other manufacturing types of parts to the CAD model, to be able to generate a BOM directly from the CAD.

From the data model perspective, it is more important to understand that Parts and CAD Documents are different types of objects, in particular if you want to build a PLM implementation where data is shared across all disciplines. For a PDM implementation, I care less about the data model, as the implementation is often not targeting enterprise continuity of data but only engineering needs.

A CAD Document (Assembly / Part / Drawing / …) behaves like a Document. It can be checked in and checked out any time a change is made inside the file. A check-in operation creates a new version of the CAD Document (in case you want to trace the history of changes).

Meanwhile, the Part specified by the CAD Document does not change version when the CAD Document is changed. Parts usually do not have versions; they remain in the same revision while the specifying CAD Document matures.
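
A minimal sketch of these two behaviors side by side, with illustrative names (not any specific PLM system's API):

```python
# Minimal sketch: a CAD Document iterates in versions via check-in/check-out,
# while the Part it specifies stays in the same revision and only matures.

class CadDocument:
    def __init__(self, doc_id: str):
        self.doc_id, self.version, self.checked_out = doc_id, 1, False

    def check_out(self):
        self.checked_out = True

    def check_in(self):
        self.checked_out = False
        self.version += 1          # every iteration remains traceable

class Part:
    def __init__(self, part_id: str):
        self.part_id, self.revision, self.maturity = part_id, "A", "In Work"
        self.specified_by = []     # relations to CAD Documents

doc, part = CadDocument("CAD-001"), Part("PT-001")
part.specified_by.append(doc)

for _ in range(3):                 # three design iterations
    doc.check_out(); doc.check_in()

print(doc.version, part.revision)  # 4 A - the Part revision did not move
```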

Moving from PDM to PLM

For a PLM implementation, it is important to think “Part-driven”, which means starting from an initial EBOM, representing the engineering specification of the Product, and maturing the EBOM with more and more design specification data. Design specification data can be mechanical assemblies and parts, but also electrical parts. The EBOM of a PCB might come from the Electrical Design Application, as in the mechanical model you will not create every component in 3D.

And once the Electrical components are part of the EBOM, the part definition of embedded software can be added to the BOM too, for example when software needs to be uploaded into flash memory chips. By adding electrical and software components to the EBOM, the company gets a full overview of the design maturity of ALL disciplines involved.

The diagram below shows what an EBOM and its related Documents could look like:

[Figure: EBOM.docs]

This data model contains a lot of details:

  • As discussed in my previous post, for the outside world (the customer) there is a Product defined without a revision
  • Related to the Product there is an EBOM (Part assembly), simplified as a housing (a mechanical assembly), a connector (a mechanical part) and a PCB (a mechanical representation). All these parts behave like Mechanical Parts; they have a revision and a status.
  • The PCB has a second representation based on an electrical schema, which has only (for simplification) two electrical parts, a resistor and a memory chip. As you can see, these components are standard purchasable parts; they do not have a revision as they are not designed.
  • The Electrical Part Flash Memory has a relation to a Software Part, which is defined by Object Code (a zip-file?), which of course is specified by a software specification (not in the diagram). The software object code has a version, as most of the time software is version-managed and does not follow the classical rules of mechanical design. (See the sketch after this list.)
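
To make the different behaviors in this diagram concrete, here is a minimal sketch; all identifiers and fields are illustrative assumptions:

```python
# Minimal sketch of a multidisciplinary EBOM line-up: designed mechanical
# parts carry a revision, standard purchased electrical parts do not, and
# the software part points at version-managed object code.
from dataclasses import dataclass

@dataclass
class MechanicalPart:
    part_id: str
    revision: str = "A"
    status: str = "In Work"

@dataclass
class ElectricalStandardPart:
    part_id: str                      # purchasable: no revision by design

@dataclass
class SoftwarePart:
    part_id: str
    object_code_version: str = "1.0"  # software follows its own versioning

ebom = [
    MechanicalPart("HOUSING-01"),
    MechanicalPart("CONNECTOR-01"),
    MechanicalPart("PCB-01"),         # mechanical representation of the PCB
    ElectricalStandardPart("RES-10K"),
    ElectricalStandardPart("FLASH-256"),
    SoftwarePart("FW-MAIN"),
]

for line in ebom:
    print(line)
```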

Again, I have reached my 1000 words, a sign to stop explaining this topic. For sure, there are a lot of details to explain about this part of the data model too.

Most important:

  • A CAD structure is not an EBOM (it can be used to generate a part of the EBOM)
  • CAD Documents and EBOM Parts have a different behavior. CAD Documents have versions; Parts do not have versions (most of the time)
  • The EBOM is the place where all disciplines synchronize their data, providing a single view of the design status during the development phase

Let me know if this was too abstract, and feel free to ask questions. Important for this series of blog posts is to provide a methodology baseline for a real PLM data model.

I am looking forward to your questions or remarks to spark up the discussion.


As described in my latest LinkedIn post, if you want to implement PLM successfully, there are two important points to address from the implementation point of view:

  • An explicit data model, not based on system or tool capabilities, but on the type of business the company is performing. There is a difference between an engineering-to-order company, a build-to-order company and a configure-to-order company.
  • In PLM (and business) it is all about enabling an efficient data flow through the organization. There is no ownership of data. It is about responsibilities for particular content per lifecycle stage, combined with sharing.

Historically, PLM implementations started with capturing the CAD data and related EBOM, as this is what the CAD-related PLM vendors were pushing for, and this was often the biggest pain for the engineering department. The disadvantage of this approach is that it strengthens silo thinking. The PLM system becomes an engineering tool instead of an enterprise system.

I believe that if you really want to implement PLM successfully in a company, you should start from a common product/part information backbone. This requires the right business objects and, therefore, the right data modeling. The methodology described below is valid for build-to-order and configure-to-order companies, and less applicable for engineering-to-order.

[Figure: BusinessModels]

In a build-to-order company there are the following primary information objects:

  • A Product (representing the customer view of what is sold to the outside world)
  • An EBOM (representing a composition of Parts specifying the Product at a particular time)
  • An MBOM (representing the manufacturing composition of the Product at a given time)

And, of course, there are related Documents for all these information objects. Documents come in various types, and when you work in a more advanced manner, the specification document can be the source for individually extracted requirements (not in this post).
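
A minimal sketch of these primary information objects and their related Documents; all field names are illustrative assumptions:

```python
# Minimal sketch of the primary information objects in a build-to-order
# company: Product, EBOM, MBOM, and related Documents.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Document:
    doc_id: str
    doc_type: str                     # e.g. "Specification", "Drawing"

@dataclass
class EBOM:
    bom_id: str                       # engineering composition at a given time
    lines: list = field(default_factory=list)

@dataclass
class MBOM:
    bom_id: str                       # manufacturing composition at a given time
    lines: list = field(default_factory=list)

@dataclass
class Product:
    product_id: str                   # the customer view: no revision
    ebom: Optional[EBOM] = None
    mbom: Optional[MBOM] = None
    documents: list = field(default_factory=list)

rfp_product = Product("P-000042",
                      documents=[Document("D-000007", "Specification")])
print(rfp_product.product_id, rfp_product.documents[0].doc_id)
```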

Let's follow an end-to-end scenario from a typical build-to-order company process.

Quoting phase

A potential customer sends an RFP for a product they need. The customer RFP contains information about how the product should behave (Specification / Requirements) and how it should be delivered (packaging). A basic data model for this RFP would be:

[Figure: DataModel-1]

Note the following details:

  • All information objects have a meaningless number. The number is only there to support unique identification and later integration with other systems. The meaning should come from the other attribute data on the object and its relations. (A blog post on its own.)
  • The Product can, instead of the meaningless number, carry the number provided by the customer. However, if this number is not unique to the company, it might be just another attribute of the product.
  • In general, Products do not have revisions. In time, there might be other BOMs related to the product. Not in this post: products might have versions and variants, and products might be part of a product family. In this case, I used a classification to define a classification code for the product, allowing the company to discover similar products from different customers. This promotes reuse of solutions and reuse of lessons learned.
  • The Customer object represents the customer entity, and by implementing it as a separate object, you will be able to see all information related to this customer quickly. This could be Products (ordered / in RFQ / etc.) but also other relevant information (Documents, Parts, …)
  • The initial conceptual BOM for the customer consists of two sub-BOMs. As the customer wants the products to be delivered in a 6-pack, a standard 6-pack EBOM is used. Note: its status is Released, and a new conceptual EBOM is defined as a placeholder for the BOM definition of the Product to design/deliver.
  • And for all the Parts in the conceptual EBOM there can be relations towards one or more Documents. Usually, there is one specifying document (the CAD model) and multiple derived documents (Drawings, Illustrations, …)
  • Parts can have a revision in case the company wants to trace the evolution of a Part. Usually, when Form-Fit-Function remains the same, we speak about a revision. Otherwise, the change requires a new part number. As more and more of the managed information no longer exists on the part number itself, companies might want to use a new part number at any change, storing in an attribute what its predecessor was.
  • Documents have versions and revisions. While people work on a document, every check-in / check-out moment can create a new version of the file(s), providing traceability between versions. Most of the time, at the end there will be a first released version, which is related to the Part it specifies.
  • Do not try to have the same ID and Revision for Parts and Documents. In the good old days of 2D drawings this worked; in the world of 3D CAD it is not sustainable, and it leads to complexity for the user. Preferably, the Part and the specifying Document should have different IDs and a different revision mechanism. (See the sketch after this list.)
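
A minimal sketch of these identification rules; the prefixes, counters and attribute names are illustrative assumptions:

```python
# Minimal sketch: meaningless generated numbers, the customer number as just
# another attribute, and separate ID/revision mechanisms per object type.
import itertools

_counters = {"PR": itertools.count(100),
             "PT": itertools.count(100),
             "DOC": itertools.count(100)}

def next_id(prefix: str) -> str:
    """A meaningless, unique identifier - meaning lives in the attributes."""
    return f"{prefix}{next(_counters[prefix]):06d}"

product = {
    "id": next_id("PR"),
    "customer_number": "CUST-4711",   # not unique company-wide -> attribute
    "classification": "PUMP-6PACK",   # enables discovery/reuse across customers
}
part = {"id": next_id("PT"), "revision": "A"}                     # revisions only
document = {"id": next_id("DOC"), "revision": "A", "version": 3}  # versions too

print(product["id"], part["id"], document["id"])  # three independent ID schemes
```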

And the iterations go on:

Now let's look at the final stage of the RFQ process. The customer has requested to deliver the same product also in a single (luxury) packaging, as this product will be used for service. Although it is exactly the same physical product to produce, the product ID should be different. If the customer wants unambiguous communication, they should also use a different product ID when ordering the product for service or for manufacturing. The data model for this situation will look as follows (assuming the definitions are done):

[Figure: DataModel-2]

Note the following details:

  • The Part in the middle (with the red shadow), PT000123, represents the same part for both the product ordered for manufacturing and the product ordered for service, making use of a single definition for both situations (see the sketch after this list)
  • The Part in the middle now has a large set of related documentation: not only CAD data but also test information (how to test the product), compliance information and more
  • The Part in the middle also has a deeper EBOM structure of its own, which we will explore in an upcoming post
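
A minimal sketch of this shared-part construction, including a simple where-used lookup; apart from PT000123, the identifiers are illustrative assumptions:

```python
# Minimal sketch: one part definition referenced by both product EBOMs,
# plus a where-used query across those EBOMs.

part_definition = {"id": "PT000123", "revision": "A"}

eboms = {
    "PR-MANUFACTURING": ["PT000123", "PT000124"],   # the 6-pack product
    "PR-SERVICE":       ["PT000123", "PT000125"],   # the luxury single pack
}

def where_used(part_id: str):
    """All products whose EBOM references this part."""
    return [prod for prod, lines in eboms.items() if part_id in lines]

print(where_used("PT000123"))  # ['PR-MANUFACTURING', 'PR-SERVICE']
```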

I have reached my 1000 words and do not want to write a book, so I will conclude this post. For experienced PLM implementers this is probably known information; for people entering the domain of PLM, either as a new student or coming from a more CAD/PDM background, an interesting topic to follow. In the next post, I will continue towards the MBOM and ERP.

Let me know if this post is useful for you, and of course, enhancements or clarifications are always welcome. Note: some of the functionality might not be possible in every PLM system, depending on its origin and core data model.
