
Sorry guys, I am aware of the fact that the definition of PLM is very ambiguous. Every vendor, implementer and probably every PLM consultant has a favorite definition. Just to illustrate this statement, read Brian Soaper´s recent post: What are the top 5 things to know about PLM ?

Interestingly, Brian starts by stating that the definition of PLM is priority #1; however, as you can see from the comment section, it is all about having a common definition of PLM inside your company.

And now I start writing about digital PLM, again a definition. You might have read in my blog about classical PLM and modern PLM.

Classical PLM

For me, classical PLM is the way PLM has been implemented in the past 15 years, often as an extension of engineering with the purpose of centralizing and sharing information.

In particular for CAD data, classical PLM focuses on managing files in a controlled way, through check-in and check-out mechanisms. On top of file management, classical PLM provides more data-driven functionality, like project management, process governance (workflows / approvals / ECx processes) and BOM management (to link to ERP).

Classical PLM can still bring great benefits to a company, as time spent searching, paper-based processes and data retyping in ERP can be avoided, leading to reuse and fewer errors. The ROI time for a classical PLM implementation lies between two and three years, based on my observations from the past. This time can still vary a lot, as not every company or implementer/vendor uses the ideal approach to implement PLM, due to cultural issues, wrong expectations or lack of experience on both sides.

The connotations I have with classical PLM are:
linear, rigid, mechanical, (old) automotive, previous century.

Modern PLM = Digital PLM

Modern PLM is based on the vision that all information should be managed and stored as data objects, not necessarily in a single system. Still, the PLM infrastructure, using structured and unstructured data, should give each user in the organization almost real-time information in the context of other relevant information.

My non-stop blog buddy Oleg recently wrote a post in that context: Data as a platform & future manufacturing intelligence. Oleg nicely describes some of the benefits of a data-driven approach.

Accenture provides insight with their infographic related to Digital PLM. Read it here, as it is very concise and gives you a quick impression of what Digital PLM means for an organization. Here is my favorite part, showing the advantages.

[Image: Accenture Digital PLM infographic – advantages]

The substantial advantages of digital PLM all come from the fact that information is stored as data objects, each with its individual versions, relations and status. The advantage of data elements is that they are not locked in a document or specific file format. Information can flow to wherever or whomever it is needed, without translation.
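To make the idea of "information as data objects" a bit more tangible, here is a minimal, hypothetical sketch in Python. The class and attribute names are my own illustration, not the data model of any particular PLM system: each element carries its own identity, version, status and relations, independent of any file format.

```python
from dataclasses import dataclass, field

@dataclass
class DataObject:
    """One piece of product information managed as data, not as a file."""
    obj_id: str                     # unique, meaningless identifier
    obj_type: str                   # e.g. "Requirement", "Part", "TestResult"
    version: int = 1
    status: str = "In Work"         # lifecycle status, e.g. "In Work", "Released"
    attributes: dict = field(default_factory=dict)
    relations: list = field(default_factory=list)  # IDs of related objects

# Because the content lives in attributes and relations, not inside a file,
# any consumer (a dashboard, ERP, MES) can read it in context without translation.
req = DataObject("OBJ-001", "Requirement", status="Released",
                 attributes={"text": "Max operating temperature 85 C"})
part = DataObject("OBJ-002", "Part",
                  attributes={"name": "Controller housing"},
                  relations=[req.obj_id])
```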

The connotations I have with digital PLM are:
real-time, data continuity, flexible, software and future.

 

Still some caution:

Reported ROI numbers for digital PLM are significantly larger than for classical PLM, and I have observed some facets of that. Digital PLM is not yet established and requires a different type of workforce. See another blog post I wrote on this theme: Modern PLM brings Power to the People.

But what about digital PLM – where is the word digital relevant ?

ETO – model-based engineering

Where to focus first depends very much on your company´s core business process. Companies with an Engineering To Order (ETO) process will focus on delivering a single product to their customer, and most of the time the product is becoming more of a system, interacting with the outside world.

Big challenges in ETO are to deliver the product as required and to coordinate all disciplines, preferably in a parallel and real-time manner – on time – on budget. Here a virtual model that can be accessed and shared by all stakeholders should be the core. The construction industry is introducing BIM for this purpose (a modern version of DMU). The virtual model allows the company to measure progress and to analyze and simulate alternatives without spending money on prototypes. In the ideal world engineering and simulation are done on the same model, not losing time and quality on data translations and iterations.

The virtual model linked to requirements, functions and the logical definition allows virtual testing – so much cheaper and faster, and therefore cost efficient. Of course this approach requires a change in how people work together, which is characteristic for any digital business. Break down the silos.

Typical industries using the ETO model: Construction, Energy, Offshore, Shipbuilding, Special Equipment

 

CTO – model-based manufacturing

In a Configure To Order (CTO) business model you do not spend time on engineering anymore. All options and variants are defined, and now the focus is on efficient manufacturing. The trend for CTO companies is that they have to deliver more and more variants in a faster and more demanding global market. Here the connectivity between engineering data and manufacturing data becomes one of the cornerstones of digital PLM. Digital PLM needs to make sure that all relevant data for execution (ERP and MES) flows through the organization without reformatting or reworking the data.

The digital thread is the dream, and Industry 4.0 is focusing on this part. Also in the CTO environment it is crucial to work with a product model, so all downstream disciplines can consume the right data. Although in CTO the company´s attention might go to MES and ERP, it is crucial that the source of the product model is well specified and under control of (digital) PLM.

Typical CTO industries are: Automotive, Consumer Goods, High-Tech, Industrial Equipment

BTO – models everywhere

If your company has a Build To Order main delivery process, the optimum for digital PLM lies in the middle of ETO and CTO, depending on the type of products your company delivers.

In BTO there is always engineering to do. It can be customer-specific engineering work (done only once) or it can be changing/adding new features to the product.

Modularity of the product portfolio might be the answer for the first option, whereas the second option requires strong configuration management on the engineering side, similar to the ETO model. Although the dream of many BTO companies is to change into a CTO company, I strongly believe changes in technology and market requirements will always be faster than the product portfolio definition.

ETO, BTO and CTO are classical linear business models. The digital enterprise is changing these models too. Customer interaction (myProduct), continuous upgrade and feedback of products (virtual twin), and different business models (performance as a service) will all challenge organizations to reconsider their processes.

Digital PLM utilizing a model-based or model-driven backbone will be the (potential) future for companies, as data can flow through the organization instead of being locked in documents and classical processes. In my upcoming blog post I will spend some more time on the model-based enterprise.

Conclusion:
Where a model-based enterprise supported by (digital) PLM brings the most benefit depends on your company´s core business process. In parallel, business models are changing, which means the future must be flexible.

Digital PLM should be one of your company´s main initiatives in the next 5 years if you want to stay competitive (or relevant).

 

What do you think ? Am I too optimistic or too pessimistic ?


As described in my latest LinkedIn post, if you want to implement PLM successfully there are two important points to address from the implementation point of view:

  • An explicit data model not based on system or tool capabilities, but on the type of business the company is performing. There is a difference between an engineering-to-order company, a build-to-order company and a configure-to-order company.
  • In PLM (and business) it is all about enabling an efficient data flow through the organization. There is no ownership of data. It is about responsibilities for particular content per lifecycle stage, combined with sharing.

Historically, PLM implementations started with capturing the CAD data and the related EBOM, as this is what the CAD-related PLM vendors were pushing for and this was often the biggest pain for the engineering department. The disadvantage of this approach is that it strengthens silo thinking. The PLM system becomes an engineering tool instead of an enterprise system.

I believe that if you really want to implement PLM successfully in a company, you should start from a common product/part information backbone. This requires the right business objects and, therefore, the right data modeling. The methodology described below is valid for build-to-order and configure-to-order companies, and less applicable for engineering-to-order.

[Image: Business models]

In a build to order company there are the following primary information objects:

  • A Product (representing the customer view of what is sold to the outside world)
  • An EBOM (representing a composition of Parts specifying the Product at a particular time)
  • An MBOM (representing the manufacturing composition of the Product at a given time)

And, of course, for all these information objects there are related Documents of various types. When working in a more advanced way, the specification document can be the source for individually extracted requirements (not in this post).
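As a simple illustration of these primary information objects, here is a hedged Python sketch; the class names, attributes and identifiers are my own and not tied to any specific PLM system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Document:
    doc_id: str
    doc_type: str                     # e.g. "Specification", "CAD Model", "Drawing"
    revision: str = "A"

@dataclass
class BOMLine:
    part_id: str
    quantity: int = 1

@dataclass
class EBOM:
    bom_id: str
    status: str = "Conceptual"        # e.g. "Conceptual", "Released"
    lines: List[BOMLine] = field(default_factory=list)

@dataclass
class MBOM:
    bom_id: str
    lines: List[BOMLine] = field(default_factory=list)

@dataclass
class Product:
    product_id: str                   # the customer view of what is sold
    ebom: Optional[EBOM] = None       # composition of Parts at a particular time
    mbom: Optional[MBOM] = None       # manufacturing composition at a given time
    documents: List[Document] = field(default_factory=list)

# A product in the quoting phase: a customer specification plus a conceptual EBOM
# (all numbers below are hypothetical, meaningless identifiers).
spec = Document("DOC000001", "Specification")
quote = Product("PR000001",
                ebom=EBOM("BOM000001", lines=[BOMLine("PT000001", 6)]),
                documents=[spec])
```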

Let´s follow an End to End scenario from a typical Build to Order company process.

Quoting phase

A potential customer sends an RFP for a product they need. The customer RFP contains information about how the product should behave (Specification / Requirements) and how it should be delivered (packaging). A basic data model for this RFP would be:

[Image: DataModel-1]

Note the following details:

  • All information objects have a meaningless number. The number is only there to support unique identification and later integration with other systems. The meaning should come from the other attribute data on the object and its relations. (A blog post on its own)
  • Instead of the meaningless number, the Product can carry the number provided by the customer. However, if this number is not unique within the company, it might be just another attribute of the product.
  • In general, Products do not have revisions. In time, there might be other BOMs related to the product. Not covered in this post: products might have versions and variants, and products might be part of a product family. In this case, I used a classification to define a classification code for the product, allowing the company to discover similar products from different customers. This promotes reuse of solutions and reuse of lessons learned.
  • The customer object represents the customer entity and by implementing it as a separate object, you will be able to see all information related to this customer quickly. This could be Products (ordered / in RFQ / etc.) but also other relevant information (Documents, Parts, …)
  • The initial conceptual BOM for the customer consists of two sub-BOMs. As the customer wants the products to be delivered in a 6-pack, a standard 6-pack EBOM is used. Note: the Status is Released and a new conceptual EBOM is defined as a placeholder for the BOM definition of the Product to design/deliver.
  • And for all the Parts in the conceptual EBOM there can be relations towards one or more documents. Usually, there is one specifying document (the CAD model) and multiple derived documents (Drawings, Illustrations, …)
  • Parts can have a revision in case the company wants to trace the evolution of a Part. Usually, when Form-Fit-Function remains the same, we speak about a revision; otherwise, the change means a new part number. As more and more of the managed information no longer hangs on the part number, companies might want to use a new part number for any change, storing in an attribute what its predecessor was.
  • Documents have versions and revisions. While people work on a document, every check-in / check-out moment can create a new version of the file(s), providing traceability between versions. Most of the time, at the end there will be a first released version, which is related to the part it specifies.
  • Do not try to have the same ID and Revision for Parts and Documents. In the good old days of 2D drawings this worked; in the world of 3D CAD it is not sustainable and leads to complexity for the user. Preferably, the Part and the specifying Document should have different IDs and different revision mechanisms (see the sketch after this list).
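To underline the last bullets, here is a minimal sketch (hypothetical Python, my own naming) of Parts and Documents as separate objects, each with its own meaningless identifier and its own revision/version mechanism; only the part number PT000123 is taken from the example above.

```python
import itertools
from dataclasses import dataclass, field
from typing import List, Optional

# Meaningless, system-generated numbers: they only guarantee uniqueness,
# the meaning comes from attributes and relations.
_part_numbers = itertools.count(123)
_doc_numbers = itertools.count(456)

@dataclass
class Document:
    doc_id: str
    revision: str = "A"                 # released revisions: A, B, C, ...
    version: int = 1                    # a new working version at every check-in

    def check_in(self) -> None:
        self.version += 1               # traceability between working versions

@dataclass
class Part:
    part_id: str
    revision: str = "A"                 # revised only while Form-Fit-Function stays the same
    predecessor: Optional[str] = None   # attribute pointing to the replaced part number
    documents: List[Document] = field(default_factory=list)

def new_part() -> Part:
    return Part(part_id=f"PT{next(_part_numbers):06d}")

def new_document() -> Document:
    return Document(doc_id=f"DOC{next(_doc_numbers):06d}")

# The Part and its specifying CAD model have different IDs and evolve independently.
housing = new_part()          # PT000123
cad_model = new_document()    # DOC000456 (hypothetical)
housing.documents.append(cad_model)
cad_model.check_in()          # new working version of the CAD file, the Part is untouched
```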

And the iterations go on:

Now let´s look at the final stage of the RFQ process. The customer has requested to deliver the same product also in single (luxury) packaging, as this product will be used for service. Although it is exactly the same physical product to produce, the product ID should be different, and if the customer wants unambiguous communication, they should also use a different product ID when ordering the product for service or for manufacturing. The data model for this situation will look as follows (assuming the definitions are done):

[Image: DataModel-2]

Note the following details:

  • The Part in the middle (with the red shadow) – PT000123 – represents the same part for both the product ordered for manufacturing and the product ordered for service, making use of a single definition for both situations (sketched after this list)
  • The Part in the middle now has a large set of related documentation: not only CAD data but also test information (how to test the product), compliance information and more.
  • The Part in the middle on its own also has a deeper EBOM structure which we will explore in an upcoming post.
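A minimal way to picture this, assuming simple dictionary structures and hypothetical product numbers (only PT000123 comes from the figure above): the same part object is referenced from both product structures, so there is only one definition to maintain.

```python
# One shared part definition, referenced by two different product structures:
# the product ordered for manufacturing (6-pack) and the one for service (single pack).
shared_part = {"id": "PT000123",
               "documents": ["CAD model", "test instructions", "compliance statement"]}

product_manufacturing = {"id": "PR000010",          # hypothetical product numbers
                         "ebom": [{"part": shared_part, "qty": 6}]}

product_service = {"id": "PR000011",
                   "ebom": [{"part": shared_part, "qty": 1}]}

# Both EBOM lines point at the identical object, so a change to PT000123 is
# immediately visible in the manufacturing product and in the service product.
assert product_manufacturing["ebom"][0]["part"] is product_service["ebom"][0]["part"]
```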

I have reached my 1000 words and do not want to write a book, so I will conclude this post. For experienced PLM implementers this is probably known information. For people entering the domain of PLM, either as a new student or coming from a more CAD/PDM background, it is an interesting topic to follow. In the next post, I will continue towards the MBOM and ERP.

Let me know if this post is useful for you – and of course – enhancements or clarifications are always welcome. Note: some of the functionality might not be possible in every PLM system, depending on its origin and core data model.

In my previous post, BOM for Dummies related to Configure To Order, I promised to come back on the special relation between the items in the BOM and the CAD data. I noticed from several posts in PLM and PDM groups that the importance of CAD data is also perceived in a different manner, depending on the background of the people or the systems they are experienced with.

So I would like to start with some general statements based on these observations.

People who talk about the importance of CAD data and product structures usually come from a background in PDM. In an environment where products are designed, the focus is on data creation, mostly CAD data. The language around parts in the BOM mostly targets design parts. So in a PDM environment CAD data is an important topic – therefore PDM people and companies will talk about CAD data and vaults as the center of information.

[Image: ERP BOM]

When you are working in a PLM environment, you need a way to communicate around a product, through its whole lifecycle, not only the design phase but also supporting manufacturing phases, the possible changes of an existing product through engineering changes, the traceability of as-built data and more. In a PLM environment, people have the physical part (often called the ERP part) in mind, when they talk about a part number.

As PLM covers product information across various departments and disciplines, the information carrier for product information cannot be the CAD data. The BOM, usually the mBOM, is the main structure used to represent and produce the product. Most parts in the mBOM have a relation to a CAD document (in many companies still the 2D drawing). Therefore PLM people and companies that understand PLM will talk about items and products and their lifecycle as their center of information.

CAD data in relation to Engineering to Order

The above generalizations have to be combined with the different main business processes. In a strict Engineering To Order environment, where you design and build a solution only once for a specific customer, there is no big benefit in going through an eBOM and mBOM transition.

During the design process the engineer already has manufacturing in mind, which will be reflected in the CAD structure they build – sometimes a hybrid, representing both engineering and manufacturing items. In such an environment the CAD data leads when building a BOM structure.

And in cases where engineering is done in one single 3D CAD system, the company might use the PDM system from this vendor to manage their Bill of Materials. The advantage of this approach is that PDM is smoothly integrated with the design environment. However, it restricts the future to a certain extent, as we will see further on.

Not everyone needs the Engineering to Order process !

Moving to an integrated, multi-disciplinary engineering process, or changing the main process from Engineering To Order to Build To Order / Configure To Order, will cause major challenges in the company.

I have seen in the recent past several companies that would like to change their way of working from a CAD-centric Engineering To Order process towards a more Build to Order or Configure To Order process. The bottleneck in making this switch was every time that engineering people think in CAD structures and all knowledge is embedded in the CAD data; they then want to configure their products in the CAD system.

For Configure to Order you have to look at your CAD data in a different way.

Questions to ask yourself as a company are:

  • When I configure my products around a CAD structure, what should I do with data from other disciplines (Electrical/Tooling/Supplier data) ?
  • When I upgrade my 3D CAD system to a new version, do I need to convert all old CAD data to the newest versions in order to keep my configurations alive?
  • When configuring a new customer solution, do I need to build my whole product in CAD in order to assure it is complete?
  • In Configure to Order the engineering BOM and manufacturing BOM are different. Does this mean that when I go through a new customer order, all CAD data needs to be handled, going through the eBOM to mBOM transition again?

For me it is obvious that only in an Engineering to Order environment the CAD data is leading for order fulfillment. In all other typical processes – BTO (Build to Order), CTO (Configure to Order) and MTS (Make to Stock) – product configuration and definition is done around items, and the CAD data is important associated data for the product definition and manufacturing.

In the case of order fulfillment in a Configure to Order process, the CAD structure is not touched, as the configuration of the product is available based on items. Each item in the mBOM has its relations to CAD data or other specifying information.

In the case of Build To Order, a huge part of the product is already configured, like in Configure To Order. Only new interfaces or functionality will go through a CAD design process. This new design might be released through a process with an eBOM to mBOM transition. In cases where the impact or the amount of data created in engineering is not huge, it is even possible to configure the changes immediately in an mBOM environment.

A second point, which is also under a lot of discussion in the field (PLM interest groups), is that PDM is easy to introduce as a departmental solution. The engineering BOM is forwarded to manufacturing and further processed there (disconnected). The step from PDM to PLM is always a business change.

When PDM vendors talk about ERP integration, they often mean the technical solution of connecting the two systems, not integrating the processes around the BOM (eBOM/mBOM transition) or an integrated engineering change (ECR/ECO). See how easy it is according to some PDM vendors:

PLM requires an adaptation of all departments to work differently and together around a single product definition. Especially in a mid-market company this is a big issue, as all product knowledge is stored in the CAD data and the knowledge of how to produce the product is stored in the mBOM on the ERP side. These environments are often disconnected.
Conclusion: In the context of PDM the importance of CAD data is clear, and for companies following a strict Engineering To Order process it is the main source of product knowledge. Companies following the Build To Order / Configure To Order process should configure their products around items to keep flexibility towards the future.

Companies with the intention to move to Build To Order or Configure To Order should not invest too much in CAD data configuration, as it creates a roadblock for the future.

In my next post I will address the question that comes up from many directions, raised by Jim Brown and others, as discussed in one of his recent posts around a PLM standard definition and more…

This is the third post on Bill of Material handling for different types of companies, this time with the focus on Configure To Order (CTO). In the CTO process, products are assembled and configured based on customer requirements. This means no more engineering is needed once the customer requirements are known. CTO examples are the ordering process of a car with all its options, or ordering a personal computer over the internet.

So what has Configure To Order to do with PLM as there is no engineering?

The main PLM activity takes place when designing the configurable product. Designing a product that is configurable requires a completely different approach compared to Engineering to Order or Build to Order. We see a similar Configure to Order activity in the R&D departments of companies that follow the Build to Order process: they are also designing products or modules that can be used as-is in customer-specific orders as part of the solution.

The challenge of CTO is to design products that are modular, and where options and variants are designed on a common platform with common interfaces. If you look at the dashboard of a car you will see placeholders for additional options (in case you have the minimal car version), and you might also see that, for example, the radio display in a basic car version differs from the complete board computer in the luxury version. The common platform is one dashboard, fitting numerous options.

An engineering department will not focus on designing and defining each of the possible combinations of options as this would be impossible to manage. What can be managed is the common platform (the baseline) and all different options on top of this baseline.

So what happens with the BOM?

The initial design of configurable products goes through similar steps as the BTO process, which means starting from a conceptual BOM, moving to an Engineering BOM (eBOM) and finally producing a BOM for manufacturing (mBOM). The difference is that in the CTO process the mBOM is not developed for just one product, but contains all definitions for all possible products. In this situation we talk about a generic mBOM.

Only when a customer order exists is the generic mBOM resolved into a specific mBOM for this customer order, which can then be sent to the ERP system for execution.

In a generic BOM the relations are managed by filters. These filters define the effectivity of the link – in simple words, whether the relation between two parts in the BOM is valid (and shown) or not. There are various ways to define effectivity, again with a differentiation in usage:

  • revision based effectivity – which means the relation between two items is valid in case the revisions match
  • date effectivity – which means the relation is valid during a certain time interval

Both methods are used most of the time for non-configurable products. Revision and date effectivity are used to track the product history through time and therefore to have full traceability. But this does not work if you want to configure a customer-specific order every time.

In that case we use unit or option based filtering.

  • unit effectivity – which means the relation between two items is valid for a unit (or a range of units) produced, for example a batch of products or a unique product with a serial number
  • option effectivity – which means the relation between two items is valid in case a certain condition is valid. Which condition depends on the configuration rules for this option. Examples of options are: color, version, country (see the sketch after this list)
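To illustrate how such filters can resolve a generic mBOM into a specific mBOM for one customer order, here is a hedged Python sketch. The rule format, option codes and part names are my own illustration; real PLM systems offer far richer effectivity engines.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class BOMLink:
    parent: str
    child: str
    is_effective: Callable[[dict], bool]   # effectivity filter on the relation

# Hypothetical generic mBOM for a configurable car: a common platform plus
# option-dependent and unit-dependent links.
generic_mbom: List[BOMLink] = [
    BOMLink("CAR", "DASHBOARD",       lambda order: True),                       # always valid
    BOMLink("CAR", "BASIC_RADIO",     lambda order: order["trim"] == "basic"),   # option effectivity
    BOMLink("CAR", "BOARD_COMPUTER",  lambda order: order["trim"] == "luxury"),  # option effectivity
    BOMLink("CAR", "WINTER_PACK",     lambda order: order["country"] in ("NO", "SE")),
    BOMLink("CAR", "REVISED_BRACKET", lambda order: order["unit"] >= 500),       # unit effectivity
]

def resolve(generic: List[BOMLink], order: dict) -> List[str]:
    """Resolve the generic mBOM into a specific mBOM for one customer order."""
    return [link.child for link in generic if link.is_effective(order)]

# A specific order: luxury trim, delivered to Norway, unit number 512.
order = {"trim": "luxury", "country": "NO", "unit": 512}
print(resolve(generic_mbom, order))
# -> ['DASHBOARD', 'BOARD_COMPUTER', 'WINTER_PACK', 'REVISED_BRACKET']
```

The resolved list is what would be handed over to ERP for execution, while the generic mBOM remains the single, non-redundant definition in PLM.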

It is clear that unit and option based filtering of a BOM can lead to a conceptually complex product definition, which goes beyond the BOM for Dummies target. Below is an illustration of the various filter concepts (oops, the animated gif does not work – I will investigate):

[Image: CTO filter concepts]

The benefit of this filtering approach is that there is a minimum of redundant data to manage, which makes it common practice in the aerospace and automotive industries. An example describing all the complexity can be found here, but I am sure at this level there are enough publications and studies available.

And what about the CAD ?

I will write a separate post on this topic, as all the possible interactions and use cases with CAD are a topic on their own. You can imagine that having the 3D virtual world combined with a configurable BOM brings a lot of benefits.

What PLM functions are required to support Configure to Order ?

  • Project management – not so much focus here, as the delivery project for a customer does not require much customer interaction. Of course, the product development process requires advanced capabilities, which I will address in a future post.
  • Document management – same approach as for project management. The product-related documentation needs to exist and be secured. Customer-specific documentation can often be generated automatically.
  • Product Management – managing all released and available components for a solution, related to their Bill of Materials. Often part of product management is the classification of product families and their related modules.
  • Item management – the main activities here are in the mBOM area: capabilities for BOM generation (eBOM/mBOM), baseline and compare using filtering (unit based / option based) in order to support the definition of the manufactured product.
  • Workflow processes – as we are dealing with standardized components in the BOM, the Engineering Change Request (ECR) and Engineering Change Order (ECO) processes will be the core for changes. And as we want a controlled manufacturing definition, the Manufacturer Change Order process and Standard Item Approval process are often implemented.

Optional:

  • Requirements Management – especially for complex products, tracking individual requirements and their implementation can save time and costs during delivery, to understand and handle the complex platform.
  • Service Management – an extension of item management. When a customer-specific order has been delivered, it might still be interesting for the company that delivered the product to keep traceability of the customer configuration for service options – managing the Service and As-Built BOM.
  • Product Configurator – the reason I list it as optional is that the target is order execution, which is not a PLM role anymore. The ERP system should be able to resolve the full mBOM for an order. The PLM product configuration definition is done through Product and Item management. Depending on the customer environment, the role of configurator might be found in PLM in case ERP does not have the adequate tools.

Conclusion:

It is hard to describe the Configure To Order process in the scope of BOM for Dummies. As various detailed concepts exist per industry, there is no generic standard. This is often the area where the PLM system, the PLM users and the implementers are challenged the most: to make it workable, understandable and maintainable.

Next time, some industry-specific observations for a change.
