
Last time in the series Learning from the past to understand the future, we zoomed in on how the 3D CAD-structure in the mid-market had to evolve. In a typical Engineering To Order (ETO) scenario, it makes sense to extract from the 3D CAD-structure a BOM-structure to collect all the individual parts that are needed for manufacturing. Combined with the drawings generated based on the 3D CAD assemblies/parts, the complete manufacturing information could be provided. Let’s have a look.

The BOM in ERP (part 1)

To understand what most mid-market companies have been doing, I created the image below. When you click on it, you will have an enlarged version.

Note: for educational purposes an extremely simplified example

There is a lot to explain here.

First, on the right, we see the 3D CAD assembly with two phantom assemblies grouping the wheels and the axles, and at the lowest level the individual parts, i.e., chassis, axle, and wheel. The 3D CAD-structure is instance-based; therefore, there are no quantities in the structure (every instance has quantity 1).

For the individual parts, there are drawings. Also, for the product, we have an assembly drawing. The drawings are essential as we want to have them in the ERP-system for manufacturing.

Finally, there are the physical parts, now with a different ID than the drawing, as we learned that this one-to-one relation created a lot of extra work. The physical parts are often called Items or Materials (SAP naming). Unfortunately, for engineering, Materials has a different meaning. Still, SAP’s data model was not built with an engineering mindset.

The physical part structure, which we call the BOM, contains quantities. Most PDM-CAD integrations can filter out phantom assemblies and summarize the parts on the same level.

I am still reluctant to call the Part-structure an EBOM, as the design of the product has mainly been focused on extracting manufacturing information: parts and drawings.

The BOM in ERP (part 2)

In customized PDM-implementations, some implementers created an interface from the BOM-structure to ERP, so the ERP-system would have the basic definition of the parts and a copy of the relevant drawings.

Now manufacturing could create the manufacturing definition without the need to go into the PDM-system.

Some “clever” implementers – Dick Bourke would say “smart – therefore lazy” – proposed to “draw” manufacturing entities in the 3D CAD-structure too, so the PDM-CAD-interface would automatically deliver the manufacturing parts inside the ERP as well. In the example below, we added paint for the body and grease needed for the axles.

Although “smart”, a new problem was introduced here – the 3D CAD-structure, being instance-based, always has quantity 1 per instance. The extracted BOM would only contain whole numbers for the design parts. Now the grease comes with an estimate of 0.025 kg, assuming quantities are based on SI-units. We could also add other manufacturing information to this BOM, like 0.3 liter of paint. Anyway, the result would look like the image below:

Important to notice from the diagram here: There are placeholders for grease and paint “drawn” in the 3D CAD-structure – parts without a geometrical definition and, therefore, not having an associated drawing. However, these parts have a material specification, and therefore in the BOM-structure, they appear as Materials.

Next, in the BOM-structure, the engineers would enter the expected/required quantity – which is no longer a whole number.

At this stage, you cannot call the BOM on the left an EBOM. It is a kind of hybrid structure, combining engineering and manufacturing data – a type of BOM we encounter a lot in companies that started with an ETO-product.
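To make the hybrid structure tangible, here is a minimal sketch in Python of what such an extracted BOM could look like. The item numbers, units and values are my own illustrative assumptions, not taken from the diagram; the point is only that design parts come with whole quantities and a drawing, while the added manufacturing materials have estimated, non-whole quantities and no drawing at all.

```python
# A sketch of the hybrid BOM described above (illustrative numbers/IDs only):
# design parts extracted from the 3D CAD-structure mixed with manufacturing
# materials that have no geometry and therefore no associated drawing.
hybrid_bom = [
    {"item": "100001", "description": "Chassis", "qty": 1,     "uom": "pcs", "drawing": "D-0001"},
    {"item": "100002", "description": "Axle",    "qty": 2,     "uom": "pcs", "drawing": "D-0002"},
    {"item": "100003", "description": "Wheel",   "qty": 4,     "uom": "pcs", "drawing": "D-0003"},
    {"item": "200001", "description": "Grease",  "qty": 0.025, "uom": "kg",  "drawing": None},
    {"item": "200002", "description": "Paint",   "qty": 0.3,   "uom": "l",   "drawing": None},
]

# Only the design parts have a drawing to hand over to the ERP-system
drawings_for_erp = [row["drawing"] for row in hybrid_bom if row["drawing"]]
print(drawings_for_erp)  # ['D-0001', 'D-0002', 'D-0003']
```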

The ETO-product

Many companies that developed specialized machinery have started with a base product, from where they developed the custom solution – their IP. Next, with more and more customers, the original solution was extended by creating either new or changed capabilities.

I worked a lot with companies that moved to the full definition of their products in 3D CAD, creating a correct 3D CAD-structure per customer order. Instead of creating new BOM variants, companies were often tempted/forced to make the configuration inside the 3D CAD-model.

The 3D CAD vendor often provided functionality to have multiple configurations of the same part/product inside a single file. A nice feature for designers as there are fewer files to maintain, however, a crime for data management.

Every time one of the configurations of the part changed, or a new configuration was added, the file had to be revised.

And if the change was at level five of a 3D CAD-structure, many assembly files needed to be updated. This versioning problem illustrates the challenge of managing configurations inside a 3D CAD-file, while creating complexity for the PDM/PLM-system.

Last week Tech-Clarity published the highlights of their survey: Bringing Custom-Engineered Products to Market with a link to the full report, sponsored by Propel.

As you can imagine, this survey is more about PLM collaboration, breaking down the silos and acting agile. Unfortunately, the report does not expose the required methodologies, like modularity and the “common sense” engineering practices that we discuss here. Still worthwhile to read, as the report addresses precisely the type of companies I am referring to here.

Looking at the methodology of custom-engineered products, let us examine how their “best practice” from the past is blocking the future.

When a new customer request is coming in, sales engineering is looking for the best match of delivered products. Hopefully, 80-90 % remains the same, and engineering has to focus only on the differences.

First, the best-match 3D CAD-structure is copied to a new project. As you can see, most 3D CAD-systems provide the functionality to create a derived structure from an original 3D CAD-structure. From there, a traditional ETO-process starts as described at the beginning of this post. We complete the 3D CAD-structure with manufacturing in mind, generate the BOM and drawings, and we can deliver. In the case of purchased parts, the supplier part number is often already stored in the 3D CAD-structure and ends up in the generated BOM, as we are focusing on this single delivery.

The disadvantage of this approach is that, in theory, we have to check whether the structure we reuse is really the best one so far; otherwise, we introduce errors again.

The second disadvantage is that if one supplier part in the structure becomes obsolete and needs to be revised, the company has to go through all the 3D CAD-structures to fix it.

Also, having supplier parts in the 3D CAD-structure makes it more difficult to standardize, as the chosen supplier part matched the criteria for that customer at that time. Will it match the criteria also in other situations?

From ETO to BTO to CTO

Many companies that started with custom-engineered products, the ETO-approach, want to move towards a Configure To Order (CTO) approach – or, if that is not possible, at least Build To Order (BTO). More reuse, less risk, instead of creating a new solution for each next customer, as discussed before.

This is not a mission impossible; however, often, I have seen that companies do not set the right priorities to move towards a configure to order environment. There are a few changes needed to become a configure to order company (if possible):

  1. Analyze your solution and define modules and options. Instead of defining a full solution, the target now is to discover a commonality between the various solutions. Based on commonality, define modules and options in such a manner that they can be used in different situations. Crucial for these modules is that there is a standard interface to the rest of the product. Every company needs to master this specific methodology for their products
  2. Start defining products from a logical structure, describing how products, modules and options are compatible and which combinations are allowed (or preferred) – see the sketch after this list. For companies that are not familiar with a logical structure, often a configured EBOM is used to define the solutions. Not the optimal way; however, this was the first approach most companies took ten years ago. I will explain the configured EBOM below.
  3. A product definition and its modules now should start from a real EBOM, not containing manufacturing characteristics. The EBOM should represent the logical manner of how a product is defined. You will notice this type of EBOM might be only 2 – 3 levels deep. At the lowest level, you have the modules that have their own lifecycle and isolated definition.
  4. You should no longer use supplier part numbers in your EBOMs, as the engineering definition of a module or option should not depend on a single supplier over time. We will discuss the relation between EBOM parts and the Approved Manufacturer List (AML) in the next post.
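As referred to in point 2 above, here is a minimal sketch of what a logical structure with modules, options and compatibility rules could look like. This is purely my illustration under assumed names (module codes, interface IDs, rules); real configurators and PLM logical structures are considerably richer, but the principle of standard interfaces plus allowed combinations is the same.

```python
# Modules/options with a standard interface (point 1) and a logical product
# definition with compatibility rules (point 2). All names are illustrative.
modules = {
    "FRAME-S":   {"interface": "IF-FRAME-V1"},
    "FRAME-L":   {"interface": "IF-FRAME-V1"},
    "MOTOR-2KW": {"interface": "IF-DRIVE-V2"},
    "MOTOR-5KW": {"interface": "IF-DRIVE-V2"},
}

product_family = {
    "slots": {"frame": ["FRAME-S", "FRAME-L"], "drive": ["MOTOR-2KW", "MOTOR-5KW"]},
    "rules": [("FRAME-S", "MOTOR-5KW", "not_allowed")],   # allowed/preferred combinations
}

def validate(config: dict) -> bool:
    """Check a chosen configuration against the defined options and rules."""
    for slot, choice in config.items():
        if choice not in product_family["slots"][slot]:
            return False
    chosen = set(config.values())
    return not any(a in chosen and b in chosen and verdict == "not_allowed"
                   for a, b, verdict in product_family["rules"])

print(validate({"frame": "FRAME-S", "drive": "MOTOR-2KW"}))  # True
print(validate({"frame": "FRAME-S", "drive": "MOTOR-5KW"}))  # False – not allowed
```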

To conclude for today

Changing from ETO to CTO requires modularity and a BOM-driven approach. Starting from a 3D CAD-structure can still be done for the lowest levels – the modules, the options. In a configure to order process, it might not be relevant anymore to create a full 3D-representation of the product.

However, when we look forward, it would be greatly beneficial to have the 3D-representation of every specific solution delivered. This is where concepts such as augmented/virtual reality and digital twin come in.

Next time, more on the BOM-structures – we have just touched on the emergence of the EBOM – enough to clarify in the coming week(s).

In my last post related to Learning from the past to understand the future, I discussed what happened when 3D CAD became available for the mid-market. In the large automotive or aerospace & defense companies, 3D CAD was introduced along a path of defining processes and selecting tools. In the mid-market, 3D CAD started from the other side, first as a productivity tool, without thinking further about changing methodologies or processes.

This approach of starting with 3D CAD without changing processes has created several complexities. Every company that is aiming to move towards a digital future needs to reduce complexity to remain competitive. Now let us focus on the relation between the 3D CAD-structure and a BOM.

The 3D CAD-structure

When building a product in a 3D CAD system, the concept is that you have individual parts designed in 3D.  Every single part has a unique identifier.

If possible, the (file) name would equal the physical part number.

Next, a group of parts could be stored as a subassembly. Such an assembly is sometimes called a phantom assembly, in case it only groups together several 3D parts. The usage of this type of assembly increased CAD productivity. For data management reasons, these assemblies need to have a unique identifier, preferably not using the same numbering scheme as physical part numbers, as that would consume part numbers that are never used during manufacturing.

Note: in the early days of connecting 3D CAD to ERP, there was a considerable debate about which system could generate the part number.

ERP had always been the leading system for part definition, so why change? And why generate part numbers that might not be used later in production? “Wasting” part numbers was considered bad practice, as historically the part number was like a catalog number: 6 to 7 digits.

Next, there is also another group of subassemblies that represent one or more primary components of a product – for example, a pump assembly that might be the combination of the pump, the motor, and the base frame. This type of assembly appears most of the time high in the CAD-structure. Regarding the required identifier, they can be treated like phantom assemblies too.

Finally, there might be parts in the CAD-structure that will not exist in reality as separate parts but are created during the manufacturing process. Sheet metal parts are created during manufacturing. Cappings, strips and cables shown in the CAD-structure might come from materials that are purchased in standardized sizes (1 meter / 2 meter / 10 meter) and need to be cut during manufacturing. Here the instances in the CAD-structure will have a unique identifier. What type of identifier to use depends on the manufacturing process. It might be a physical part number, as it is a semi-finished part (half-fabricate), or it remains a unique identifier for the CAD-structure only.
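To illustrate the identifier question, here is a small sketch of one possible approach: keeping CAD-only identifiers (phantom assemblies, cut-from-stock instances) in a separate range so they do not consume physical part numbers. The prefixes and counters are assumptions for illustration, not a recommendation of a specific numbering scheme.

```python
import itertools

# Separate counters: physical parts (Items/Materials) versus CAD-only identifiers
# for phantoms and cut-to-length instances that never reach manufacturing as-is.
_part_counter = itertools.count(100001)   # "real" part numbers
_cad_counter = itertools.count(1)         # CAD-structure-only identifiers

def new_physical_part_id() -> str:
    return str(next(_part_counter))           # e.g. "100001"

def new_cad_only_id() -> str:
    return f"CAD-{next(_cad_counter):06d}"    # e.g. "CAD-000001"

print(new_physical_part_id(), new_cad_only_id())  # 100001 CAD-000001
```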

The reason I am coming back to these identifiers is that as described before, companies wanted to keep a relation between the part number and the file name.

There was a problem with flexible parts. A rubber hose with a specific length could be shaped differently in an assembly based on its connections. Two different shapes would create two files and therefore break the rule that a part number equals the file name. The 3D CAD vendors “solved” this issue by storing configurable views of the same part inside one file and allowing the user to select the active view.

Later we will see that managing views inside the 3D CAD model is not a wrong choice. This is contrary to managing different configurations of a part/product inside a single file, which creates complexity in the PLM domain.

In the end, the product became an assembly with several levels of subassemblies. At that time, when I worked a lot with CAD-integrations, the average depth of 3D CAD-structures was 6 to 7 levels deep, with exceptions in both directions.

The entire product CAD-structure is mainly used for a final digital mock-up, to allow engineers to analyze the full product behavior.  One of my favorite YouTube movies is the one from Airbus – seven years ago, they described the power of a full digital mock-up used for the A380.

In ETO-processes, the 3D CAD-structure is unique for a given customer solution – like the A380.

In the case of large assemblies with a lot of parts and subassemblies, there were situations where the full product could not be resolved anymore. For Airbus a must, for the mid-market not always easy to reach. Graphics memory, combined with the way graphics were represented, was the major constraint. This performance issue has been resolved in the gaming world; however, there the 3D representation no longer has the required accuracy or definition.

The Version pop-up problem

Working with a 3D CAD structure created a new problem when designers were sharing parts and assemblies between themselves and suppliers. The central storage of the files required a versioning mechanism, supported by a check-in and check-out mechanism.

Depending on the type of 3D CAD integration, the PDM system generated a new minor revision of the file after each check-in. In this way, there was full traceability of the changes before release. The image below shows an example of how SmarTeam was dealing with minor and major revisions combined with lifecycle stages.

When revising a part, all assemblies that contain the changed part need to be updated too, if you want to have traceability and prevent others from overwriting your version – making sure each assembly file points to the right file version again. In the case of a 6-level-deep CAD-structure, this has led to a lot of methodology problems on how to deal (or not to deal) with file changes.
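To show why this ripples so far, here is a small sketch of the update cascade: given a (hypothetical) used-in mapping from each CAD file to its parent assemblies, revising one part means every assembly on the path to the top has to be touched again so it points to the new file version. The file names and depth are illustrative only.

```python
# used_in: for each CAD file, the assemblies that reference it (illustrative data)
used_in = {
    "bolt.prt":    ["bracket.asm"],
    "bracket.asm": ["frame.asm"],
    "frame.asm":   ["machine.asm"],
    "machine.asm": [],
}

def assemblies_to_update(changed_file: str) -> list:
    """Collect all parent assemblies affected by a file change, bottom-up."""
    affected, queue = [], list(used_in.get(changed_file, []))
    while queue:
        parent = queue.pop(0)
        if parent not in affected:
            affected.append(parent)
            queue.extend(used_in.get(parent, []))
    return affected

print(assemblies_to_update("bolt.prt"))
# ['bracket.asm', 'frame.asm', 'machine.asm'] – three check-outs/check-ins for one changed part
```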

In the case of a unique delivery for a customer, the ETO-process, the issue might not be so big. As everything in the 3D CAD-structure is work in progress, you only need to be sure during the release process of the 3D CAD-structure that all parts and assemblies are resolved to the latest version (and verified)

Making changes on an existing product is way more complicated, as assemblies are released, and parts exist in production.  In that case, the Bill of Material is the leading structure to control the versions and the change impact, as we will see.

Note: Most CAD- and PLM-vendors loved to show you their demos where, starting from the CAD-structure, a product gets created (the ETO-process). The reality is that most companies do not start from the CAD-structure but from an existing Bill of Material. In 2010, I wrote a few posts discussing the relation between CAD and the BOM, to explain there is more than a CAD-driven scenario.

The EBOM

In most PDM-systems with CAD-integrations, it is possible to create a Bill of Materials from the 3D CAD-structure. The Bill of Materials will be based on the parts inside the 3D CAD-structure. There is often the option to filter out phantom assemblies.

The structures are not the same. The 3D CAD-structure is instance-based, whereas the extracted Bill of Materials summarizes the part quantities on the same level. See the image below: there are four Wheel instances in the CAD-structure, while in the EBOM-structure we have only one Wheel reference with quantity 4.
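For readers who like to see the mechanics, below is a minimal sketch of the kind of extraction a PDM-CAD integration performs: walk the instance-based structure, dive through phantom assemblies without counting them, and summarize the quantities of identical parts. The tuple-based structure and the names are illustrative; commercial integrations of course work on the real CAD data model.

```python
from collections import Counter

# Each node: (name, is_phantom, children); leaves have an empty child list.
cad_structure = (
    "PRODUCT", False, [
        ("CHASSIS", False, []),
        ("WHEEL-GROUP", True, [("AXLE", False, []), ("WHEEL", False, []), ("WHEEL", False, [])]),
        ("WHEEL-GROUP", True, [("AXLE", False, []), ("WHEEL", False, []), ("WHEEL", False, [])]),
    ],
)

def extract_ebom(node, counts=None):
    """Summarize instance counts per part name, skipping phantom assemblies."""
    counts = Counter() if counts is None else counts
    _, _, children = node
    for child in children:
        name, is_phantom, _grandchildren = child
        if is_phantom:
            extract_ebom(child, counts)   # phantom: do not count it, keep walking
        else:
            counts[name] += 1             # real part (or subassembly) instance
    return counts

print(dict(extract_ebom(cad_structure)))  # {'CHASSIS': 1, 'AXLE': 2, 'WHEEL': 4}
```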

I named the structure on the right the EBOM, as the structure represents the Bill of Materials from the engineering point of view. This definition is a little arbitrary, as we will see. In companies that started to develop products based on a conceptual BOM, this conceptual BOM was often an “early” EBOM that had to be developed further. Such an EBOM represented more of a logical or modular structure driving the design, rather than an extract from the 3D CAD-structure. In the next post, I will zoom in on these differences. I want to conclude this time with a critical methodology needed to manage 3D CAD-structure changes in relation to an EBOM.

Breaking the rule Drawing ID (Model ID) = Part ID

Although I have been writing mostly about the 3D CAD-structure, I want to remind you that the 3D Model in the mid-market is mainly used for design purposes. The primary deliverable for manufacturing or a supplier is still a 2D Drawing for most companies. The 3D Model might be “nice to have” for CAM or quality usage. Still, in case of a dispute, the 2D Drawing will be leading.

For that reason, in many mid-market companies, there was the following relation below:

In an environment without file versioning through check-in/check-out, this relation was easy to maintain. In the electronic world, every change in the 3D Model (which could be an assembly) triggers a new file version and, therefore, most of the time, a new version of the drawing and the physical part. However, you do not want to have a physical part with many revisions, in particular when this part could be again part of a Bill of Material.

To solve this issue, the Physical Part and the related Drawing/Model should have different lifecycles. The relation between the Physical Part and the Drawing/Model should no longer be based on numbers but on a relation in the PDM/PLM-system. One of the main characteristics of a PDM/PLM-system is that it allows users to navigate through relations to find information in context. For example, solving a Where Used question takes a (few) mouse-click(s) in a PDM/PLM-system.
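A minimal sketch of what this relation-based approach means in data terms, with illustrative identifiers and attributes (not a real PDM schema): the Part and its Drawing/Model are separate records with their own revisions, connected by relations, and a Where Used question becomes a lookup over the usage relations instead of a numbering convention.

```python
# Separate lifecycles: the part and its documents revise independently.
parts = {"ITEM-100003": {"description": "Wheel", "revision": "B"}}
documents = {
    "D-0003": {"type": "Drawing",  "revision": "C"},
    "M-0003": {"type": "3D Model", "revision": "E"},
}

# Relations instead of shared numbers
defined_by = [("ITEM-100003", "D-0003"), ("ITEM-100003", "M-0003")]
used_on = [("EBOM-CAR-2020", "ITEM-100003"), ("EBOM-TRAILER-2021", "ITEM-100003")]

def where_used(part_id: str) -> list:
    """Navigate the usage relations – the (few) mouse-clicks in a PDM/PLM-system."""
    return [parent for parent, child in used_on if child == part_id]

print(where_used("ITEM-100003"))  # ['EBOM-CAR-2020', 'EBOM-TRAILER-2021']
```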

Click on the image to see the details.

Breaking this one-to-one numbering rule is a must if you want to evolve to an item-centric or data-driven PLM-environment. When to introduce this change and how to implement this new behavior is a methodology exercise, not an implementation of a new tool.

There is a lot to read about this topic, as it is related to the Form-Fit-Function discussion we had earlier this year. A collection of information can be found in two LinkedIn posts, where the comments provide the insights.

 

I will not dive deeper into this theme (reached 1700 words ☹) – next time I will zoom in on the EBOM and leave the world of 3D CAD behind (for a while)

 

 

To understand our legacy in the PLM-domain – what types of practices we created – I started this series of posts: Learning from the past to understand the future. My first post (The evolution of the BOM) focused on the disconnected world between engineering – the generation of drawings as a deliverable – and execution in MRP/ERP – the first serious IT-systems in a company.

At that time, due to minimal connectivity, small and medium-sized companies had, most of the time, an informal connection between engineering and manufacturing. I remember a statement from that time, when PLM had just been introduced. One person during a conference claimed:

“You guys make our lives so difficult with your systems. If we have a problem, we gather around the machine, and we fix it.”

PLM started at large enterprises

Of course, large enterprises could not afford such behavior as they operate globally. The leading enterprises for PDM/PLM were the Aerospace & Defense and Automotive companies. They needed consistent processes and formal ways of working to guarantee quality output.

In that sense, I was happy with the reaction from Jean-Jacques Urban-Galindo, who shared in the LinkedIn comments a reference to a relevant chapter of John Stark’s PLM book: a pdf describing the evolution of CAD / PDM / PLM at PSA. Jean-Jacques was responsible at that time for the re-engineering of the Product & Process Engineering processes using digital tools (CAD/CAM, DMU, and more).

Read the PSA story here: PLM at GROUPE PSA. It describes nicely where 3D CAD and the EBOM are coming in. In a large enterprise like PSA, the need for tools is driven by the processes. When you read it to the end, you will also see the need for a design and a manufacturing view. A topic I will touch on in future posts too.

The introduction of 3D CAD in the mid-market

Where large automotive and aerospace companies had already invested in (expensive) 3D CAD hardware and software, for the majority of midsize companies the switch from 2D CAD (mainly AutoCAD) towards 3D CAD (SolidWorks, Solid Edge, Inventor) started at the end of the 20th century.

It was the time that Microsoft Windows NT became a serious platform beside the existing mainframe- and mini-computer-based CAD-systems. The switch to PCs went so fast that the disruption of DEC (Digital Equipment Corporation) is one of the cases discussed by Clayton Christensen in his groundbreaking book The Innovator’s Dilemma.

3D CAD brought a lot of new capabilities, like DMU (Digital Mock-Up) for clash detection and, above all, a better understanding of a product’s behavior. It also introduced a new set of challenges to be resolved.

For example, the concept of reusing 3D CAD parts. Mid-market companies, most of the time, are buying productivity tools. Can I design my product faster and with higher quality in 3D instead of using only the 2D definitions?

Mid-market companies usually do not redesign their business processes – there are no people available for strategy – and the pain of lacking a strategy is felt in a different way than in large enterprises. This is a crucial differentiator for the future of PLM.

Reuse of (3D) CAD parts / Assemblies

In the 2D CAD world, there was not so much reuse of CAD parts. Standard parts were saved in libraries or generated on demand by parametric libraries. Now with 3D CAD, designers might spend more time to define the part. The benefits come from the reuse of small sub-assemblies (modules) into a larger product assembly. Something not relevant in the 2D CAD world.

As every 3D CAD part had to have a file name, it became difficult to manage the file names without a system. How do you ensure that the file with the name Part01.xxx is unique? Another designer might also create an assembly, where the 3D CAD tool would suggest Part01.xxx as the name. And what about revisions? Do you store them in the filename, and how do you know you have the correct and latest version of the file?

Companies already had naming rules for drawings, often related to the part’s usage, similar to the “intelligent” numbers I mentioned in my previous post.

With 3D CAD it became a little more complicated as now in electronic formats, companies wanted to maintain the relation:

Drawing ID = Part ID = File Name

The need for a PDM-system

If you look at the image on the left, which I found in one of my old SmarTeam files, there is a part number combined with additional flags A-A-C, which also have a meaning (I don’t know ☹ ), and a description.

 

The purpose of these meaningful flags was to maintain the current ways of working. Without a PDM-system, parts of the assembly could be shared with an OEM or a supplier. File-based 3D CAD without using a PDM-system was not a problem for small and medium enterprises.

The 3D CAD-system maintained the relations in the assembly files, including relations to the 2D Drawings. Despite the introduction of 3D CAD, the 2D Drawing remained the deliverable that the rest of the company, or the supply chain, was waiting for – preferably a drawing containing a parts list and balloon numbers, the same as had been done before. Why would you need a PDM-system?

PDM for traceability and reuse

If you were working in your 3D CAD-system for a single product, or on individual projects for OEMs, there was no significant benefit from a PDM-system. All deliverables needed by the engineering department were in the 3D CAD environment. Assembly files and drawing files are already like small databases, containing references to the source files of the parts (image above).

A PDM-system at this stage could help you build traceability and prevent people from overwriting files. The ROI for this part only depends on the cost and risks of making mistakes.

However, when companies started to reuse parts or subassemblies, there was a need for a system that could manage the 3D models separately. This had an impact on the design methodology.

Now parts could be used in various products. How do you discover parts for reuse, and how do you know you have the latest released version? For sure, their naming can no longer be related to a single product or project (a practice still used a lot).

This is where PDM-systems came in. Using additional attributes per file, combined with relations between parts, allowed companies to structure and deliver more details related to a part. A detailed description for internal usage, a part type (classification), and the part material were commonly used attributes. And not to forget the status and revision.

For reuse, it was important that the creators of content had a strategy to define a part for future reuse or discovery. Engineers were not used to providing such services; filling in data in a PDM-system was seen as overhead – bureaucracy.

As they were measured on the number of drawings they produced, why do extra work with no immediate benefits?

The best compromise was to have the designer fill in properties in the CAD-file when creating a part. The CAD-integration with the PDM-system could then be used to fill the attributes in the PDM-system.
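As an illustration of that compromise, here is a small sketch of how such a property-to-attribute mapping could work. The property names, attribute names and the mapping are assumptions for this example; every CAD/PDM combination has its own conventions and configuration.

```python
# Properties as typed by the designer inside the CAD-file
cad_file_properties = {
    "PartNumber":  "100003",
    "Description": "Wheel, front",
    "Material":    "AlMg3",
    "PartType":    "Machined",
}

# Mapping configured in the CAD-PDM integration
property_to_attribute = {
    "PartNumber":  "item_id",
    "Description": "description",
    "Material":    "material",
    "PartType":    "classification",
}

def to_pdm_record(props: dict) -> dict:
    """Translate CAD file properties into PDM attributes at check-in."""
    return {property_to_attribute[k]: v for k, v in props.items()
            if k in property_to_attribute}

print(to_pdm_record(cad_file_properties))
# {'item_id': '100003', 'description': 'Wheel, front', 'material': 'AlMg3', 'classification': 'Machined'}
```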

This “beautiful” simple concept later led to a lot of complexity.

Is the CAD-model the source of data, meaning designers should always start from CAD when designing a product? If someone adds or modifies data in the PDM-system, should we open the CAD-file to update some properties? Changing a file means a new version. And what happens if the CAD-file is released, and I update some connected attributes in PDM?

To summarize this topic: companies missed the opportunity here to implement data governance; however, none of the silos (manufacturing preparation, service) recognized the need. Implementing new tools (3D CAD and PDM) did not affect the company’s way of working.

Instead of people, processes and tools, the only focus was on new tools and on satisfying the people within the same process.

Of course, when introducing PDM, which happened for mid-market companies at the beginning of this century, there was no PLM vision. Talking about lifecycle support was a waste of time for management. As we will discover in future posts, large enterprises and small and medium enterprises have the same PLM needs. However, there is already a fundamentally different starting point: where large enterprises are analyzing and designing business processes, small and medium enterprises are buying tools to improve their current ways of working.

The Future?

Although we have many steps to take in the upcoming posts, I want to raise your attention to an initiative from the PLM Interest Group together with Xlifecycle.com. The discussion is about what will be PLM’s role in digital transformation.

As you might have noticed, there are people saying the word PLM is no longer covering the right context, and all kinds of alternatives have been suggested. I recommend giving your opinion without my personal guidance. Feel free to answer the questionnaire, and we will be all looking forward to the results.

Find the survey here: Towards a digital future: the evolving role of PLM in the future digital world

 

Conclusion

We are going slowly, discovering in this post the split in strategy between large enterprises (process focus) and small and medium enterprises (tool focus) when introducing 3D CAD. This different focus, at this time for PDM, is one of the reasons why vendors create functions and features that still require a methodology – however, who will provide that methodology?

Next time more on 3D CAD structures and EBOM

In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting the use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities need to be developed on top of a customizable infrastructure, providing more flexibility. I believe there has been a debate about this topic for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD-data. Bill of Materials handling in PLM was often at a basic level, as either the ERP-system (mostly Aerospace/Defense) or home-grown BOM-systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were spent to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, there was the first development of OOTB PLM. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industry. Instead of document management, they focused on bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB-delivery, as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as their unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers another implementer as a competitor and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same happens currently in the Aras channel – it might be called Open Source; however, probably only the high-level infrastructure is.

Around 2006 many of the main PLM-vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner – for two reasons: the customer most of the time had different current practices, and secondly, the money came from services. So why say No to a customer if you can say Yes?

OOTB and modules

Initially, the mid-market templates of the leading PLM vendors were not just aimed at the mid-market. All companies wanted to have a standardized PLM-system with as few customizations as possible. This meant for the PLM vendors that they had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes interface areas (CAD and ERP integrations), or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they needed to deliver data model capabilities combined with business logic/behavior; otherwise, the module would not be relevant. And this causes a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to more generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is the fact that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, they are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability or extendibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in future needed capabilities, for example, digital twins, AR/VR, and model-based ways of working. Some skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages will be advantages in the toolkit approach, therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on some leading customer experiences (the big ones), whereas the majority of usage in the field is in smaller companies where people have multiple roles – the typical SMB situation. SMB implementations are often not visible at the PLM vendor R&D level, as they are hidden behind the Value Added Reseller network and/or are usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you have the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford reputedly said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations; however, the implementation and the methodology to realize this innovative way of working need to come from either your company’s resources or your implementer’s skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices in order to stay more competitive. This approach can be valid if you truly have a competing plan here. Otherwise, the risk might be that you are creating a legacy for your company that will slow you down later in time.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into a situation where the customer blamed SmarTeam because customizations were possible – you can read about this example in an old LinkedIn post: the importance of a PLM data model

Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as it is not their core business to implement PLM. Experienced PLM implementers vary in size and skills – make the right choice.

 

Conclusion

After writing this post, I still cannot give a final verdict on which approach is the best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The deciding points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.

 

 

 


Two weeks ago, I got this message from WordPress, reminding me that I started blogging about PLM on May 22nd, 2008. During some of my spare time in the weekends, I began to read my old posts again and started to fix links that had disappeared.

Initially, when I started blogging, I wanted to educate mid-market companies about PLM. A sentence with a lot of ambiguities. How you define the mid-market and how you define PLM are already a good start for a boring discussion. And as I do not want to go into such a discussion, here are my “definitions”.

Warning: This is a long post, full of generalizations and a conclusion.

PLM and Mid-market

The mid-market companies can be characterized as having a low level of staff for IT and strategic thinking. Mid-market companies are do-ers, and most of the time they are good in their domain, based on their IP and their flexibility to deliver this to their customer base. I did not meet mid-market companies with a business vision of 5 years and beyond. Mid-market companies buy systems. They bought an ERP system 25-30 years ago (the biggest trauma at that time). They renewed their ERP system for the Y2K problem/fear, and they switched from the drawing board towards a 2D CAD system. Later they bought a 3D CAD system, introducing the need for a PDM system to manage all data.

PLM is for me a vision, a business approach supported by an IT-infrastructure that allows companies to share and discover and connect product related information through the whole lifecycle. PLM enables companies to react earlier and better in the go-to-market process. Better by involving customer inputs and experience from the start in the concept and design phases. Earlier thanks to sharing and involving other disciplines/suppliers before crucial decisions are made, reducing the amount of iterations and the higher costs of late changes.

Seven years ago, I believed that a packaged solution, combined with a pre-configured environment and standard processes, would be the answer for mid-market companies. The same thought PLM vendors currently have with a cloud-based solution: take it, use it as it is, and enjoy.

Here I have changed my opinion over the past seven years. Mid-market companies consider PLM as a more complex extension of PDM and still consider ERP (and what comes with that system) as the primary system in the enterprise. PLM in mid-market companies is often seen as an engineering tool.

LESSON 1 for me:
The benefits of PLM are not well-understood by the mid-market

To read more:

PLM for the mid-market – mission impossible?

PLM for the SMB – a process or culture change ?

Culture change in a mid-sized company – a management responsibility

Mid-market PLM – what did I learn in 2009 ?

Implementing PLM is a change not a tool

Mid-market deadlocks for PLM

Who decides for PLM in a mid-market company ?

More on: Who decides for PLM in a mid-market company ?

Globalization and Education

In the past seven years, globalization became an important factor for all types of companies. Companies started offshoring labor-intensive work to low-labor-cost countries, introducing the need for sharing product data outside their local and controlled premises. In addition, there were acquisitions by larger enterprises and by some of the dominant mid-market companies; these acquisitions introduced a new round of rethinking. Acquisitions introduced discussions such as: what are the real best practices for our organization? How can we remain flexible, while adapting and converging our business processes to be future-ready?

Here I saw two major trends in the mid-market:

Lack of (PLM) Education

To understand and implement the value of PLM, you need to have skills and an understanding of more than just a vendor-specific PLM system. You need to understand the basics of change processes (Engineering Change Request, Engineering Change Order, Manufacturing Change Order and more). And you need to understand the characteristics of a CAD document structure, a (multidisciplinary) EBOM, the MBOM (generic and/or plant-specific) and the related Bill of Processes. This education does not exist in many countries, and people are (mis-)guided by their PLM/ERP vendor, explaining why their system is the only system that can do the job.

Interestingly enough, the most-read posts on my blog are about the MBOM and the ETO, BTO and CTO processes. This illustrates there is a need for a proper, vendor-independent and globally accepted terminology for PLM.

Some educational posts:

Bill of Materials for Dummies – ETO  ranked #1

ECR/ECO for Dummies ranked #2

BOM for Dummies – CTO  ranked #4

BOM for Dummies: BOM and CAD  ranked #7

BOM for Dummies – BTO

Where does PLM start beyond document management ?

The dominance of ERP

As ERP systems were introduced long before PLM (and PDM), these systems are often considered by the management of a mid-market company as the core. All the other tools should preferably be seen as an extension of ERP and, if possible, let´s implement the ERP vendor´s functionality to support PLM – the Swiss knife approach – one tool for everything. This approach is understandable, as at the board level there are no PLM discussions. Companies want to keep their “Let´s do it”-spirit and not reshuffle or reorganize their company according to modern insights of sharing. Strangely enough, in many businesses you see the initiative to standardize on a single ERP system first, instead of standardizing on a single PLM approach first. PLM can bring the global benefits of product portfolio management and IP-sharing, where ERP is much more about local execution.

LESSON 2:
PLM is not understood at the board level, still considered as a tool

Some post related to PLM and ERP

Where is the MBOM ?  ranked #3

Connecting PLM and ERP (post 1)(post 2)(post 3) ranked #8

Can ERP vendors do PLM ?

PLM and ERP – the culture change

PLM and ERP – continued

5 reasons not to implement PLM – Reason #3 We already have an ERP system

The human factor

A lot of the reasons why PLM struggles to become successful have to do with its broad scope. PLM has an unclear definition and, most importantly, PLM forces people to share data and work outside their comfort zones. Nobody likes to share by default. Sharing makes day-to-day life more complicated; sharing might create visibility on what you actually contribute or fix. In many of my posts, I described these issues from various viewpoints: the human brain, the innovator’s dilemma, the way the older generation (my generation) was raised and is used to working. Combined with the fact that many initial PLM/PDM implementations have created so many legacies, the need to change has become a risk. In the discussion and selection of PLM, I have seen many times that in the end a company decides to keep the old status quo (with new tools) instead of really having the guts to move toward the future. Often this was a result of investors not understanding (and not willing to see) the long-term benefits of PLM.

LESSON 3:
PLM requires a long-term vision and understanding, which most of the time does not fit current executive understanding (lack of education/time to educate) and priority (shareholders)

Many recent posts are about the human factor:

The Innovator´s dilemma and PLM

Our brain blocks PLM acceptance

PLM and Blockers

The PLM paradox for 2015

PLM and Global Warming

Τα πάντα ρεί

PLM is doomed, unless ……

How to get users excited or more committed to a new PLM system?

The digital transformation

The final and most significant upcoming change is the fact that we are entering a completely new era: from linear and predictable towards fast and iterative, meaning that the classical ways we push products to the market will become obsolete. The traditional approach was based on lessons learned from mechanical products after the Second World War. Now, through globalization and the importance of embedded software in our products, companies need to deliver and adapt products faster than the classical delivery process allows, as their customers have higher expectations and a much larger range to choose from. The result of this global competitiveness is that companies will change from delivering products towards a more and more customer-related business model (continuous upgrades/services). This requires companies to revisit their business and organization, which will be extremely difficult. Business change and human change require new IT concepts – platforms? cloud services? big data?

Older enterprises, both mid-market and large, will be extremely challenged to make this change in the upcoming 10 years. It will be a matter of survival, and I believe the Innovator´s Dilemma applies here the most.

LESSON 4:
The digital transformation is apparent as a trend for young companies and strategic consultants. This message is not yet understood at the board level of many businesses.

 

Some recent post related to this fast upcoming trend:

From a linear world to fast and circular ?

Did you notice PLM is changing?

Documents or Intelligent Data ?

The difference between files and data-oriented – a tutorial (part 1)(part 2)(part 3)

PLM is dead, long live …… ?

PLM, Soccer and game changing

PLM and/or SLM? – (part 1)(part 2)

Breaking down the silos with data

ROI (Return On Investment)

I also wrote about ROI – a difficult topic to address, as in most discussions related to ROI, companies are talking about the costs of the implementation, not about the tremendously larger impact a new business approach or model can have once enabled through PLM. Most PLM ROI discussions are related to efficiency and quality gains, which are significant and relevant. However, these benefits are relatively small and not comparable with the ability to change your business (model) to become more customer-centric and stay in business.

Some of the ROI posts:

To PLM or Not to PLM – measuring the planning phase  ranked #5

Free PLM Software does not help companies  ranked #6

PLM: What is the target?

PLM selection–additional thoughts

PLM Selection: Proof Of Concept observations

Where is my PLM Return On Investment (ROI) ?

A PLM success story with ROI

Conclusion

A (too) long post this time; however, perhaps a good post to mark 7 years of blogging and to use as a reference for the topics I briefly touched on here. PLM has many aspects. You can do further reading through the links.

From the statistics, it is clear that the education part scores best – see the rankings. For future posts, let me know in a comment what you are looking for in this blog: PLM Mid-Market, Education, PLM and ERP, Business Change, ROI, Digitalization, or …??

Also, I have to remain customer-centric – thanks for reading and providing your feedback.


Above Image courtesy of the marketoonist.com – Tom Fishburne
Image related to digital transformation: The Economist – the onrushing wave

The past two years I have been blogging about PLM, with a special focus on the mid-market. My previous post was about PLM selection (which PLM to choose), and thanks to Oleg (How To Choose PLM? (Visual guide)) this became a broader discussion. It made me realize that although we are all talking about PLM, I am not sure if we all have the same opinion about the mid-market.

To be aligned, here is my previous definition of the mid-market:

Mid market company: For me the definition of a mid-market company does not have to do with revenue or the amount of people working for this company. I characterize a mid-market company as a company, where everyone has a focus on the company’s primary process. There is no strategic layer of people, who are analyzing the current business and defining new strategies for the future. In addition, the IT-staff is minimal, more seen as an overhead than as strategic. Mid-market companies have their strength in being flexible and reacting fast on changes, which might contradict with a long term strategic approach.


Now I am curious about your opinion. Therefore, I published a small questionnaire on a Belgian website to get quick feedback, and I am looking forward to your response. Although I do not consider it scientific research, your (anonymous) response will enable me to review my opinion and to focus on some specific topics.

Please take the time to answer this questionnaire via the link below:

PLM for the mid-market – your opinion

Thanks for your feedback; I will publish the results at the end of October.

Jos Voskuil

The last weeks have been busy, and I have seen various PLM candidates all around Europe. As these companies were mid-market companies, I noticed again how difficult it is for these companies to follow the ideal path towards PLM.

Those reading my blog frequently might remember my definitions of mid-market and PLM. For newer readers, I will give my definitions again, as everyone has their own definition.

Mid market company: For me the definition of a mid-market company does not have to do with revenue or the amount of people working for this company. I characterize a mid-market company as a company, where everyone has a focus on the company’s primary process. There is no strategic layer of people, who are analyzing the current business and defining new strategies for the future. In addition, the IT-staff is minimal, more seen as an overhead than as strategic. Mid-market companies have their strength in being flexible and reacting fast on changes, which might contradict with a long term strategic approach.


Because what happens if you are only in a reactive mode – it can be too late.

(the boiling frog)

PLM: For me, PLM is not a product but a vision or business approach based on a collection of best practices (per industry). The main characteristics of PLM are centralizing all product knowledge (IP) throughout all the lifecycle stages, a focus on best practices, and immediate visibility into all lifecycle stages, combining concept, planning, development, production planning and after-sales/service into one integrated process. It is more than concurrent engineering; it is about sharing data and ownership of data across different departments. And this means business transformation, breaking through traditional barriers. Of course, PLM vendors have slightly different definitions in order to differentiate themselves from other vendors, for example, more focus on a virtual product definition (CAD PLM vendors) or a focus on efficiency and one single platform (ERP PLM vendors).


Who will initiate this change ?

And these two definitions already raise the questions I want to reflect on here, as I experienced again in two recent visits where the pain of moving to PLM is present.

First, what is the result of a reactive mode, even when it is a quick reaction?

A reactive mode leads to a situation where a company will never be able to differentiate itself rapidly from its competition. As every change takes time to implement, it is logical that a real business change will not be implemented as a quick reaction. The company needs to have a long-term vision. And this is one of the things I noticed talking with mid-market companies. Ask these questions: “Where do you want to be five years from now?” and “How do you make sure you achieve these goals (if goals exist)?” and often you find the company is depending on the business instinct of the founder(s) and has no real answers for the long-term future.

This is of course a result of the typical mid-market company: they have no internal people who will step outside the daily hustle and work on a change. And being reactive always means you are (a little) behind. And this was the situation in one of the companies that I visited recently. There was an initial understanding of the value that PLM could bring, but when talking about some of the basic principles of PLM, the answer was: in our company, ERP is God. This means real PLM has no chance – you do not want to fight against God.

 

 

And now the discussion about who can initiate the change towards PLM.

Now another example, of a mid-market company that had a long-term PLM vision but got trapped in its own approach. The company has been growing fast and, like many European companies, production is done in China. And this causes collaboration issues around communication and quality between Europe and China, as the company only knows CAD data management and ERP. The engineering manager was assigned to solve these issues. He did not get a full strategic assignment to look at the complete picture, but the management pushed him to solve the current pains, keeping the PLM wishes still in mind.

And solving the current pains leads again to a function / feature comparison with a short-term justification, believing that in the future everything will fit in the PLM vision, as the potential resellers for the new solution said: “Yes we can”. Have you ever heard a reseller say “No, we cannot”?

The result: the engineering manager has to make a decision based on the ‘blue eyes’ of the reseller, as he does not get the mandate and power from his management to analyze and decide on a PLM strategy for the long term. For one of the resellers, talking about the details of PLM was even a disadvantage, as it creates the impression that PLM is complex. It is easier to sell a dream. A similar situation as I described in my post: Who decides for PLM in a mid-market company.

My conclusion

Although I am aware that many mid-market companies implement the basics of PLM, it is frustrating to see that the lack of priority and understanding at the management level in mid-market companies blocks the growth towards the full benefits of PLM. The management is not to blame, as most PLM messages come either from the high-end PLM vendors or from product resellers, both not packaged for the mid-market. See PLM for the mid-market – a mission impossible?

PLM is a cross-departmental solution, and the management should look for partners who can explain the business value and share best practices for mid-market companies, business-wise.
The partner is 50 % of the success of a PLM implementation.

Do you recognize similar situations? How would you address them?


My PLM blog cloud based on Wordie – see the virtualdutchman blog cloud

Two weeks ago I received, through the PLM group on LinkedIn, the following question from Nathalie: “Do you know any specific examples of what some companies have done to get their users ready, excited or more committed to the new PLM system?”

When digging in my mind and planning to give a quick answer, I realized it was an interesting question with an embedded contradiction: users and excitement about a new PLM system.

This week I attended the SmarTeam User Group meeting in the Netherlands, where an excellent presentation was given by Simon and Hessel from a Dutch company called Meyn (poultry processing) about their PLM implementation. They shared their excitement!

Combined with an interesting discussion on Oleg’s blog with Frank, I believe I have the ingredients to answer the above question more completely.

PLM is not exciting for users

I think this is fact number one. When you go to tradeshows or PLM exhibitions, you usually see only 3D CAD demos; nobody tries to demonstrate PLM functions and features in detail. As a side note, I believe the best PLM system should be almost invisible to the user. Users want to work in their own environment with applications like CAD, Excel (BOM handling apps), Office, FEA tools, simulation tools and more.

ERP has a clearer value proposition: if you want to define and schedule your manufacturing and manage the financial transactions, everyone accepts that you need ERP. User acceptance is not relevant; users have to work with the provided interface, as otherwise production or accounting will fail – there is no alternative.

In contrast, the value and definition of PLM are not clear to users. For that reason, these users do not get excited when confronted with PLM. They have been surviving without PLM, so they believe there is an alternative.

 

But we know there are PLM benefits?

My previous post – PLM in the mid-market, a mission impossible? – led to a discussion with Oleg and Frank, who came with a new and interesting viewpoint. Frank mentioned that in the German area many mid-market companies do PLM without purchasing an enterprise PLM system from the known vendors.

The discussion focused on granularity, as all of us believed that a step-by-step approach towards PLM best practices, driven by people who understand the company very well, is the key to success. For this approach you need people inside the customer’s organization who can formulate the vision, assisted by consultants dedicated to that industry. It requires a different type of consultant than those active in the big enterprise projects.

Instead of implementing PLM as a standard process, in this approach the customer drives and leads the activities where they see benefits in their overall business process. To achieve this, the company must have a clear vision of where they want to be in the next 5 – 10 years.

Next, implementation steps should fit into this strategy and be prioritized based on different parameters – and these steps do not always have a focus on PLM.

And here lies the key to successful PLM implementations.

The implementation might be based on an academic approach around a core PLM data model and best practices. Mid-market offerings are built around an OOTB (Out-Of-The-Box) quick implementation – the PLM system/implementer leads.

Something the management likes to hear: quick and with little customization, which would translate into lower implementation costs and less disruption of the organization. But then the end-users start to complain. There is too much change to their standard way of working, and they do not see the advantages – keying more data into a system does not help them.

The introduction of PLM brings more complexity, and as the new system has to prove itself, there is no big enthusiasm from the average user. The management can push, like in the ERP situation, but in general the management is also anxious to learn whether this OOTB approach brings the benefits, and when it fails, they ask the vendor where the estimated ROI can be found.

Concluding: you will be lucky if users get excited by the OOTB approach.

In the second, granular approach, the company defines its strategy and vision, not necessarily a 100 % PLM vision. This strategy needs to be clear and shared with the employees in the company, especially with those who are affected by the changes.

Next, together with implementation partners, who bring in the know-how and possibly software tools, a part of the company’s process is addressed and improved. It can be in any area: changing the CAD engine, automating BOM handling, connecting sales to engineering, or connecting after sales/service to engineering.

Many of these areas of interest have different solutions: some are extensions of the CAD environment, some of the ERP environment, and some of the IT-platform used in the company.

This approach is not sold by the PLM vendors, as they want to introduce their system as the IT-platform, wrap it around the CAD and even capture the definition of the MBOM and the initiation of the Item master.

A step-by-step approach based on different granular components, every time in the direction of the company’s strategy, plus continuous feedback to the end-users on the positive impact of the change, is for me the key to success. In my previous post I was looking for a global provider of these required components.
With the step-by-step approach with granular solutions, we get users involved and excited.

 

And this brings me to the presentation from Meyn

The first time I got involved with Meyn was in October 2004. At that time they had chosen to move from their BaaN-2D CAD infrastructure to a new environment with BaaN – 3D CAD (CATIA). Simon presented their target strategy and vision: moving away from being an Engineering To Order company to become primarily a Configure To Order company.

ENOVIA SmarTeam was chosen to manage the 3D CAD and to connect the information to BaaN. Initially Meyn started with the classical PLM approach, but already after a few months the understanding was there that they needed a step-by-step approach, focused on results for the new CATIA users, without communicating around a complete PLM-focused project.

So they followed a stepped approach; they called the steps waves.

Moving from Engineering To Order to Configure To Order is not a software implementation. It requires rationalization of your products: converting them into modular, configurable parts. For this you need to be an engineering expert, not a software expert.

But when it comes to implementing this concept in the software, you need both experts. Through this collaboration, a methodology for skeleton design was established, driven by Meyn. And the reason the users were excited was that they were doing real engineering; the benefits were clearly visible.
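To make the idea of modular, configurable parts a bit more concrete, here is a minimal Python sketch of how a Configure To Order product structure could resolve a customer configuration into a BOM with quantities. This is not Meyn’s actual data model – the classes, option names, modules and part numbers below are invented for illustration only:

from dataclasses import dataclass
from typing import Dict

@dataclass
class Module:
    """A reusable, pre-engineered building block of the product family (hypothetical)."""
    name: str
    parts: Dict[str, int]  # part number -> quantity needed when this module is selected

@dataclass
class ProductFamily:
    """A Configure To Order product: each option maps to pre-defined modules (hypothetical)."""
    name: str
    options: Dict[str, Dict[str, Module]]  # option name -> choice -> module

    def configure(self, choices: Dict[str, str]) -> Dict[str, int]:
        """Resolve a customer configuration into a flat BOM (part number -> quantity)."""
        bom: Dict[str, int] = {}
        for option, choice in choices.items():
            module = self.options[option][choice]
            for part, qty in module.parts.items():
                bom[part] = bom.get(part, 0) + qty
        return bom

# Hypothetical machine family with a chassis size and a drive option
family = ProductFamily(
    name="ExampleMachine",
    options={
        "chassis": {
            "compact": Module("compact chassis", {"CHS-001": 1, "WHL-010": 4}),
            "wide": Module("wide chassis", {"CHS-002": 1, "WHL-010": 6}),
        },
        "drive": {
            "standard": Module("standard drive", {"MTR-100": 1, "AXL-020": 2}),
            "heavy": Module("heavy-duty drive", {"MTR-200": 1, "AXL-021": 2}),
        },
    },
)

# Sales configures the product; engineering no longer designs it from scratch.
print(family.configure({"chassis": "wide", "drive": "standard"}))
# -> {'CHS-002': 1, 'WHL-010': 6, 'MTR-100': 1, 'AXL-020': 2}

In a sketch like this, the engineering effort goes into defining the modules and their allowed combinations once; a customer-specific order then becomes a configuration choice instead of a new engineering project.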

Customer-project-related engineering time (typical ETO), which in the beginning was their core activity, came down to around 30 % of the time. More time could be spent on developing new machines in a modular way. With almost the same number of engineers, the turnover of the company more than doubled. A win-win environment, which also makes the end-users excited.

Still, the ERP backend at Meyn remained almost the same as in the time they were working in the 2D environment. And the most interesting conclusion at the end of the presentation was that they are still using the same slide with the vision, and they can explain why each step was taken and justify it by measurable benefits.

And this brings me to the answer to the question

“Do you know any specific examples of what some companies have done to get their users ready, excited or more committed to the new PLM system”?

  • The management needs to have a clear vision of where they want to be as a company in the future. This is not an IT-vision but a business vision, which explains why changes are needed. This vision should be clear to the employees. Communicate!
  • Where possible, provide metrics!
  • Do not talk about a PLM system; it can also be in other tools. Talk about improvement steps in the business processes contributing to the vision. The PLM system is the information backbone, not the front-end. Management and implementers should talk business functionality, not IT functions and features. Do not talk in applications!
  • Build step-by-step user scenarios with a focus on methodology and user understanding. Implementations with a function-feature focus are hard for users to accept. Talk business!
  • The management should present their vision again and again, supported by metrics on what has been accomplished and what has been learned for the future – repeat!

Conclusion

There are thousands of mid-market companies that have a vision to improve their business. The PLM system should never be the topic of discussion with the end users; it is the change in working methods that is important, supported by various systems – CAD/ERP/CRM – and almost invisible …….. PLM

The company Meyn is an example of this approach. Simon and Hessel work for Meyn as engineers improving their company’s business. Unfortunately, it is not their business to explain all around the world how PLM supports business change in a mid-market company. I was glad to attend their session last week.

In 2008 and 2009 several analysts predicted that the mid-market was now ready for PLM and that most of the PLM vendors were building a targeted offering for the mid-market. I was, and still am, a believer that mid-market companies will benefit from PLM in case ………… they implement it.
When you review my observations in my blog from the past two years, this apparently does not happen. Therefore, in the past months I have been analyzing posts and discussions around the ‘old’ and ‘new’ PLM, talking with representatives from various PLM and PDM vendors, and, last but not least, analyzing the implementation process of a PLM system in companies where I could get these insights.

This all led to this post, perhaps too big for a blog, too small for a report.

First the definitions

Before giving my opinion, first my definitions of PLM and mid-market (as everyone has their own definition):

PLM means for me the management of all product-related data and processes, from the initial concept phase through planning, development, production planning and after sales/service. When talking about PLM, I always have a circular process in mind. Experiences from products in the market are again inputs for new product development. Instead of a linear process where every department manages its own data, the challenge is that every discipline contributes and collaborates around the product data. This implies that a PLM implementation always requires a business change process for a customer.

Mid-market companies are, for me, those companies where no strategic layer is available, plus a minimized investment in IT-resources. This leads to organizations where most changes happen inside departments and cross-departmental changes are hard to implement. The IT-department might be a facilitator here, but usually IT people focus on architecture and infrastructure instead of business change. This implies that a PLM change should come from external people.

 

And who is doing PLM?

On the enterprise level, there is a battle between the big three (Dassault Systèmes, Siemens and PTC), and they are challenged mostly by the two big ERP vendors (SAP and Oracle) and, on the PLM front, by Aras, competing through its Open Source model. Of course there are many other vendors; these observations come from the area where I am active.

There are various ways to group these PLM vendors; one is from the CAD engine point of view: DS-CATIA / Siemens-NX / PTC-Pro/E. Although all claim to support a multi-CAD environment, the main focus in these companies is on the PLM integration with their primary CAD engine.

Where in the past CAD-independent PDM systems existed (Metaphase, MatrixOne), they could only survive in the major PLM industries by being integrated with CAD tools, and they were acquired for that reason. It will be interesting to see if Aras can play a major role in the PLM-only domain, where others failed in the past due to a lack of integration capabilities.

SAP and Oracle took a different path; they have understood that PLM cannot be neglected in an enterprise, so they need to address it. SAP did this by developing a PLM module as a logical extension of their infrastructure. Oracle has chosen to add PLM to their portfolio through the acquisition of two different PLM vendors. Where SAP does not have the challenge of explaining a fully integrated story to customers, Oracle has to spend more time on marketing to make it look like a single platform, which will come in the future. The big question, however, for both companies: do they really understand PLM? Is it in their veins and core strategy, or does it remain an extension to gain market share, especially as they have no connections to the design world? (Try to find PLM on their corporate websites.)

Interesting to see how Aras will evolve. In their business model, the initial purchase of software is not needed, but once working with Aras you also pay for maintenance, like with other PLM vendors. Their advantage is that switching from an existing legacy PLM vendor is less painful, as there are no initial software costs, which can be huge for an enterprise. I believe they have a good chance to succeed in industries where there is less of a dependency on the CAD engine.

So on the enterprise level the need for PLM is justified. Resources exist and are budgeted both at the customer level and at the supplier level. The PLM suppliers are either the PLM vendors themselves with service teams, or big, global service providers specialized in implementing the PLM software. They can do strategic PLM projects and support the required business change.

So why does it look like a mission impossible in the mid-market?

The big enterprise vendors (PLM/ERP) believe that you can just strip down your enterprise software into a kind of prepackaged mode – PLM Out of the Box is a commonly heard expression. Also, the analysts praise in their reports the mid-market approach of some of these vendors.

But do they really address the mid-market or only the high-end of the mid-market? Again, it is all about the definition of where the mid-market is, and in this post I stay with my definition of the mid-market.

There are two main characteristics of this mid-market:

  1. Sales and implementation of software are done through Value Added Resellers and not through the vendors or big service companies. The software revenue per customer does not justify the high expenses for global consultants, with additional high travel costs (and sometimes the local language issue). The local VAR is supposed to be the point of contact.
  2. Mid-market companies do not change their main company processes. Depending on the type of core process, let’s assume ETO or BTO, they have sales and engineering working closely together on product/solution definition, and they have manufacturing planning and production working closely together on product/solution delivery. In terms of functionality: a PDM focus for sales/engineering and an ERP focus for manufacturing.

A mid-market company can be characterized as a two-pillar company:

Who is successful in the mid-market?

There are two software vendors, touching our PLM prospects, that really understand the mid-market: Autodesk and Microsoft.

Autodesk has a huge range of products, and when we focus on the area of manufacturing, Autodesk does not talk about PLM. And I believe that is for several reasons.

Autodesk has never been a front-runner in making new technology and concepts available for the mainstream. They are more a company providing functionality for mainstream concepts, as compared to a company pushing new concepts and technology for premium pricing.
And this is what their customers like, as they also do not have internal strategic resources to push the company in new directions, and surely no one wants to take the risk.

Thus risk avoidance and understandable concepts are key targets for mid-market companies.
Autodesk tries to avoid reaching beyond their engineering domain; the maximum they cover is presented in their Digital Prototyping solution. With their Vault product range they stay close to PDM, but they do not go into the concepts of PLM, like mBOM handling. PLM is not established enough in the mid-market, so it is a no-go area for Autodesk.

Microsoft addresses the mid-market more from the IT-infrastructure side. Slowly, SharePoint has reached the status of an infrastructure component for content management – so why not for all the engineering data? SharePoint is the most relevant component related to PDM or PLM in my review, and what I observed here is that the IT-manager is often the person who supports and enables a cross-departmental implementation of SharePoint. So it is not pushed from a strategic business level but from a strategic IT-architecture approach.

PLM providers and implementers jumped on this opening in the mid-market by providing PLM capabilities on top of SharePoint, to get their software used in the mid-market. It does not mean they do PLM; it means they expand the visibility of engineering data across the organization. Microsoft apparently does not want to enter the area of managing CAD or engineering data. You mainly see investments in the Microsoft Dynamics software, where ERP and CRM are targeted. Again, PLM is not established enough in the mid-market to provide common functionality, so it is a no-go area for Microsoft.

And the impact of an indirect sales channel…

CAD VARs are the next challenge for PLM in the mid-market. The PLM vendors who work with VARs expect these VARs to be an extension of their sales organization. And sales here means selling software. PLM, however, also means selling services, and I learned the hard way in my past that companies selling products and services within the same group of people are in constant internal conflict about how to balance software and service budgets.

Selling and implementing PLM software is also difficult in mid-market companies, as these companies buy software because they want to solve a pain in one of their departments. It is not common for them to have a holistic approach. So VARs trying to sell PLM are engineering-centric – often with their roots in CAD selling. And as their nature comes from product selling, they feel comfortable selling data management and PDM, as this remains close to product features and is easy to justify. PLM requires different people, who can guide a business change across departments at the customer.

It is very rare for VARs to have these skilled people in place, due to lack of scale. You need to act locally to be cost-efficient and close to your customer. As a VAR only has visibility of a limited group of implementations, the consultancy practices are often not based on global experience and best practices, but on their own best practices, sometimes bringing their own ‘magic’ to be even more different than required, to differentiate from other VARs.

The companies implementing PLM for enterprises can afford to share global knowledge; VARs need to build up the knowledge locally, which leads to an extreme dependency on the person who is available. And to be affordable on the payroll of a VAR, the consultant is often an experienced application engineer, who knows how to satisfy his customer by providing services on top of the product.

And as PLM is not established enough in the mid-market, they will not invest in and push for PLM, which requires a long-term experience build-up – so it is almost a no-go area for VARs.

So no PLM in the mid-market?

I believe real PLM in my mid-market will be a rarity, based on a lucky coincidence of the right people, the right company and the right product at a certain time. It will not become a mainstream solution in the mid-market, as there is the design world and the ERP world.

PLM SaaS (Software As A Service) delivered by Arena or PLMplus will not bring the solution for the mid-market either. You might remove the IT-complexity, but you are missing the resources (internal and external) for business change – who will be there to initiate and guide the change? PLM SaaS will probably be implemented as a PDM environment.

I give more credit to Social PLM (Facebook-like collaboration, Google Wave). This approach might bypass the classical way of working in companies and lead to new concepts, which probably will not be tagged PLM – will the new trigram be SPC (Social Product Collaboration)?

Still, it will not happen fast, I believe. It requires a change of management in mid-market companies. Most of the managers are representatives of the older generation, not wanting to take the risk of jumping on a new hype they haven’t familiarized themselves with yet.

 

Conclusion: PLM in the mid-market seems like a mission impossible, and although PLM concepts are valuable for the mid-market, as analysts report, the typical mid-market characteristics block PLM from becoming a common practice there.

I am looking forward to learning from your comments.

This post is a reply to a post from YML, with whom I have worked in the past. At that time we had interesting discussions on various topics around PLM, and I am happy to continue this discussion in blog space. Please read his post in order to understand the full reasoning below.

First I want to make a statement to avoid a misconception. I am a PLM evangelist, and perhaps my definition of PLM is wider than what PLM vendors currently offer. For me, PLM focuses not only on storing and managing the product data (PDM), but also on the whole process of how new or improved products are created, designed, produced and supported.
From the Dassault Systèmes and Autodesk (read Jim Brown’s comments on them) perspective, there is a lot of focus on the collaboration around the virtual product. However, for me personally, when working with mid-market customers, I mainly focus on capturing design knowledge and IP, plus creating visibility of knowledge inside a company, without being dependent on knowledge stored in people’s brains.

An interesting development I have observed recently in that area is www.vuuch.com, an initiative to empower design discussions.

Now back to the reply on YML’s post:

So when Yann writes:

However I disagree with the statement that you use as foundation : Software vendor are proposing excellent product with a good ROI and SMB customer don’t understand it because they do not have a vision.

I must say: read my statement above – there is still work to be done. As for the lack of vision in mid-market companies, I will provide an update based on some experiences from the past few weeks, where a lack of vision is blocking process improvements.

Next, Yann mentions all the proprietary formats of all vendors – PLM vendors, vendors of authoring tools (CAD, content, …) – and even limited version support. Yann makes a case for open-source solutions, which are part of the Web 2.0 evolution. Interesting to see that at the same time Kurt Chen wrote an interesting post on What Can PLM Offer for SMBs? in the same context.

My main comment on this topic is that I understand the beauty of open source; however, I also believe that if you want to work with open-source solutions, you either need a 100 % clear concept of what the product should do (that is why Linux is successful – I believe PLM is not there yet), or you need companies with strong IT-knowledge/support to adapt the software to their needs. However, this contradicts the fact that mid-market companies usually do not have the resources to invest in this kind of activity. So what would they do? Hire consultancy firms or software companies (sometimes the original developer of the open-source software) to adapt the software to their needs. This creates almost the same dependency as customers would have with traditional PLM vendors – they rely on their software provider as the resource to drive PLM.

Then the question comes up:

Who would I trust to assist my mid-market company to evolve towards PLM?

A company developing software, or a company that has experience in my industry and perhaps does not (yet) deliver the best-in-class product? I have met a company that decided to discontinue their PLM software as the provider only brought programmers into the game; they tried to solve requests from the users, and in the end – after 1.5 years of programming – the system had become very complex while crucial details were still missing. An industry-knowledgeable person with PLM knowledge would approach it differently – first focusing on the process and then analyzing where automation would bring benefits.

Also, Yann mentions:

It is interesting to note that nobody is blaming Ford, GM, … of not being able to see that they have good chance to go bankrupt in some month from now. It is interesting that many people blame now these companies of not being able re-invent/adapt their products to their market. when all of them where using PLM systems and had huge PLM projects on going

and in addition:

In order to develop this agility SMB need to put very high in the list of capabilities for their information system the following features : Agility, re-configuration, continuous evolution / transformation, openness, ease of integration with unknown system, overall strategy of the PLM vendor

Here I disagree with the first quote. Ford and GM have no real PLM implementations; they built a dinosaur type of implementation with a focus on product development – yes, provided by PLM vendors, but so rigidly implemented that they lost the capability to stay connected to the market.

And I fully agree with the second quote – nothing to add. PLM should be implemented in such a way that it does not restrict a company in its flexibility – as innovation does not come from doing a process more efficiently; it comes from doing things differently.

So to conclude for today:

  • Yes, current PLM vendors are not perfect and there is a challenge to reach the mid-market
  • Open Source solutions only make sense if combined with industry knowledge
  • Agility, re-configuration, continuous evolution / transformation, openness, ease of integration with unknown systems should be the overall strategy of the PLM vendor (not only mid-market)

Thanks Yann, and enjoy your fishing – take notice of what could happen:
