
In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting out-of-the-box (OOTB) use as much as possible, which somehow drives toward a certain rigidness. The other approach requires the PLM capabilities to be developed on top of a customizable infrastructure, providing more flexibility. I believe this topic has been debated for more than 15 years without a decisive conclusion. Therefore I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly Aerospace/Defense) or home-grown BOM systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people as an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. A lot of tuition fees were paid to achieve results. Many of these environments are still operational, as they have become too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000 came the first development of OOTB PLM. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industries. Instead of document management, they focused on the scenario of bringing the BOM from engineering to manufacturing, based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers, whose major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills sit with the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers the other implementers as competitors, and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same happens currently in the Aras channel – it might be called Open Source; however, probably it is only a high-level infrastructure.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement if you wanted.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say No to a customer if you can say Yes?

OOTB and modules

Initially, for the leading PLM vendors, their mid-market templates were not just aiming at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant the PLM vendors had to package their functionality into modules: sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations), sometimes generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they needed to deliver data model capabilities combined with business logic/behavior; otherwise, the module would not be relevant. And this causes a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management: people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed) or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary to each other, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, they are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extendibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed in the future, for example, digital twins, AR/VR, and model-based ways of working. Some skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial-and-error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages become advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on the experiences of a few leading customers (the big ones), whereas the majority of usage in the field comes from smaller companies where people have multiple roles – the typical SMB approach. SMB implementations are often not visible at the PLM vendor's R&D level, as they are hidden behind the Value Added Reseller network and/or usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you run the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and, therefore, a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on the characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not immediately provide an innovative way of working. The toolkit may have the infrastructure to deliver innovative capabilities, even as small demonstrations, but the implementation and the methodology to realize this innovative way of working need to come from either your company’s resources or your implementer’s skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don’t share your IP and best practices if you want to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into situations where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.

Experience: when you plan to implement PLM “big” with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years if design decisions were not documented or analyzed. Having experience, or an experienced partner/coach, can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as implementing PLM is not their core business. Experienced PLM implementers vary in size and skills – make the right choice.

 

Conclusion

After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The deciding factors are the people involved and your business model. It needs to be an end-to-end coherent approach, no matter which option you choose.


In the past two years, I have been heavily involved in PLM Proofs of Concept, sitting at both sides of the table: supporting companies in their PLM selection, supporting vendors in explaining their value to the customer, and supporting implementers by assisting them with industry knowledge – all in the context of a PLM selection process.

The Proof of Concept is crucial in a PLM selection process, as it is the moment where the first glimpse of reality comes to the table.

Companies of different sizes and different consultants all have a different view on the importance of the Proof of Concept. Let me share my thoughts with you after a quick recap of the PLM selection process.

The PLM selection process

1. Build a vision

It is important that a company understands what it wants to achieve in the next five to ten years before starting a PLM selection process. Implementing PLM means a business transformation, even if you are a small company. If the management does not understand that a vision is required, there is a risk ahead, as PLM without a change in the way people work will not deliver the expected results.

2. Issue an RFI to potential candidates

Once you have a PLM vision, it is time to get in touch with potential suppliers. The RFI (Request for Information) phase is where you can educate yourself better by challenging the suppliers to work with you on future solutions.

3. Discuss with selected candidates

From the RFI responses, you understand which companies are attractive because they match your vision, your budget or your industry. Have a first interaction with the selected companies and let them demo their standard environment targeted to your vision.

4. POC

In this stage, you check with the preferred companies their ability to deliver and your ability to work together. The POC phase should give you an understanding of the scope of the upcoming PLM project and help you understand by whom and how the project can be executed. More details about this step below.

5. RFP

Although some companies start with an RFP before the POC, for me it makes the most sense to verify the details after you have a proper understanding of the To-Be solution. The RFP is often the basis for the contractual scope and should therefore be as accurate as possible.

In the past, I wrote in more detail about the PLM selection process in two posts: PLM selection: Don’t do this and PLM selection: Do this. Have a read if you want to understand this part in more depth. Now let’s focus on the POC.

POC targets

  • As described before, the target of the Proof of Concept should be to get a better understanding of the potential To-Be processes and to obtain an impression of the capabilities of the implementer and the preferred PLM software.

The result should be that you have more realistic expectations of what can be achieved and the challenges your company will face.

  • From there, you can evaluate the risks, address them and build an achievable roadmap to implement. It is important that the focus is not just on the cost of the implementation.
  • To sell PLM inside your company, you need to realign with the vision and explain, to all people involved, the value of “Why PLM”.

Explaining the value is complex, as not everyone needs the same message. The management will focus on business benefits, whereas users will focus on how it impacts their daily life. If you forget to explain the value, the PLM project is considered again as just another software purchase.

POC DO’s

Make sure the Proof of Concept is driven by validating future business scenarios, focusing on the To-Be solution. The high-level scenarios should be demonstrated and explained to the business people. In this stage, it is important that people realize the benefits and the value of the new processes.

The POC is also an internal sales event. The goal should be to get more enthusiastic and supportive business people in your company for the upcoming PLM project. Identify the champions you will need to lean on during the implementation.

Test the implementer. In my opinion, the success of a PLM implementation depends critically on the implementation team, not on the software. Therefore, the POC phase is the best moment to learn whether you can work with the implementer. Do they know your business? Do they have experience with your business? The more you are aligned, the higher the chance you will be successful as a team.

Show commitment to engage. I have often seen POC engagements where the company demanded a Proof of Concept from the implementer or vendor for free. This creates an unbalanced situation, as the vendor or implementer cannot invest time and resources in the process as expected without any commitment from the company. By paying a certain fee for the POC, a company demonstrates to the implementer/vendor that the POC is valuable, and it can then request the same commitment from them.

POC DON’Ts

The Proof of Concept is not a detailed function/feature check to identify each mouse-click or option in the system. During the implementation, these details might come up. It is important in a Proof of Concept to understand the big picture and not get lost in the details. As human beings, we tend to focus on what does not work, not realizing that probably eighty to ninety percent works according to the needs.

Do not expect the ultimate To-Be scenario to be demonstrated during the Proof of Concept. The Proof of Concept is a learning stage for both the company and the implementer to imagine the best possible scenario. PLM systems are generic, and likely they will not provide configuration and functionality matching your environment out of the box. At this stage, validate whether the primary capabilities are there and where the gaps are.

Do not run a POC with a vendor (only). This might be one of the most critical points for a POC. A PLM software vendor’s target is to sell their software, and for that reason they often have dedicated presales teams that will show you everything in a smooth manner, overwhelming you with all the beauty of the software. However, after the POC, this team is gone, and you will have to align yourself again with the implementation partner, trying to match your business needs with their understanding.

Realize – you get what you ask for. This is more a Do-and-Don’t message packed together. A Proof of Concept phase is the point where companies get to know each other. If you are not focused, do not expect the implementer/vendor to be committed. A PLM implementation is not a product; it is a business transformation supported by products and services. Do not treat PLM implementers and vendors the same way your customers treat you (in case you deliver products).

Conclusion

There are still many more thoughts about the Proof of Concept. Ideally, you run two POCs in parallel, either with two implementers of the preferred software (if possible) or with two different implementers representing different software.

Ideally, that is – I know it is a challenge, especially for small and medium-sized businesses, where people are running hard to keep the business going.

Still, remember: PLM is a business transformation, targeting to improve your business in the upcoming five to ten years and to avoid running out of business.

Your thoughts?

As a bonus, a short anecdote that I posted in 2010, still relevant:


Some time ago, a Christian PLM sales professional died (let’s call him Jack), and according to his belief, he faced Saint Peter at the gates of Heaven and Hell.
Saint Peter greeted Jack and said: “Jack, with your PLM sales you have done good and bad things to the world. For that reason, I cannot decide if you should go to Heaven or to Hell. Therefore, I allow you to make the choice yourself.”

Jack replied: “But Saint Peter, how can I make such an important decision for the rest of my eternal life? It is too difficult!”

Saint Peter replied: “No problem Jack, take a look at Heaven and Hell, take your time and then tell me your decision.”

Jack entered Heaven, and he was surprised by the quietness and the green atmosphere there. Angels were singing, people were eating from golden plates with the best food ever, people were reading poetry, and everything was as peaceful as you could imagine. In the distance, he could see God surrounded by some prophets talking about the long-term future. After some time, Jack had seen enough and went to Hell to have a look there.

And when he opened the gates of Hell, he was astonished. Everywhere he looked, there were people partying and having fun. It reminded him of those sales kick-offs he had in the past, in exotic places with lots of fun. In the distance, he could see the Devil as a DJ playing the latest dance music – or was it DJ Tiësto?

Jack did not hesitate and ran back to Saint Peter, no time to lose. “Saint Peter,” he said, “I want to go to Hell, no doubt. A pity I did not know it before.”

“So be it,” said Saint Peter, “go for it.”

And then, once Jack entered Hell, it was suddenly all fire around him; people were screaming in pain and suffering, and Jack too felt the first flames.

“Devil!!” he screamed, “what happened to what I have seen before?”

With a sarcastic voice, the devil replied: “That? That was a proof of concept.”
