
In my previous post, The PLM Blame Game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting out-of-the-box (OOTB) use of the system as much as possible, which somehow drives toward a certain rigidness. The other approach develops the PLM capabilities on top of a customizable infrastructure, providing more flexibility. I believe this debate has been going on for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly Aerospace/Defense) or home-grown BOM systems (Automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone; PLM originated from engineering and is still considered by many people an engineering solution. For connectivity, interfaces and integrations had to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. A lot of tuition fees were paid to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, the first OOTB PLM systems were developed. There was Agile (later acquired by Oracle) focusing on the high-tech and medical industries. Instead of document management, they focused on the scenario of bringing the BOM from engineering to manufacturing, based on a relatively fixed scenario, and were therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduce the implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, being based on Microsoft technology and wizards for data modeling and UI design.

This was not a benefit for OOTB-delivery as SmarTeam was implemented through Value Added Resellers, and their major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling points towards a customer. A similar situation is now happening with Aras – the primary implementation skills are at the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers another implementer as a competitor and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same currently happens in the Aras channel: it might be called Open Source; however, probably only the high-level infrastructure is.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express, a preconfigured solution that was rapid to implement if you wanted it.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: first, the customer most of the time had different current practices, and second, the money came from services. So why say no to a customer if you can say yes?

OOTB and modules

Initially, for the leading PLM vendors, their mid-market templates were not aimed only at the mid-market. All companies wanted to have a standardized PLM system with as few customizations as possible. This meant the PLM vendors had to package their functionality into modules, sometimes addressing industry-specific capabilities, sometimes areas of interfaces (CAD and ERP integrations) as a module, or generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they needed to deliver data model capabilities combined with business logic/behavior. Otherwise, the value of the module would not be relevant. And this causes a challenge. The more business logic a module delivers, the more the company that implements the module needs to adapt to more generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed) or by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, when it comes to reality, they are separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extendibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and future needed capabilities, for example, digital twins, AR/VR, and model-based ways of working. Skeptical people might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages will be advantages in the toolkit approach and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Often modules are developed based on some leading customer experiences (the big ones), whereas the majority of usage in the field is at smaller companies where people have multiple roles, the typical SMB approach. SMB implementations are often not visible at the PLM vendor R&D level, as they are hidden behind the Value Added Reseller network and/or are usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need, for example, in modern PLM, creating a digital thread, and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you have the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and users will have a better relationship in delivering capabilities and therefore a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford reputedly said: "If I had asked my customers what they wanted, they would have asked for faster horses."

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not provide an innovative way of working immediately. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations, but the implementation and methodology for this innovative way of working need to come from either your company's resources or your implementer's skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others. Don't share your IP and best practices, to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into a situation where the customer blamed SmarTeam because customizations were possible. You can read about this example in an old LinkedIn post: the importance of a PLM data model.
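To make the data modeling point a bit more concrete, here is a minimal sketch of one structural choice every toolkit implementer faces: storing a multi-level BOM as plain parent-child rows and exploding it by recursion. The part names and quantities are invented for illustration; this does not represent SmarTeam's (or any vendor's) actual schema or API.

```python
from collections import defaultdict

# Hypothetical parent-child BOM rows: (parent part, child part, quantity).
# A real PLM data model is far richer; this only illustrates the
# structural choice of storing the BOM as an adjacency list.
BOM_ROWS = [
    ("bike", "frame", 1),
    ("bike", "wheel", 2),
    ("wheel", "rim", 1),
    ("wheel", "spoke", 32),
]

def explode(rows, root):
    """Return the total quantity per leaf part for one unit of `root`."""
    children = defaultdict(list)
    for parent, child, qty in rows:
        children[parent].append((child, qty))

    totals = defaultdict(int)

    def walk(part, multiplier):
        if part not in children:          # leaf part: accumulate quantity
            totals[part] += multiplier
            return
        for child, qty in children[part]:
            walk(child, multiplier * qty)  # multiply quantities down the levels

    walk(root, 1)
    return dict(totals)

print(explode(BOM_ROWS, "bike"))
# {'frame': 1, 'rim': 2, 'spoke': 64}
```

Traversing the structure in memory is trivial at this size, but an implementation that fires one database query per level of a deep assembly sees its performance collapse at enterprise scale. That is exactly the kind of early design decision I mean.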

Experience: when you plan to implement PLM "big" with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years when design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have internally experienced PLM implementers, as it is not their core business to implement PLM. Experienced PLM implementers vary in size and skills; make the right choice.


Conclusion

After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The deciding points are the people involved and your business model. It needs to be an end-to-end coherent approach, no matter which option you choose.


This week I was happy to participate in the PLM INNOVATION 2011 conference in London. It was an energizer, which, compared to some other PLM conferences, makes the difference. The key to its success, in my opinion, was that there was no vendor dominance, and that participants were mainly discussing their PLM implementation experiences, not products.

Additionally, as each of the sessions was approximately 30 minutes long, it forced the speakers to focus on their main highlights instead of going into details. Between the sessions, there was significant time to network or to set up prescheduled meetings with other participants. This formula made it an energizing event for me, as every half hour you moved on to the next experience.

In parallel, I enjoyed and experienced the power of modern media. Led by Oleg, a kind of parallel conference took place on Twitter around the hashtag #plminnovation2011. There I met and communicated with people at the conference (and outside) and felt sorry I was not equipped with all the modern media (iPhone/iPad-type equipment) to interact more intensively during these days.

Now some short comments/interpretations on the sessions I was able to attend:

Peter Bilello, president of Cimdata, opened the conference in the way we are used to from Cimdata, explaining the areas and values of PLM, the statistics around markets, major vendors and positive trends for the near future. Interesting was the discussion around the positioning of PLM and ERP functionality and the coverage of these functionalities between PLM and ERP vendors.

Jean-Yves Mondon, EADS' head of PLM Harmonization (Phenix program), illustrated with extracts from an interview with their CEO Louis Gallois how EADS relies on PLM as critical for their business and wants to set standards for PLM in order to have the most efficient interoperability of tools and processes coming from multiple vendors.

Due to my own session and some one-to-one sessions, I missed a few parallel sessions in the morning and attended Oleg Shilovitsky's session around the future of engineering software. Oleg discussed several trends, and one of the trends I also see as imminent is the fact that the PLM world is changing from databases towards networks. It is not about capturing all data inside one single system, but about being able to find the right information through a network of information carriers.

This also fits very well with the new generation of workers (Generation Y), who learned to live in this type of environment and collect information through their social networks.

The panel discussion, with three questions for the panelists, could have been a little better had the panelists had time to prepare some answers, although some of the improvisations were good. I guess the audience chose Graham McCall's response to the question "What will be the next biggest disappointment?" as the best. He mentioned the next 'big world-changing' product launch from a PLM vendor.

Then I followed the afternoon session from Infor, called Intelligent PLM for Manufacturing. The problem I had with this session (and I have this often with vendor sessions) was that Venkat Rajaj did exactly what most vendors do wrong. They create their own niche definition, Product Lifecycle Intelligence (is there no intelligence in PLM?), claim to be the third software company (where are they on Cimdata's charts?), and then give a lot of details on product functions and features. Although the presentation was smooth and well presented, the content did not stick.

A delight that day was the session from Dr. Harminder Singh, associate fellow at Warwick Business School, about managing the cultural change of PLM. Harminder does not come from the world of software or PLM, and his outsider perspective created a particular atmosphere for those in the audience who consider cultural change an important part of PLM. Here we had a session inspired by a theme, not by a product or concept. I was happy to have a longer discussion with Harminder that day, as I also believe PLM has to do with culture change; it is not only technology and management push, as we would say. Looking forward to following up here.

The next day we started with an excellent session from Nick Sale from Tata Technologies. Besides a Nano in the lobby of the conference, he presented all the innovation and rationalization related to the Nano car, and one of his messages was that we should not underestimate the power of innovation coming from India. An excellent sponsor presentation, as the focus was on the content.

In the parallel track, I was impressed by how Philips Healthcare implemented their PLM architecture with three layers. Gert-Jan Laurenssen explained they have an authoring layer, where they do global collaboration within one discipline; a PDM layer, where they manage the interdisciplinary collaboration, which in the case of Healthcare is of course a mix of mechanical, electrical and software; and above these two layers, they connect to the layer of transactional systems that need the product definition data. Their implementation speed was impressive, for sure due to some of the guidelines Gert-Jan gave; see Oleg's picture of his slide. Unfortunately, I did not have the time for a deeper discussion with Gert-Jan, as I am curious about the culture change and the amount of resources they have in this project. An interesting observation was that the project was driven by IT managers and engineering managers, confirming the trend that PLM more and more becomes business-focused instead of IT-focused.

Peter Thorne from Cambashi brought, in his session called Trends and Maximizing PLM Investments, an interesting visual historical review of engineering software investments, using Google Earth as the presentation layer. It was impressive to see the trends visualized this way, and scary to see that Europe is not really a major area of investment and growth.

Keith Connolly explained in his session how S&C Electric integrated their PLM environment with ERP. Everything sounded so easy and rational, but as I have known the guys from S&C for a long time, I know it is the result of having a clear vision and working for many years towards implementing this vision.

Leon Lauritsen from Minerva gave a presentation around Open Source PLM, and he did an excellent job explaining where Open Source PLM could or should become attractive. Unfortunately, his presentation quickly went in the direction of Open Source PLM equals Aras, and he continued with a demo of Aras capabilities. I would have preferred a longer presentation around the Open Source PLM business model instead of spending time looking at a product.

I believe Aras has huge potential, for sure in the mid-market and perhaps beyond, but I keep coming back to the experience I also had with SmarTeam: an open and easy-to-install PLM system with a lot of features is a risk in the hands of IT people with no focus on business. Without proper vision and guidance (coming from ????? ) it will again become an IT project, cheaper to the outside world (as internal investments are often not so clear), but achieving the real PLM goals depends on how you implement.

After lunch we really reached the speed of light with David Widgren, who gave us insight into data management at CERN. Their challenge, somehow a single 'product' (the accelerators and all their equipment) plus a long lifecycle (20 years of development before becoming operational), surviving all technologies and data formats, requires them to think all the time about pragmatic data storage and migration. In parallel, as the consumers of the data are not familiar with the complexity of IT systems, they build lots of specific interfaces for specific roles to provide the relevant information in a single environment. Knowing a lot of European funds go there, David is a good ambassador for CERN, explaining in a comic manner that he is working at the coolest place on Earth.

The last session I could attend was Roger Tempest's, around data management. Roger is a co-founder of the PLMIG (PLM Interest Group), and they strive for openness, standards and interoperability for PLM systems. I was disappointed by this session, as I was not able to connect to the content. Roger was presenting his axioms, it seemed; I had the feeling he would come down from the stage with his ten commandments. I would be interested to understand where these definitions came from. Is it a common understanding, or is it just another set of definitions coming from another direction, and what is the value or message for existing customers using particular PLM software?

I missed the closing keynote session from John Unsworth from Bentley. I learned later this was also an interesting session, but I cannot comment on it.

My conclusion:
An inspiring event, both due to its organization and agenda and thanks to the attendees, who made it a real PLM-centric event. I cannot wait for 2012.

The last weeks have been busy, and I have seen various PLM candidates all around Europe. As these companies were mid-market companies, I noticed again how difficult it is for such companies to follow the ideal path towards PLM.

Those reading my blog frequently might remember my definitions of mid-market and PLM. For newer readers, I will give my definitions again, as everyone has their own definition.

Mid-market company: for me, the definition of a mid-market company does not have to do with revenue or the number of people working for the company. I characterize a mid-market company as a company where everyone has a focus on the company's primary process. There is no strategic layer of people who are analyzing the current business and defining new strategies for the future. In addition, the IT staff is minimal, seen more as overhead than as strategic. Mid-market companies have their strength in being flexible and reacting fast to changes, which might conflict with a long-term strategic approach.

Because what happens if you are only in a reactive mode? It can be too late.

(the boiling frog)

PLM: for me, PLM is not a product but a vision or business approach based on a collection of best practices (per industry). The main characteristics of PLM are centralizing all product knowledge (IP) throughout all the lifecycle stages, a focus on best practices, and immediate visibility into all lifecycle stages, combining concept, planning, development, production planning and after-sales/service into one integrated process. It is more than concurrent engineering; it is about sharing data and ownership of data across different departments. And this means business transformation, breaking through traditional barriers. Of course, PLM vendors have slightly different definitions in order to differentiate themselves from other vendors, for example, more focus on a virtual product definition (CAD PLM vendors) or a focus on efficiency and one single platform (ERP PLM vendors).

Who will initiate this change?

And these two definitions already raise the questions I want to reflect on here, as I experienced again in two recent visits that the pain of moving to PLM is here.

First, what is the result of a reactive mode, even when it is a quick reaction?

A reactive mode leads to a situation where a company will never be able to differentiate rapidly from its competition. As every change takes time to implement, it is logical that a real business change will not be implemented as a quick reaction. The company needs to have a long-term vision. And this is one of the things I noticed talking with mid-market companies. Ask these questions: "Where do you want to be five years from now?" and "How do you make sure you achieve these goals (if goals exist)?" and often you will find the company depends on the business instinct of the founder(s) and has no real answers for the long-term future.

This is of course a result of the typical mid-market setup: they have no internal people who will step outside the daily hectic and work on a change. And being reactive always means you are (a little) behind. This was the situation in one of the companies I met recently. There was an initial understanding of the value that PLM could bring, but when talking about some of the basic principles of PLM, the answer was: in our company, ERP is God. This means real PLM has no chance; you do not want to fight against God.


And now the discussion: who can initiate the change towards PLM?

Now another example, of a mid-market company that had a long-term PLM vision but got trapped in its own approach. The company has been growing fast and, like many European companies, production is done in China. This causes collaboration issues around communication and quality between Europe and China, as the company only knows CAD data management and ERP. The engineering manager was assigned to solve these issues. He did not get a full strategic assignment to look at the complete picture; the management pushed him to solve the current pains, keeping the PLM wishes in mind.

And solving the current pains leads again to function/feature comparison with a short-term justification, believing that in the future everything will fit the PLM vision, as the potential resellers for the new solution said: "Yes, we can." Have you ever heard a reseller say "No, we cannot"?

The result: the engineering manager has to make a decision based on the 'blue eyes' of the reseller, as he does not get the mandate and power from his management to analyze and decide on a PLM strategy for the long term. For one of the resellers, talking about the details of PLM was even a disadvantage, as it creates the impression that PLM is complex. It is easier to sell a dream. A similar situation to the one I described in my post: Who decides for PLM in a mid-market company.

My conclusion

Although I am aware that many mid-market companies implement the basics of PLM, it is frustrating to see that a lack of priority and understanding among the management of mid-market companies blocks the growth towards the full benefits of PLM. The management is not to blame, as most PLM messages come either from the high-end PLM vendors or from product resellers, neither packaged for the mid-market. See PLM for the mid-market – a mission impossible?

PLM is a cross-departmental solution, and the management should look for partners who can explain the business values and share best practices for mid-market companies in a business-wise manner.
The partner is 50% of the success of a PLM implementation.

Do you recognize similar situations? How would you address them?


My PLM blog cloud based on Wordie – see the virtualdutchman blog cloud

The title of this post came to mind when looking back on some of the activities I was involved in during the past two weeks. I was discussing with several customers the progress or status of their current PLM integration. One of the trends was that, despite the IT department doing its best to provide a good infrastructure for project- or product-related information, the users always found a reason why they could not use the system.

I believe the biggest challenge for every organization implementing PDM and later PLM is to get all users aligned to store their information in a central location and to share it with others. Only in this manner can a company achieve the goal of having a single version of the truth.

By single version of the truth I mean: if I look in the PLM system, I find all the data needed to explain the exact status of a product or a project.
If it is not in the PLM system, it does not exist!

How many companies can make that statement?

If your company has not implemented the single version of the truth yet, you might be throwing away money and even putting your company at risk in the long term. Why? Let's look at some undisclosed examples I learned about in the past few weeks:

  • A company ordered 16 pumps which on arrival were not the correct ones – 1 M Euro lost
  • During installation at a drilling site the equipment did not fit and had many clashes – 20 M Dollar lost, due to rework and penalties
  • 7000 K Euro lost due to a wrong calculation based on the wrong information
  • A major bid lost due to high price estimation due to lack of communication between the estimator and the engineering department
  • 500 K Euro penalty for delivering the wrong information (and too late)

All the above examples – and I am sure they are just the tip of what is happening around the world – were related to the power & process industry, where of course high-capital projects run and the losses might look small relative to the size of the projects.

But what was the source of all this? Users.

Although the companies were using a PLM system, in one company a user decided that some of the data should not be in the system but in his drawer, to assure proper usage (according to his statement, as otherwise, when the data is publicly available, people might misuse it). Or was it false job security? In the end you lose your job through this behavior.
People should bring value through collaboration, not by sitting on knowledge.

Another frequently heard complaint is that users decide the PLM system is too complex for them and that it takes too much time to enter data. And as engineers have never been bothered by the kind of strict data management that ERP users are used to working with, their complaints are echoed to the PLM implementer. The PLM implementer can spend a lot of time customizing or adapting the system to the users' needs.
But will it be enough? It is always subjective, and from my experience, the more you customize, the higher the future risks. What about upgrades or changes in the process?
And can we say no to the next wish of this almighty user?

Is the PLM system to blame?

The PLM system is often seen as the enemy of the data creator, as it forces a user into a certain pattern. Excel is much easier to use; add some home-made macros, and the user feels everything is under control (as long as he is around).

Open Source PLM somehow seems to address this challenge, as it does not create the feeling that PLM vendors only make their money from complex, unneeded functionality. Everything is under the customer's own control; they decide if the system is good enough.

PLM on demand has an even harder job convincing the unwilling user; therefore, these vendors also position themselves as easy to use, friend of the user and enemy of the software developer. But in the end, it is all about users committing to share and therefore adapting themselves to change.

So without qualifying the different types of PLM systems, for me it is clear that:

The first step is for all users in a company to realize that working together towards a single version of the truth for all product- or project-related data brings huge benefits. Remember the money lost due to errors because another version of the data existed somewhere. This is where the most ROI for PLM is reported.

The next step is to realize that it is a change process, and that by being open-minded towards change, either motivated or pushed by the management, the change will make everyone's work more balanced – not in the first three months, but in the longer term.

Conclusion: creating the single version of the truth for project or product data is required in any modern organization to remain competitive and profitable. Reaching this goal might not be easy for every person or company, but the rewards are high when you reach this very basic goal.

In the end, it is about human contribution – not what the computer says:

I realized that time flies when you are busy, and I had promised to publish the conclusion from my previous post: More on who decides for PLM in a mid-market company. In my two previous posts, I described the difficulties companies have in selecting the right PLM system. So far, I discussed the two extremes: the silent approach, where a possible bottom-up approach was discussed, and, as the opposite, where an 'academic' approach was followed.

Now it is time to get the answers on the academic approach.

These were the questions to be answered in the previous post:

  • How much time has passed since the management decided PLM was good for their organization?
  • How independent is the consultancy firm?
  • Did they consider open source PLM as a solution?
  • What was the ranking of the PLM vendors?

How much time has passed since the management decided PLM was good for their organization?

The whole process of selecting a PLM system often takes more than one or two years, from the first activities until the final decision to start. I believe this is unavoidable, as especially in mid-market companies the business value that PLM can bring is not always discussed and realized at the strategic level.

However, I believe that in recent years PLM has been recognized by analysts, by software vendors and by many young companies as a necessity for innovation and, in the long term, for remaining competitive. And this is not only in the classical domains where PLM started – automotive / aerospace / industrial equipment. PLM value is everywhere, in many different industries, even apparel for example.

For companies that are now in the decision process, I believe 2009 and early 2010 are the years to decide, because a recovery of the economy might put the focus back on execution instead of strategy, and they might lose the management focus for PLM. And as I wrote in a previous post, the companies that made the best pit stop will benefit the most.

For companies still in doubt: it is now or never.

How independent is the consultancy firm?

It is clear that truly independent consultancy firms do not exist – even if a consultant wants to be independent, there are three challenges to meet:

  • How can a consultant evaluate or judge PLM systems they have not seen?
  • How much experience does the consultant have in your business?
  • How much work is required in the project for the consultant?

As you can imagine, reviewing the above challenges, you will realize that consultants usually specialize in the systems where their expertise is in demand – they also need to make a living. Consultants cannot afford to be an academic institute; coming back to the previous point, all consultancy work is ultimately paid by the customer.

So to conclude on this point: to be cost-effective, a company should make a pre-selection of systems and possible implementation partners that fit naturally with its type of business, and then evaluate how consultancy can be obtained.

What you will find is that the major 'expensive' packages have loads of consultants on offer, and the further you move into a mid-market environment, the rarer consultants become. For software from PLM vendors, you will usually find a reseller network with people close to your offices who can support you. For open source software, you will need to find consultancy services through the vendor's software delivery program.

Anyway, remember: 50% of the success of a PLM implementation is based on the right implementation approach and partner, not on PLM functions and features.

Did they consider open source PLM as a solution?

No, because the consultant was not familiar with it and discouraged the company from looking at it. In general, Open Source PLM and PLM On-Demand are interesting trends to follow and should not be neglected. However, the focus and approach for these types of solutions are different. I will not generalize at this moment, as I have no clear picture of where Open Source PLM or PLM On-Demand would be a big differentiator. I will try to evaluate and report on it in future posts.

Comments from Open Source PLM vendors or On-Demand PLM vendors are welcome to complete this picture of the PLM selection approach.

What was the ranking of the PLM vendors?

Ranking was done by the management, the selection team and the design department. These were the results, plus their major comments:

Management

1. The slide show PLM provider – they liked the business pitch

2. The CAD supplier with PLM features and gadgets – good guys – we know them

3. The PLM provider who showed everything – too much handling of data – too complex

Selection Team

1. The PLM Provider who showed everything – they really did it

2. The CAD supplier with PLM features and gadgets – we understand where they are going

3. The slide show PLM provider – do they really have a solution?

The Designers

1. The CAD supplier with PLM features and gadgets – they know what we want

2. The slide show PLM provider – could be a good solution too

3. The PLM Provider who showed everything – too complex, it will limit our productivity

As the management had the final vote, they decided for the slide show PLM provider, as they felt most comfortable with them.

The reason to drop the CAD supplier was that they were too afraid this provider did not know everything about PLM. Both management and users felt the PLM provider that showed everything was too complex – the opposite of the project team, whose members were very familiar with PLM capabilities after two years of investigation and many demos and trade shows.

Conclusion: Selecting PLM, even in an academic manner, is a subjective process. As the customer in general does not know exactly what they need, and the PLM provider often shows too much detail, the real journey starts at implementation time. At this stage you need an experienced implementation partner who can match and communicate expectations.

Last week I once more saw a post where free PLM software was offered; combined with the open source aura, it was presented as THE solution for companies that want to implement PLM during this economic downturn. I believe this is a big mistake, for the following reasons:

WYPIWYG (What You Pay Is What You Get)

I learned that the WYPIWYG rule usually applies in the software world. Free software is nice, but there is no guarantee that missing or broken functionality will be fixed. So if a company wants to implement free PLM software, what do you do if something important for your business is missing? You can ask the software provider to implement it for you – but will it be done? Probably only when it is easy to achieve; there is no commitment, as the software is free.

To assure you it can be done, the software vendor will say it is open source software, so it can be changed if you want. But who is going to make the change? The mid-market company that thought it had selected an economical solution is not an IT company – so whom to hire? The open source software development company? This is exactly what their business model is based on: they have the expertise with their software, so they are probably the best party to adapt it – not for free, of course – and they learn from it, while the customer pays.
Conclusion: there is no such thing as a free lunch.

This does not mean that all open source software is bad. Linux has shown that it makes sense for an operating system. Operating systems are 100% in the scope of IT. PLM is something different. PLM systems indeed need to provide an IT backbone to assure data collaboration and replication globally; however, PLM is primarily about business process change, NOT about IT.

 

PLM requires people with business skills and not software developers

From my experience, PLM projects fail when no business-knowledgeable people are available. This did not only happen with free PLM software or open source software. Some years ago, ERP vendors started to provide free PLM software to their customers to keep PLM companies at a distance. Like free PLM software, it looked nice business-wise: the software is free when you buy their ERP system. But who is going to implement it?

This free PLM software availability has changed in recent years for ERP vendors. ERP vendors also see PLM as a growth market for their business, so they started to invest in PLM, providing PLM consultancy instead of giving PLM functionality away for free. However, in one of the projects I was involved in, it became clear that PLM and ERP are complementary approaches. Interestingly, none of the PLM vendors focus on ERP, while ERP vendors apparently believe they can master PLM. I won't say it is impossible; however, I believe that without a real PLM vision at the top level of an ERP company, you cannot expect the competitive focus to exist.

 

Are CAD vendors providing PLM?

Some CAD vendors have an embedded data management solution to manage their own data. This is usually more of a PDM system, and often even the term PDM (Product Data Management) is too grand for it. These systems manage their own CAD data but have no foundation for a multi-discipline engineering BOM. For me, this is the basis of PDM, as most companies have several disciplines working with different tools around the same product. So CAD data management, for me, is not the basis for PDM, and certainly not for PLM.

 

PLM vendors bring real PLM value!

Having worked with different vendors in the past – an ERP vendor, several PDM and PLM vendors – it is clear to me that in order to bring committed value to a customer, you first of all need people with PLM skills: the ones who can differentiate between business process adaptation and software development. In order to implement PLM successfully, companies need to change the way they work (read many of my previous posts about this – in particular this one). Software developers tend not to take this approach; instead, they adapt or extend the software to support the old way of working.

Finally, paying for PLM software guarantees that the development of this software continues, based on business drivers and best practices. A PLM software vendor has the drive to improve in order to stay in business, both through software capabilities and, even more, by providing industry best practices.

 


Therefore my conclusion is that free PLM software does not help mid-market companies.

Feel free to react, as I believe it is an important topic in this market.

This post is a reply to a post from YML, with whom I worked in the past. At that time we had interesting discussions on various topics around PLM, and I am happy to continue this discussion in blog space. Please read his post in order to understand the full reasoning below.

First I want to make a statement to avoid misconceptions. I am a PLM evangelist, and perhaps my definition of PLM is wider than what PLM vendors currently offer. For me, PLM focuses not only on storing and managing product data (PDM), but also on the whole process of how new or improved products are created, designed, produced and supported.
From the Dassault Systèmes and Autodesk perspective (read Jim Brown's comments on them), there is a lot of focus on collaboration around the virtual product. However, personally, when working with mid-market customers, I focus mainly on capturing design knowledge and IP, plus creating visibility of knowledge inside a company, without being dependent on knowledge stored in people's brains.

An interesting development I have observed recently in this area is www.vuuch.com, an initiative to empower design discussions.

Now back to the reply on YML’s post:

So when Yann writes:

However I disagree with the statement that you use as foundation : Software vendor are proposing excellent product with a good ROI and SMB customer don’t understand it because they do not have a vision.

I must say: read my statement above – there is still work to be done. As for the lack of vision in mid-market companies, I will provide an update based on some experiences from the past few weeks, where lack of vision is blocking process improvements.

Next, Yann mentions all the proprietary formats of all vendors – PLM vendors, vendors of authoring tools (CAD, content, …) – and even limited version support. Yann makes a point for open-source solutions, which are part of the Web 2.0 evolution. It is interesting to see that at the same time Kurt Chen wrote an interesting post, What Can PLM Offer for SMBs?, in the same context.

My main comment on this topic: I understand the beauty of open source; however, I also believe that if you want to work with open source solutions, you either need a 100% clear concept of what the product should do (that is why Linux is successful – I believe PLM is not there yet), or you need companies with strong IT knowledge/support to adapt the software to their needs. However, this contradicts the fact that mid-market companies usually do not have the resources to invest in this kind of activity. So what do they do? Hire consultancy firms or software companies (sometimes the original developer of the open source software) to adapt the software to their needs. This creates almost the same dependency as customers would have with traditional PLM vendors – they rely on their software provider as the resource to drive PLM.

Then the question comes up:

Whom would I trust to assist my mid-market company in evolving towards PLM?

A company developing software, or a company that has experience in my industry and perhaps does not (yet) deliver the best-in-class product? I have met a company that decided to discontinue its PLM software because the provider only brought programmers into the game. They tried to solve requests from the users, and in the end – after 1.5 years of programming – the system had become complex, while crucial details were still missing. An industry-knowledgeable person with PLM knowledge would approach it differently: first focusing on the process, and then analyzing where automation would bring benefits.

Yann also mentions:

It is interesting to note that nobody is blaming Ford, GM, … of not being able to see that they have good chance to go bankrupt in some month from now. It is interesting that many people blame now these companies of not being able re-invent/adapt their products to their market. when all of them where using PLM systems and had huge PLM projects on going

and additionally:

In order to develop this agility SMB need to put very high in the list of capabilities for their information system the following features : Agility, re-configuration, continuous evolution / transformation, openness, ease of integration with unknown system, overall strategy of the PLM vendor

Here I disagree with the first quote. Ford and GM have no real PLM implementations; they built a dinosaur type of implementation with a focus on product development – yes, provided by PLM vendors, but implemented so rigidly that they lost the ability to stay connected to the market.

And I fully agree with the second quote – nothing to add. PLM should be implemented in such a way that it does not restrict a company's flexibility, as innovation does not come from doing a process more efficiently – it comes from doing things differently.

So to conclude for today:

  • Yes, current PLM vendors are not perfect and there is a challenge to reach the mid-market
  • Open Source solutions only make sense when combined with industry knowledge
  • Agility, re-configuration, continuous evolution / transformation, openness, ease of integration with unknown systems should be the overall strategy of the PLM vendor (not only mid-market)

Thanks Yann, and enjoy your fishing, take notice of what could happen:
