
This post is based on a mix of interactions I had over the last two weeks in my network, mainly on LinkedIn. First, I enjoyed the discussion that started around Yoann Maingon's post: Thoughts about PLM Business models. Yoann is quite seasoned in PLM, as you can see from his LinkedIn profile, and we have had interesting discussions in the past, and recently about Ganister PLM, a new PLM system he is developing, based on a flexible graph database.
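
As an aside, a minimal sketch of what "PLM on a graph database" can look like – invented node and relationship names, not Ganister's actual schema:

```python
# Illustration only: products, documents, and their relationships as a
# graph of typed nodes and edges (all identifiers are hypothetical).
nodes = {
    "P-100": {"type": "Part", "name": "Pump housing"},
    "P-200": {"type": "Part", "name": "Pump assembly"},
    "D-300": {"type": "Document", "name": "Housing drawing"},
}

# Typed, directed edges instead of fixed foreign keys: a new relationship
# type is just new data, not a change to a rigid relational schema.
edges = [
    ("P-200", "consists_of", "P-100"),
    ("P-100", "described_by", "D-300"),
]

def related(node_id, rel_type):
    """Follow one relationship type from a node."""
    return [dst for src, rel, dst in edges if src == node_id and rel == rel_type]

print(related("P-200", "consists_of"))  # ['P-100']
```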

Perhaps in that context, Yoann was exploring the various business models. Do you pay for the software (and maintenance), do you pay through a subscription, and what about a modular approach or a full license for all the functionality? All these questions made me think about the various business models that I have encountered and how hard it is for a customer to choose the optimal solution. And is there space for a new type of PLM? Is there space for free PLM? Some of my thoughts here:

PLM vendors need to be profitable

One of the most essential points to consider is that, whatever PLM solution you are aiming to buy, you should make sure your PLM vendor has a profitable business model. Once you have started with a PLM solution, it is your company's IP that will be stored in this environment, and you do not want to change your PLM system every few years. Switching PLM systems would only be affordable if PLM systems stored their data in a standard format – I will share a more in-depth link under PLM and standards.

For the moment, you cannot state that PLM vendors endorse standards. None of the real PLM vendors have a standardized data model; perhaps closest to standards is Eurostep, who have based their ShareAspace solution on top of the PLCS (ISO 10303) standard. However, ShareAspace is positioned more as a type of middleware, connecting OEMs/owners/operators and their suppliers to benefit from standardized connectivity.

Coming back to the statement: PLM vendors need to be profitable to guarantee a future for your company's data – that is the first step. The major PLM vendors are now profitable, as during a consolidation phase that started some 15 years ago a lot of non-profitable PLM vendors disappeared. MatrixOne, Agile, and Eigner & Partner are the best-known companies that were bought for either their technology or their market share. In that context, you might also look at OnShape.

Would they be profitable as a separate company, or would investors give up? To survive, you need to be profitable, so giving software away for free is not a good sign (see the PLM for free paragraph), as a company needs continuity.

PLM startups

In the past 10 years, I have seen and evaluated several new PLM companies. None of them really changed the PLM paradigm; most of them were still focused on being engineering collaboration tools. Several of these companies state in their vision that they are going to be the "Excel killer." We all know Excel has the best user interface and capabilities to manipulate a collection of metadata.

Very popular is the BOM in Excel, extracted from the CAD system (no need for an "expensive" PDM or PLM), or the BOM used to share with suppliers and stakeholders (ERP is too rigid; purchasing does not work with PDM).

The challenge I see here is that these startups do not bring real new value. The cost of manipulating Excel files is a hidden cost, and companies relying on Excel communication are the type of companies that do not have a strategic point of view. This is typical for small and medium businesses, where execution ("let's do it") gets all the attention.

PLM startups often collect investors' money because they promise to kill Excel, but is Excel the real problem? Modern PLM is about data sharing, which is an attitude change, not necessarily a technology change from Excel tables to (cloud) shared tables. However, will one of these new "Excel killer" PLMs be disruptive? I don't think so.

PLM disruption?

A week ago, I read an interview with Clayton Christensen (thanks, Hakan Karden), which I shared on LinkedIn. Clayton Christensen is the father of the Disruptive Innovation theory, and I have cited him several times in my blogs. His theory is, in my opinion, fundamental to understanding how traditional businesses can be disrupted. The interview took place shortly before he died, at the age of 67, due to complications caused by leukemia.

A favorite part of this interview is where he restates what Disruptive Innovation really is, as we often talk about disruption without understanding the context, just echoing other people:

Christensen: Disruptive innovation describes a process by which a product or service powered by a technology enabler initially takes root in simple applications at the low end of a market — typically by being less expensive and more accessible — and then relentlessly moves upmarket, eventually displacing established competitors. Disruptive innovations are not breakthrough innovations or “ambitious upstarts” that dramatically alter how business is done but, rather, consist of products and services that are simple, accessible, and affordable. These products and services often appear modest at their outset but over time have the potential to transform an industry.

Many of the PLM startups dream of and position themselves as the new disruptor. Will they succeed? I do not believe so if they only focus on replacing Excel; a different paradigm is needed. Voice control and analysis perhaps ("Hey PLM, if I change Part XYZ, what will be affected?")?

This would be disruptive and open new options. I think PLM startups should focus here if they want my investment money.
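
For illustration, a minimal sketch of the kind of "where-used" traversal behind such a question, assuming product relationships are stored as a simple graph (all part numbers and links invented):

```python
from collections import deque

# Illustration only: child -> assemblies that use it (invented data).
used_in = {
    "XYZ": ["SUB-1"],
    "SUB-1": ["ASSY-A", "ASSY-B"],
    "ASSY-A": ["PRODUCT-1"],
}

def impact(part):
    """Collect everything that directly or indirectly uses `part` (BFS)."""
    affected, queue = set(), deque([part])
    while queue:
        for parent in used_in.get(queue.popleft(), []):
            if parent not in affected:
                affected.add(parent)
                queue.append(parent)
    return affected

print(impact("XYZ"))  # {'SUB-1', 'ASSY-A', 'ASSY-B', 'PRODUCT-1'}
```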

PLM for free?

There are some voices saying PLM should be free, in analogy to software management and collaboration tools. There are so many open-source software management tools; why not use them for PLM? I think there are two issues here:

  • PLM data is not like software data. A lot of PLM data is based on design models (3D CAD/simulation), which is different from software. Designs are often not as modular as software, for various reasons. Companies want to be modular in their products, but do they have the time and resources to reinvent their existing products? For software, these costs are much lower, as it is only a brain exercise. For hardware, the impact is significant, which brings me to the second point.
  • The cost of change for hardware is entirely different from software. Changing software does not impact existing stock or suppliers and can therefore be implemented once tested for its purpose. A hardware change impacts the existing production process: do we first use up the old parts before introducing the change, or do we accept the cost of scrap? Are our supply chain and our production tools ready to deliver continuity for the new version? Hardware changes are costly, and you want to avoid them. Software changes are cheap; therefore, design your products to be configurable through software (for example, Tesla's software controlling which features are enabled – a minimal sketch of this idea follows below).
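
A hypothetical illustration of such software-configurable products – assuming nothing about how Tesla or anyone else actually implements this – with feature flags deciding what identical hardware is allowed to do:

```python
# Illustration only: one hardware variant, features enabled per unit by a
# software configuration instead of a hardware change (hypothetical names).
BASE_FEATURES = {"heated_seats": False, "performance_mode": False}

def configure(purchased_options):
    """Return the feature set for one unit. Changing it later is a data
    update, not an engineering change with stock and supplier impact."""
    features = dict(BASE_FEATURES)
    for option in purchased_options:
        if option in features:
            features[option] = True
    return features

print(configure(["heated_seats"]))
# {'heated_seats': True, 'performance_mode': False}
```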

Now imagine that, with enough funding, you could provide PLM for free. Because of ease of deployment, this would very likely be a cloud offering, easy and scalable. However, all your IP is in that cloud too. And let's imagine the cloud is safer than on-premise, so it does not matter in which country your data is hosted (does it?).

Next, after five years the "free" PLM provider starts asking for a small service fee, as the promised ROI of the model hasn't delivered enough value and the shareholders become anxious. Of course, you do not want to pay the fee. However, where is your data, and what happens when you do not pay?

If the PLM provider switches you off, you are without your IP. If you ask the PLM provider to hand over your data, what will you get? A blob of XML files – anything you can use?

In general, this is a challenge for all cloud solutions.

  • What if you want to stop your subscription?
  • What is the allowed exit strategy?

Here I believe customers should ask for clarity, and perhaps these questions will lead to a renewed understanding that we need standards.

PLM and standards

We had a vivid discussion in the blogging community in September last year. You can read more related to this topic in my post PLM and the need for standards, which describes the aspects of lock-in and the need for openness.

Finally, a remark related to the PLM acronym. Another interesting discussion started around Joe Barkai's post: Why I do not do PLM. Read the comments and the various viewpoints on PLM there. It is clear that the term PLM unites us all; however, the interpretations differ.

If someone in the street asks me what my profession is, I never mention I do PLM. I say: "I assist mainly manufacturing companies in redesigning their business processes using best practices and modern digital technologies." The focus is on the business value, not on the ultimate definition of PLM.

Conclusion

There are many business aspects related to PLM to consider. Yoann Maingon's post started the thinking process, and we ended up with the definition of PLM. It all illustrates that being involved in PLM is never a boring journey. I am curious to learn about your journey and where we meet.

In my previous post, the PLM blame game, I briefly mentioned that there are two delivery models for PLM. One approach is based on a PLM system that contains predefined business logic and functionality, promoting the use of the system as much as possible out-of-the-box (OOTB), somehow driving toward a certain rigidness. The other approach is one where the PLM capabilities are developed on top of a customizable infrastructure, providing more flexibility. I believe this topic has been debated for more than 15 years without a decisive conclusion. Therefore, I will take you through the pros and cons of both approaches, illustrated by examples from the field.

PLM started as a toolkit

The initial cPDM/PLM systems were toolkits for several reasons. In the early days, scalable connectivity was not available, or way too expensive for a standard collaboration approach. Engineering information, mostly design files, needed to be shared globally in an efficient manner, and the PLM backbone was often a centralized repository for CAD data. Bill of Materials handling in PLM was often at a basic level, as either the ERP system (mostly aerospace/defense) or home-grown BOM systems (automotive) were in place for manufacturing.

Depending on the business needs of the company, the target was to connect as many engineering data sources as possible to the PLM backbone – PLM originated from engineering and is still considered by many people an engineering solution. For connectivity, interfaces and integrations needed to be developed at a time when application integration frameworks were primitive and complicated. This made PLM implementations complex and expensive, so only the large automotive and aerospace/defense companies could afford to invest in such systems. And a lot of tuition fees were paid to achieve results. Many of these environments are still operational, as they became too risky to touch, as I described in my post: The PLM Migration Dilemma.

The birth of OOTB

Around the year 2000, the first OOTB PLM offerings were developed. There was Agile (later acquired by Oracle), focusing on the high-tech and medical industries. Instead of document management, they focused on bringing the BOM from engineering to manufacturing based on a relatively fixed scenario – therefore fast to implement and fast to validate. The last point, in particular, is crucial in regulated medical environments.

At that time, I was working with SmarTeam on the development of templates for various industries, with a similar mindset. A predefined template would lead to faster implementations and therefore reduced implementation costs. The challenge with SmarTeam, however, was that it was very easy to customize, being based on Microsoft technology with wizards for data modeling and UI design.

This was not a benefit for OOTB delivery, as SmarTeam was implemented through Value Added Resellers whose major revenue came from providing services to their customers. So it was easy to reprogram the concepts of the templates and use them as your unique selling point towards a customer. A similar situation is now happening with Aras: the primary implementation skills sit with the implementing companies, and their revenue does not come from software (maintenance).

The result is that each implementer considers the other implementers competitors, and they are not willing to give up their IP to the software company.

SmarTeam resellers were not eager to deliver their IP back to SmarTeam to get it embedded in the product, as it would reduce their unique selling points. I assume the same currently happens in the Aras channel – it might be called open source, however probably only the high-level infrastructure is.

Around 2006, many of the main PLM vendors had their various mid-market offerings, and I contributed at that time to the SmarTeam Engineering Express – a preconfigured solution that was rapid to implement, if you wanted it to be.

Although the SmarTeam Engineering Express was an excellent sales tool, the resellers that started to implement the software began to customize the environment as fast as possible in their own preferred manner, for two reasons: first, the customer most of the time had different current practices, and secondly, the money came from services. So why say no to a customer if you can say yes?

OOTB and modules

Initially, the leading PLM vendors' mid-market templates were not just aimed at the mid-market. All companies wanted a standardized PLM system with as few customizations as possible. For the PLM vendors, this meant they had to package their functionality into modules: sometimes addressing industry-specific capabilities, sometimes interface areas (CAD and ERP integrations), and sometimes generic governance capabilities like portfolio management, project management, and change management.

The principle behind the modules was that they need to deliver data model capabilities combined with business logic/behavior; otherwise, the value of the module would not be relevant. And this causes a challenge: the more business logic a module delivers, the more the company that implements the module needs to adapt to generic practices. This requires business change management; people need to be motivated to work differently. And who is eager to make people work differently? Almost nobody, as it is an intensive coaching job that cannot be done by the vendors (they sell software), often cannot be done by the implementers (they do not have the broad set of skills needed), nor by the companies (they do not have the free resources for that). Precisely the principles behind the PLM Blame Game.

OOTB modularity advantages

The first advantage of modularity in the PLM software is that you only buy the software pieces that you really need. However, most companies do not see PLM as a journey, so they agree on a budget to start, and then every module that was not identified before becomes a cost issue. The main reason is that the implementation teams focus on delivering capabilities at that stage, not on providing value-based metrics.

The second potential advantage of PLM modularity is that these modules are supposed to be complementary to the other modules, as they should have been developed in the context of each other. In reality, this is not always the case. Yes, the modules fit nicely on a single PowerPoint slide; however, in reality there are sometimes separate systems with a minimum of integration with the core. Still, the advantage is that the PLM software provider now becomes responsible for the upgradability and extensibility of the provided functionality, which is a serious point to consider.

The third advantage of the OOTB modular approach is that it forces the PLM vendor to invest in your industry and in capabilities needed in the future, for example digital twins, AR/VR, and model-based ways of working. Skeptics might say PLM vendors create problems to solve that do not exist yet; optimists might say they invest in imagining the future, which can only happen by trial and error. In a digital enterprise, it is: think big, start small, fail fast, and scale quickly.

OOTB modularity disadvantages

Most of the OOTB modularity disadvantages are advantages in the toolkit approach, and are therefore discussed in the next paragraph. One downside of the OOTB modular approach is the disconnect between the people developing the modules and the implementers in the field. Modules are often developed based on the experiences of some leading customers (the big ones), whereas the majority of usage in the field targets smaller companies where people have multiple roles – the typical SMB approach. SMB implementations are often not visible at the PLM vendor's R&D level, as they are hidden behind the Value Added Reseller network and/or are usually too small to become apparent.

Toolkit advantages

The most significant advantage of a PLM toolkit approach is that the implementation can be a journey: starting with a clear business need – for example, in modern PLM, creating a digital thread – and then, once this is achieved, diving deeper into areas of the lifecycle that require improvement. And increased functionality is only linked to the number of users, not to extra costs for a new module.

However, if the development of additional functionality becomes massive, you run the risk that low license costs are nullified by development costs.

The second advantage of a PLM toolkit approach is that the implementer and the users will have a better relationship in delivering capabilities and therefore a higher chance of acceptance. The implementer builds what the customer is asking for.

However, as Henry Ford said: if I had asked my customers what they wanted, they would have asked for faster horses.

Toolkit considerations

There are several points where a PLM toolkit can be an advantage but also a disadvantage, very much depending on various characteristics of your company and your implementation team. Let’s review some of them:

Innovative: a toolkit does not immediately provide an innovative way of working. The toolkit can have an infrastructure to deliver innovative capabilities, even as small demonstrations, but the implementation and the methodology to implement this innovative way of working need to come from either your company's resources or your implementer's skills.

Uniqueness: with a toolkit approach, you can build a unique PLM infrastructure that makes you more competitive than the others: don't share your IP and best practices, to stay more competitive. This approach can be valid if you truly have a competitive plan here. Otherwise, the risk is that you are creating a legacy for your company that will slow you down later.

Performance: this is a crucial topic if you want to scale your solution to the enterprise level. I spent a lot of time in the past analyzing and supporting SmarTeam implementers and template developers on their journey to optimize their solutions. Choosing the right algorithms and making the right data modeling choices are crucial.

Sometimes I came into a situation where the customer blamed SmarTeam because customizations were possible – you can read about such an example in an old LinkedIn post: the importance of a PLM data model.
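
To make this concrete, a small sketch – invented data and no real SmarTeam API – of how one data-access choice changes the cost of expanding a multi-level BOM:

```python
# Illustration only: `bom_links` stands in for a parent -> children table.
bom_links = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F"],
             "D": [], "E": [], "F": []}
queries = 0

def children_from_db(parent):
    """Simulate one database round trip per call (the naive approach)."""
    global queries
    queries += 1
    return bom_links[parent]

def expand_naive(parent):
    result = []
    for child in children_from_db(parent):  # one round trip per node
        result += [child] + expand_naive(child)
    return result

expand_naive("A")
print(queries)  # 6 round trips for 6 nodes - the count grows with BOM size

# Prefetching all links in one query and expanding in memory costs a single
# round trip regardless of depth: the kind of modeling choice that decides
# whether an implementation still performs at enterprise scale.
```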

Experience: when you plan to implement PLM "big" with a toolkit approach, experience becomes crucial, as initial design decisions and scope are significant for future extensions and maintainability. Beautiful implementations can become a burden after five years if design decisions were not documented or analyzed. Having experience or an experienced partner/coach can help you in these situations. In general, it is rare for a company to have experienced PLM implementers internally, as implementing PLM is not their core business. Experienced PLM implementers vary in size and skills – make the right choice.


Conclusion

After writing this post, I still cannot give a final verdict on which approach is best. Personally, I like the PLM toolkit approach, as I have been working in the PLM domain for twenty years, seeing and experiencing good and best practices. The OOTB approach represents many of these best practices and is therefore a safe path to follow. The undecided points are who the people involved are and what your business model is. It needs to be an end-to-end coherent approach, no matter which option you choose.


"Confused? You won't be after this episode of Soap."

Who does not remember this tagline from the first official Soap series, which started in 1977 and was released in the Netherlands in 1979?

Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, alien abduction, criminality, homosexuality, and more.

The episode always ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.

For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.

What has this to do with PLM?

Soap has to do with strange people doing weird things (I do not want to be more specific). Recently I noticed that this is happening even in the PLM bloggers' world. Two of my favorite blogs demonstrated something of this weird behavior.

First, Steve Ammann, in his Zero Wait-State blog post A PLM junkie at sea – point-solutions versus comprehensive, mentioned sailing from Ventura, CA to Cabo San Lucas, Mexico on a 35-foot sailboat and how he started thinking about PLM during his night shift. My favorite quote:

Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information. Yes only a PLM “junkie” would think about PLM on a sailing trip and maybe this is why the Mexican coast guard was suspicious.

Second, Oleg, in his doomsday blog post The End of PLM Communism, was thinking about PLM all weekend. My favorite quote:

I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.

It demonstrates that the best thinking is done during out-of-office time and in casual locations. Knowing this from my long weekend cycling tours, I know it is true.
I must confess that I have PLM thoughts while cycling.

Perhaps the best thinking happens outside an office?

I leave the follow-up on this observation to my favorite Dutch psychologist Diederik Stapel, who apparently is out of office too.

Now back to serious PLM

Both posts touch on the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post. He believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance on making the decision between a point solution and an integrated solution.

And I am aligned with what Steve is proposing.

Oleg comes from a different background, and in his current position he believes more in a distributed or network approach. He looks at PLM vendors/implementations and their centralized approach through the eyes of someone who knows the former Soviet Union way of thinking: "Centralize and control".

The association with communism was probably not the best choice, as you can see when you read the comments. Still, this association makes you think: as the former Soviet Union does not exist anymore, what about former PLM implementations and the future? According to Oleg, PLM implementations should focus more on distributed systems (in the cloud?), working and interacting together, connecting data and processes.

And I am aligned with what Oleg is proposing.

Confused? You won't be after reading my recent experience.

I have been involved in the discussion around the best possible solution for an EPC contractor (Engineering, Procurement, Construction) in the Oil & Gas industry. The characteristics of their business are different from standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected because of their knowledge, their price, their price, their price, quality, and time to deliver.

This means an EPC contractor is focused on execution, making sure they have the best tools for each discipline, and this is the way they are organized and used to working. The downside of this approach is that everyone works on their own island, and there is no knowledge capitalization or sharing of information. The result: each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people ….. and they retire at a certain moment.

So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard. This standard is relatively mature and can serve as the neutral standard for exchanging information between disciplines. The ideal world for best-in-class tools communicating with each other, or not?

Imagine there are six discipline tools: an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages, and finally a knowledge environment, capitalizing lessons learned, standards, and best practices.

This results in six tools and 12 interfaces to a common service bus connecting these tools: 12 interfaces, as information needs to be sent to and received from the service bus per application. Each tool will have redundant data for its own execution.


What happens if a PLM provider could offer three of these tools on a common platform? This would result in four tools to install and only eight interfaces. The functionality in the common PLM system does not require data redundancy but shares common information, and therefore will provide better performance in a cross-discipline scenario.
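
For what it is worth, the interface arithmetic in a few lines (a sketch of the counting above, not of any real integration):

```python
# Each system connected to the service bus needs a send and a receive
# interface, so: interfaces = 2 x number of connected systems.
def interfaces(connected_systems: int) -> int:
    return 2 * connected_systems

print(interfaces(6))          # 12 - six separate discipline tools
print(interfaces(6 - 3 + 1))  # 8  - three tools merged into one platform
```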

In the ultimate world, all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100% optimized system for a group of independent companies working together. Suppliers will not give up their environment and own IP to embed it in a customer's ideal environment. So there is always a compromise to be found between the best integrated platform (optimal performance, reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).

And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM, you might even reach 100% coverage for your own purposes, although modern society is more about connecting information where possible.

MY CONCLUSION after reading both posts:

  • Oleg tries to provoke, and like a soap, you might end up confused after each episode.
  • Steve in his post gives common-sense guidance, useful if you spend time digesting it – not a soap.

Now I hope you are no longer confused and wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:

  • Will Aras survive 21-12-2012 and support the Next generation?
  • Will Autodesk get off the cloud or have a coming out?
  • Will Dassault get more Experienced?
  • Will Oracle PLM customers understand it is not a database?
  • Will PTC get out of the CAD jail and receive $200?
  • Will SAP PLM be really 3D and user-friendly?
  • Will Siemens PLM become a DIN or ISO standard?

See the next episodes of my PLM blog in 2013


It is interesting to read management books and articles and reflect on the content in the context of PLM. In my previous post How the brain blocks PLM acceptance and in Stephen Porter's (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.

This summer holiday I was reading The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is an associate professor at the Harvard Business School, and he published this book back in 1997. Apparently not everyone has read the book, and I recommend reading it if you are involved in the management of a PLM company.

Sustaining technology

Christensen states there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing a lot in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve the current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till ...

Disruptive technology

The other technology Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope – no way to serve the customer in the same way. The technology underperforms if you apply it to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than the improvement path of the sustaining technology, it is possible that their paths meet at a certain point. And although coming from a different set of capabilities, due to the faster improvement process the disruptive technology becomes the leading one, and the companies that introduced the disruptive technology become the new market leaders.

Why leading companies failed..

Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following its dynamics. Later he illustrates the concepts with examples from other industries where the leading firms failed and ceased to exist because disruptive technologies overtook them and they were not able to follow that path.

Although the leading companies have enough resources and skills, he illustrates that it is a kind of logical path: big companies will always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as they target a different, unclear market in the beginning; in addition, it is not clear where the value of this disruptive technology comes from. So which manager wants to risk his or her career on something uncertain in an existing company?

Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major share position there. Let this company focus on its disruptive technology, and in case it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost sure to fail. Quote:

Expecting achievement-driven employees in a large organization to devote critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one's arms in an effort to fly.

As the book was written in 1997, it was not in the context of PLM. Now let's start with some questions.

Is ERP in the stage of sustaining technology?

Here I would say yes. ERP vendors are extending their functional reach to cover more than the core functionality for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies, Christensen identifies four stages: customers select a product for functionality; when other vendors have the same functionality, reliability becomes the main differentiation; after reliability, the next phase is convenience, and finally price.
From my personal observations, not through research, I would assume ERP for the major vendors is in the phase between convenience and price. If we follow Christensen's analysis for SAP and Oracle, it means they should not try to develop disruptive technologies inside their organization; neither should they try to downscale their product for the mid-market or add a different business model. Quote:

What goes up – does not go down. Moving to a high-end market is possible (and usually the target) – they will not go to small, poorly defined low-end markets.

How long SAP and Oracle will remain market leaders will depend on disruptive technologies that meet the path of the ERP vendors and generate a new wave. I am not aware of any trends in that area, as I am not following the world of ERP closely.

Is PLM in the stage of sustaining technology?

Here I would say no, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions on what a PLM system should provide as core technologies. This makes it hard to measure PLM along the lifecycle of a sustaining technology with the phases functionality, reliability, convenience, and price.

Where the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability, and convenience, others are focusing on convenience and price.

Some generalized thoughts passed my mind:

  • DS and PTC somehow provoke their customers by launching new directions they believe the customer will benefit from. This makes it hard to call it sustaining technology.
  • Siemens claims they develop their products based on what customers are asking for. According to Christensen, they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
  • All three focus on the high end and should not aim for smaller markets with the same technology. This justifies within DS the existence of CATIA and SolidWorks, and within Siemens the existence of NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up for others.


Disruptive technologies for PLM

Although PLM is not a sustaining technology in my opinion, there are some disruptive technologies that might come into the picture of mainstream PLM.

First of all, there is the open source software model, introduced by Aras, which initially is not considered a serious threat to the classical PLM players – "big customers will never rely on open source". However, the open source model allows product improvements to move faster than the mainstream, reaching at a certain point the same level of functionality, reliability, and convenience. The risk for open source PLM is that it is customer-driven, which according to Christensen is the major inhibitor of disruptive steps in the future.

Next, there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk is operating from a comfort zone: it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and medium mid-size market. Not too many barriers to entering the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM a bad label for its cloud-based product? Time will tell.

The management of Kenesto obviously has read the book. Although the initial concept came from PLM++ (a bad marketing name), they do not compete with mainstream PLM and aim their product at a different audience: business process automation. Then, if their product picks up in the engineering/product domain, it might enter the PLM domain in a disruptive manner (all according to the book – they will become market leaders).

Finally, there are Search Based Applications, which are also a disruptive technology for the PLM domain. Many companies struggle with the structured data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; the validation and formalization are often done in the brains of experienced employees. Why can't search-based technology be used to create structured data and replace or support the experienced brain?

If I open my Facebook page, I see new content related to where I am and what I have been saying or surfing for. Imagine an employee's desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from the structured system in the background; other data might be displayed based on logical search criteria – the way our brain works. Some startups are working in this direction, and Inforbix (congratulations, Oleg & team) has already been acquired by Autodesk, as was Exalead by DS.
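
A minimal sketch of that idea – invented data, and of course real products like Inforbix or Exalead are far more sophisticated: index structured and unstructured content together, and retrieve related items by keyword instead of by navigating a predefined structure:

```python
from collections import defaultdict

# Illustration only: a toy inverted index over mixed PLM content
# (part metadata, a drawing title, an email), all identifiers invented.
items = {
    "P-100": "part pump housing stainless",
    "D-300": "drawing pump housing revision B",
    "M-412": "email customer complaint pump leakage",
}

index = defaultdict(set)
for item_id, text in items.items():
    for word in text.split():
        index[word.lower()].add(item_id)

def search(query):
    """Return the items matching every word of the query."""
    hits = [index[word.lower()] for word in query.split()]
    return set.intersection(*hits) if hits else set()

print(search("pump housing"))  # {'P-100', 'D-300'}
```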

For both companies, if they believe in the above concept, they should remain independent from the big parent company as long as possible, as according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.

Conclusion
This blog post was written during a relaxing holiday in Greece. The country is in a crisis; they need disruptive politicians. They did it 3500 years ago, and I noticed the environment is perfect for thinking, as you can see below.

Meanwhile, I am looking forward to your thoughts on PLM: in which state we are and what the disruptive technologies are.



I am just back from an exciting PLM Innovation 2012 conference. With a full program and around 250 participants, it was two intensive days of PLM interaction.

What I liked the most is that the majority of the audience was focusing on PLM business-related topics. The mood of PLM has changed.

In this post, I will give an impression of the event as I experienced it, without going into the details of each session.

Several interesting sessions ran in parallel, so I could not attend them all, but MarketKey, the organizer of the conference, confirmed that all presentations were filmed and will become available online for participants. So more excitement to come.

First my overall impression: compared to last year's conference, there was more focus on PLM business issues and less on PLM IT or architecture issues (or was that my perception?).

DAY 1

Gerard Litjens (CIMdata Director European Operations) opened the conference, as CIMdata co-hosted it. In his overview, he started with CIMdata's PLM definition: PLM is a strategic business approach (everyone has their own definition, as Oleg noticed too). Next, he presented what CIMdata sees as the hottest topics. No surprises here: extension of PLM to new industries, extending PDM towards PLM, integration of social media, cloud, open source, enterprise integration, and compliance.

Next speaker was Thomas Schmidt (Vice President, Head of Operational Excellence and IS, ABB's Power Products Division), challenging the audience with his keynote speech PLM: Necessary but not sufficient. With this title, it seemed the force was against him (thanks, Oleg, for sharing).

Thomas explained that the challenge for ABB is being a global company and at the same time acting as a 'local' company everywhere around the world. In this perspective, he placed PLM as part of a bigger framework to support operational excellence and presented some major benefits of a platform approach. I believe the Q&A session was an excellent way to connect Thomas's initial statements to the PLM-focused audience.

Marc Halpern from Gartner gave his vision on PLM. Marc also started with the Gartner definition of PLM, which characterizes PLM as a discipline. Gartner identified the following major trends: software everywhere in products; usage of social media for product development and innovation; using analytics tools to support the whole product lifecycle – after sales, service, connecting to the customer; and opportunities to deliver existing products through services (media content, transportation).

Next, I attended the Autodesk session, a PLM journey using the cloud, where I was eager to learn their approach towards PLM. Autodesk (Mike Lieberman) let Linda Maepa, COO of Electron Vault in the USA, explain the benefits of the Autodesk PLM 360 solution. Electron Vault, a young high-tech company, implemented the solution within two weeks. And here I got disconnected, also when the suggestion was raised that you do not need time to specify the requirements for the system (old-fashioned stuff).

I suddenly went into a trance and saw a TV advert for a new washing powder, with numerous features (program management, new product introduction, …..) that washed whiter than all the others, and a happy woman telling it to the world. I believe that if Autodesk wants to be taken seriously in the PLM world, it should also work with existing customers and manage the change in these organizations. Usually it already takes more than two weeks to get them aligned and agreeing on the requirements. Unfortunately, I did not have time during the breaks to meet Autodesk at their booth, as I would have loved to continue the discussion about reality, as my experience and focus are on mid-market companies. Waiting for the next opportunity.

After Autodesk, I presented in my session the main drivers for making the case for PLM. I also started with my favorite PLM definition (a collection of best practices – 2PLM) and explained that PLM starts with the management vision and targets for the future. Is it about efficiency, quality, time to market, knowledge capture, or a more challenging task: creating the platform for innovation?

Next, I followed the Energy track, where I listened to Charles Gagnon from Hydro Quebec, who gave an interesting lecture called Implementing Open Innovation and Co-Development.

At first glance, this is a sensitive topic. When you innovate, it is all about creating new intellectual property, with the fear that, when working with partners, the IP might leave the company. Charles explained how this process of collaborative innovation was started and monitored. At the end, he reported they measured a significant gain in perceived R&D value when working with external partners. And they did not use a PLM system to manage innovation (to be investigated how they could survive).

After lunch, I continued with Jonas Hagner from WinWinD, a young manufacturer of windmills targeted to operate in extreme climate conditions (a niche market). They are implementing PLM and ERP in parallel; they did not have to suffer from years of ERP before PLM, and therefore could have a more balanced discussion around part information availability, part numbers, and more. Still, I believe they have the challenge of efficiently connecting the servicing of the windmills back to their R&D organization, to complete the full PLM circle.

Karer Consulting, together with Siemens Energy, presented the case of how they designed and started to implement the interface between their PLM system (Teamcenter) and ERP system (SAP). What was disappointing to see was that the interface between Teamcenter and SAP was relatively complex (bi-directional, with engineering activities on both sides). Almost 1½ years of development for this interface, and one of the main reasons was that SAP was there first and they start the engineering order in SAP.


Apparently, two years later, Siemens Energy could no longer implement a clear, distinct separation between PLM and ERP and will now have to live with this complex interface. In the past, I have written several times about this complexity, which companies seem to accept due to political or historical reasons. A sad story for PLM – Where is the MBOM?


The day finished with a closing keynote from Peter Bilello, explaining what a successful PLM implementation could look like. Many wise statements that everyone should follow if you want to reach a successful implementation (and correctly define what success is).

Thanks to Autodesk, we had a nice evening reception, discussing and evaluating the first day with peers.

Day 2

Day 2 started for me with an interesting lecture from Peter Fassbender, Head of Design Center Fiat Latin America, describing how in Brazil the Fiat Mio experiment used modern social media techniques like crowdsourcing, communities, and user involvement to guide the innovation and development of a potential car. A unique experiment, demonstrating that this type of project influences the brand reputation positively (if managed correctly), and for me an example of what PLM could bring if R&D is connected to the outside world.

Christian Verstraete, Chief Technologist Cloud Strategy at HP, gave an inspiring session about the open frontiers of innovation. The speed of business has increased dramatically in the past 30 years (you need to be from an older generation to be aware of this – the definition of response time has changed due to new technologies). Christian pushed everyone to think out of the box and to be innovative, which made me wonder how long companies will keep building standard, boring products in the future. Will we keep on innovating at this amazing pace, as we did in the past 30 years?

Graeme Hackland, IT/IS Director of the UK-based Lotus F1 team, presented the challenges an F1 team has to face every year due to changing regulations. I visited Lotus F1 last year and was impressed by the fact that over 500 engineers are all working around one car per year, optimizing the car mainly for aerodynamics, but also assuring it performs throughout the year. Thousands of short interactions and changes to be implemented a.s.a.p. challenge the organization to collaborate in an optimal manner. And of course, this is where PLM contributes. All the F1 fans could have continued to dream and listen to Graeme's stories, but Jeremie Labbe from Processia brought us back to earth by explaining how Processia assisted Lotus F1 with a PLM value assessment as a next step.

Meanwhile, I had some side discussions on various PLM topics and went back to the sessions, seeing how David Sherburne, Director of Global R&D Effectiveness at Carestream Health, presented his case (open source PLM) and his analysis of why an open source PLM model (based on Aras) is very appealing in their case. Indeed, the perceived business value and significantly lower operational costs for the software are appealing for his organization, and will surely influence the other PLM vendors in their pricing models.

Pierfrancesco Manenti from IDC Manufacturing Insights gave a clear presentation indicating the future direction for PLM: managing operational complexity, not product complexity. As you could expect from IDC Manufacturing Insights, all was well founded on surveys in the manufacturing industry, clearly indicating that there is still a lot to do before companies efficiently share and work around a common product development and operational platform. New technologies (the four IT forces: mobility, cloud, social business, and big data analytics) will help them improve.


The closing keynote came from Jason Spyromilio, who was director of the European Southern Observatory's Very Large Telescope (http://www.eso.org), and he gave us insights into designing (and building) the biggest eye on the sky. The precision challenges of such a huge telescope mirror, being built in the high mountains of Chile in an earthquake-sensitive area, demonstrate that all participants are required to contribute their IQ in order to realize such a challenge.

Conclusion: This PLM Innovation 2012 event doubled the 2011 event from a year ago in all dimensions. Thanks to the sponsors, the organization, and the high-quality lectures, I expect next year we could double again – in participants, in content, and in innovation. It shows PLM is alive. But coming back to the title of this post: I saw some interesting innovation concepts – now how to enable them with PLM?

Note: looking at the pictures in this post, you will notice PLM is everywhere. I published this post on February 29th – a unique day that happens only every four years. In May this year, my blog will be four years old.

It has been silent from my side the past more than two months. I was extremely busy, and sometimes surprised to see the number of posts some of my colleagues could produce, with Oleg as the unbeaten number one. During this busy period, I was able to observe some interesting trends, listed below:

Going Social

Social media and PLM is one of the favorite topics for both bloggers and some PLM vendors at this moment. New products for community-based collaboration or social engineering are promoted, combined with discussions and statements about how the new workforce (Generation Y) should get challenging jobs without doing the 'old boring stuff'.

True arguments to initiate a change in the way we work. And I agree, most current PLM systems are not 'intelligent' enough to support engineers in a friendly manner. However, is there an alternative at this moment? Below is a commercial (in Dutch) promoting that elderly workers are still required for quality.

I discussed the relation between PLM and social media some time ago in my post Social Media and PLM explained for dummies. In addition, my observations from the field give me the feeling that in most companies the management is still dominated by the older generation, and most of the time they decide on the tools they will be using. No X and Y at this moment. Therefore, I do not see a quick jump to social media integrated with PLM – yes, the vision is there – but the readiness of most companies is not.

Cloud

PLM and cloud are also mentioned more and more by PLM vendors as a new solution, especially for the mid-market. And with an optimistic mind, you can indeed believe that with a low investment (pay per use), mid-market companies can do their PLM online. But why are existing online PLM systems (Arena, PLMplus, and the major PLM vendors) not booming at this time? I believe there are two key reasons for that:

  1. Implementing PLM is not equal to installing a system. PLM is a vision to be implemented using a system. And the difficulty is that a vision does not match the functions and features of a product vendor. There is a need for a driving force inside the company that will support the business change. Where are the consultants and advocates (with experience) for this type of solution?
  2. There is still a reluctance to store intellectual property somewhere online in a cloud without direct control and ownership of the data. Mid-market companies are not known for choosing solutions ahead of the masses. In this type of company, cloud-based CAD tools might be an entry point, but all product data – no way, they say.


PLM or ERP

Before even talking about new technologies or fundamentals for PLM, I see the biggest challenge for PLM is still to get recognition as the system for product knowledge (IP) and innovation. In too many companies, ERP rules and PLM is considered a way to capture CAD and engineering data. The main PLM vendors are not addressing this challenge – they neglect ERP ("yes, we can connect"). And ERP vendors like SAP and Oracle are not known for their PLM messages and strategy ("yes, we have PLM"). As ERP is historically often the major IT system, there is too often the (wrong) opinion that everything should be built on and based on one system.


In some publications, I have seen the Swiss Army knife as an example of the multi-functional system with all tools embedded. My question remains: who wants to work only with a Swiss Army knife when doing professional work?

I like to have the right tools to do my job

The most important topic on my blog over the past three years has been the manufacturing BOM – where should it be – and where is the MBOM currently?

Sweden – a reality check

Last week I attended the DS PLM forum in Gothenburg to present the vision of using a PLM system as the backbone for plant information management for owners/operators and how ENOVIA supports this vision.

But I also learned Sweden is (one of) the most innovative countries (I need to verify the criteria used, but can imagine there is a source of truth). What impressed me more were the presentations from Staffan Lundgren of Sony Ericsson, titled "Living in a world of change – balancing stability and flexibility", and Magnus Olsson of Volvo Cars, titled "Driving operational excellence in a challenging business environment". Both companies are well known for their robust image. From both speakers, you could sense that they are not so worried about Generation Y; their success depends on a clear vision and the will to go there. And this basic drive is often missing – PLM is not a product you buy after which business continues as usual.

Conclusion

PLM vendors have made a lot of noise in the past months (events, product announcements), and customers might get the impression that technology and software (or the price of software) are the main criteria for successful PLM. Although these are not unimportant, I would focus on the vision and on assuring this vision is understood and accepted by the company.

Am I old-fashioned?

This week I was happy to participate in the PLM INNOVATION 2011 conference in London. It was an energizer, which, compared to some other PLM conferences, makes the difference. The key to the success, in my opinion, was that there was no vendor dominance, and that participants were mainly discussing their PLM implementation experiences, not products.

Additionally, as each of the sessions was approximately 30 minutes long, it forced the speakers to focus on their main highlights instead of going into details. Between the sessions, there was significant time to network or to set up prescheduled meetings with other participants. This formula made it an energizing event for me, as every half hour you moved into the next experience.

In parallel, I enjoyed and experienced the power of modern media. Led by Oleg, a kind of parallel conference took place on Twitter around the hashtag #plminnovation2011. There I met and communicated with people at the conference (and outside) and felt sorry I was not equipped with all the modern media (iPhone/iPad-type equipment) to interact more intensively during these days.

Now some short comments/interpretations on the sessions I was able to attend:

Peter Bilello, president of CIMdata, opened the conference in the way we are used to from CIMdata, explaining the areas and values of PLM, the statistics around markets and major vendors, and positive trends for the near future. Interesting was the discussion around the positioning of PLM and ERP functionality and the coverage of these functionalities between PLM and ERP vendors.

Jean-Yves Mondon, EADS' head of PLM Harmonization (Phenix program), illustrated, through extracts from an interview with their CEO Louis Gallois, how EADS relies on PLM as critical for their business and wants to set standards for PLM in order to have the most efficient interoperability of tools and processes coming from multiple vendors.

Due to my own session and some one-to-one sessions, I missed a few parallel sessions in the morning and attended Oleg Shilovitsky's session around the future of engineering software. Oleg discussed several trends, and one of the trends I also see as imminent is the fact that the PLM world is changing from databases towards networks. It is not about capturing all data inside one single system, but about being able to find the right information through a network of information carriers.

This also fits very well with the new generation of workers (Generation Y), who have learned to live in this type of environment and collect information through their social networks.

The panel discussion, with three questions for the panelists, could have been a little better if the panelists had had time to prepare some answers, although some of the improvisations were good. I guess the audience chose Graham McCall's response to the question "What will be the next biggest disappointment?" as the best: he mentioned the next 'big world-changing' product launch from a PLM vendor.

Then I followed the afternoon session from Infor, called Intelligent PLM for Manufacturing. The problem I had with this session (and I have this often with vendor sessions) was that Venkat Rajaj did exactly what most vendors do wrong. They create their own niche definition – Product Lifecycle Intelligence (is there no intelligence in PLM?), being the third software company (where are they on Cimdata's charts?) – and then give a lot of details on product functions and features. Although the presentation was smooth and well delivered, the content did not stick.

A delight that day was the session from Dr. Harminder Singh, associate fellow at Warwick Business School, about managing the cultural change of PLM. Harminder does not come from the world of software or PLM, and his outsider's perspective created a particular atmosphere for those in the audience who consider cultural change an important part of PLM. Here we had a session inspired by a theme, not by a product or concept. I was happy to have a longer discussion with Harminder that day, as I also believe PLM has to do with culture change – it is not only technology and management push, as we would say. Looking forward to following up here.

The next day started with an excellent session from Nick Sale of Tata Technologies. With a Nano in the lobby of the conference, he presented all the innovation and rationalization related to the Nano car, and one of his messages was that we should not underestimate the power of innovation coming from India. An excellent sponsor presentation, as the focus was on the content.

In the parallel track I was impressed by how Philips Healthcare implemented their PLM architecture with three layers. Gert-Jan Laurenssen explained they have an authoring layer, where they do global collaboration within one discipline; a PDM layer, where they manage the interdisciplinary collaboration, which in the case of Healthcare is of course a mix of mechanical, electrical and software; and above these two layers they connect to the layer of transactional systems that need the product definition data. Impressive was their implementation speed, surely thanks to some of the guidelines Gert-Jan gave – see Oleg's picture of his slide here. Unfortunately I did not have the time to discuss further with Gert-Jan, as I am curious about the culture change and the amount of resources they have in this project. An interesting observation was that the project was driven by IT managers and engineering managers, confirming the trend that PLM more and more becomes business-focused instead of IT-focused.
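For readers who like to see the concept in a more tangible form, below is a minimal sketch of the three-layer flow as I understood it. All class, field and system names are my own hypothetical illustrations of the concept, not Philips' actual data model or implementation.

```python
# A minimal sketch of a three-layer PLM architecture, as described in
# the presentation. All names here are hypothetical illustrations.

from dataclasses import dataclass, field


@dataclass
class AuthoringItem:
    """Layer 1: discipline-specific design data (MCAD, ECAD, software)."""
    discipline: str   # e.g. "mechanical", "electrical", "software"
    identifier: str
    revision: str


@dataclass
class ProductDefinition:
    """Layer 2: the PDM layer, managing interdisciplinary collaboration."""
    product_id: str
    items: list[AuthoringItem] = field(default_factory=list)

    def integrate(self, item: AuthoringItem) -> None:
        # Interdisciplinary integration happens here: mechanical,
        # electrical and software items combine into one definition.
        self.items.append(item)

    def release(self) -> dict:
        # Only the released, consolidated definition is passed upwards.
        return {
            "product": self.product_id,
            "items": [(i.discipline, i.identifier, i.revision) for i in self.items],
        }


def publish_to_transactional_layer(released: dict) -> None:
    """Layer 3: transactional systems (e.g. ERP) consume the released
    product definition data; they do not author it."""
    print(f"ERP receives product definition: {released}")


# Example flow: authoring -> PDM -> transactional
pdm = ProductDefinition("scanner-001")
pdm.integrate(AuthoringItem("mechanical", "housing", "B"))
pdm.integrate(AuthoringItem("electrical", "mainboard", "C"))
pdm.integrate(AuthoringItem("software", "firmware", "1.4"))
publish_to_transactional_layer(pdm.release())
```

The point of the layering is that each layer only passes consolidated, released data upwards, so the transactional systems never see work-in-progress design data.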

Peter Thorne from Cambashi brought, in his session called Trends and Maximizing PLM Investments, an interesting visual historical review of engineering software investments, using Google Earth as the presentation layer. Impressive to see the trends visualized this way, and scary to see that Europe is not really a major area of investment and growth.

Keith Connolly explained in his session how S&C Electric integrated their PLM environment with ERP. Everything sounded so easy and rational, but as I have known the guys from S&C for a long time, I know it is the result of having a clear vision and working for many years towards implementing this vision.

Leon Lauritsen from Minerva gave a presentation around Open Source PLM, and he did an excellent job explaining where Open Source PLM could or should become attractive. Unfortunately his presentation quickly went in the direction of Open Source PLM equals Aras, and he continued with a demo of Aras capabilities. I would have preferred a longer presentation around the Open Source PLM business model instead of spending time looking at a product.

I believe Aras has huge potential, for sure in the mid-market and perhaps beyond, but I keep coming back to the experience I also have with SmarTeam: an open and easy-to-install PLM system with a lot of features is a risk in the hands of IT people with no focus on business. Without proper vision and guidance (coming from ?????) it will again become an IT project, cheaper to the outside world (as internal investments often are not so visible), but achieving the real PLM goals depends on how you implement.

After lunch we really reached the speed of light with David Widgren, who gave us insight into data management at CERN. Their challenge – somehow a single 'product', the accelerators and all their equipment, plus a long lifecycle (20 years of development before becoming operational), surviving all technologies and data formats – requires them to think all the time about pragmatic data storage and migration. In parallel, as the consumers of the data are not familiar with the complexity of IT systems, they build lots of specific interfaces for specific roles to provide the relevant information in a single environment. Knowing a lot of European funds go there, David is a good ambassador for CERN, explaining in a comical manner that he is working at the coolest place on Earth.

The last session I could attend was Roger Tempest's, around data management. Roger is a co-founder of the PLMIG (PLM Interest Group), and they strive for openness, standards and interoperability for PLM systems. I was disappointed by this session, as I was not able to connect to the content. Roger was presenting what seemed to be his axioms; I had the feeling he would come down from the stage with his ten commandments. I would be interested to understand where these definitions came from. Are they a common understanding, or are they just another set of definitions coming from another direction, and what is the value or message for existing customers using particular PLM software?

I missed the closing keynote session from John Unsworth from Bentley. I learned later this was also an interesting session, but I cannot comment on it.

My conclusion:
An inspiring event, both due to its organization and agenda, and thanks to the attendees who made it a real PLM-centric event. Cannot wait for 2012.


Like many people, I found that the quiet of the dark Christmas days and the various 2009 reviews give you a push to look back and reflect. What happened and what did not happen in 2009?

And what might happen in 2010?

Here are my thoughts related to:

 

ERP-related PLM vendors

Here I think mainly of Oracle and SAP. They have already identified PLM as an important component of a full enterprise solution and are further pushing their one-stop-shop approach. Where Oracle's offering is based on a set of acquired and to-be-integrated systems, SAP has been extending their offering with more focus on their own development.

If you are one of those companies that require PLM and believe all software should come from one vendor (besides Microsoft), it is hard to decide.

There might be real PLM knowledge in the Oracle organization as a result of the acquisitions, but is it easily accessible to you? Is it reflected in the company's strategy?
With SAP I am even more in doubt; here you might find more people with ERP blood who have learned the PLM talk. Maybe for that reason, I saw mostly Oracle as a PLM option in my environment and very few SAP opportunities for real PLM.

I assume in 2010 Oracle will push stronger and SAP will try harder.

 

CAD-related PLM vendors

In this group you find the major players: PTC, Siemens and Dassault Systems. Autodesk could be there too, but they refuse to do PLM and remain focused on design collaboration. All these PLM vendors are striving to get the PLM message to the mid-market. They have solutions for the enterprise, but to my feeling, most of the enterprises in the traditional, well-known PLM markets, like Automotive and Aerospace, are at a kind of standstill due to the economic and upcoming environmental crisis.

It is certain business will not be as usual anymore, but where will the sustainable future go? Here I believe answers will come from innovation and small mid-market companies. The bigger enterprises need time to react, so it will take time before we see new PLM activities in this area.

Therefore all PLM vendors move in directions outside engineering, like apparel, life sciences and consumer packaged goods. These industries do not rely on 3D CAD, but can still benefit from the key building blocks of PLM, like lifecycle management, program and portfolio management, and quality/compliance management. The challenge I see for the PLM vendors is: will these CAD-focused organizations be able to learn and adapt to other industries fast enough? And where does 3D fit – although Dassault has a unique vision here.

For the mid-market, the PLM vendors offer more OOTB (Out Of The Box) solutions, mostly based on limited capabilities or on commonly available Microsoft components like SharePoint and SQL Server. This is not so strange, as from my observation most smaller mid-market companies have not really made or understood the difference internally between document management and product data management, including the point that a Bill of Materials should not be managed in Excel.

I assume in 2010 the CAD-related PLM vendors will initially focus on the bigger enterprises and the new industries; the smaller mid-market companies require a different approach.

 

PLM-only vendors

This is an area I expect to disappear in the future, although it is also the area where interesting developments start to happen. We see open source PLM software coming up, with Aras leading, and we see companies coming up with PLM on-demand software, with Arena as the first company to sell this concept.


The fact that the traditional PLM-only vendors disappeared from this area (Eigner bought by Agile, Agile bought by Oracle, MatrixOne bought by Dassault Systems) indicates that the classical way of selling PLM-only was not profitable enough.

Either PLM needs to be integrated into companywide business processes (which I believe), or there will be PLM-only vendors that find a business model to stay alive.

Here I hope to see more clarity in 2010.

 

Smaller mid-market companies

What I have seen in the past year is that, despite the economic crisis, PLM investments by these companies remained active. Maybe not in purchasing many more licenses or implementing new PLM features; the main investments here were around optimizing or slightly extending the PLM base. Maybe because there was time to sit still and analyze what could be changed, or maybe it was planned but, due to work pressure, never executed before. Anyway, there was a lot of activity in this area, no less than in 2008.

An interesting challenge for these mid-market companies will be to remain attractive for the new generation. They are not used to the classical, structured ways of working that most of the current workforce is used to.
Social networking, social PLM – I have seen the thoughts, discussions and benefits, and I am still trying to see where it will become reality.

2010 is another chance.

 

Sustainability and going green

This is an area where I am a little disappointed, and this is perhaps not justified. I would have expected that, with the lessons learned around energy and the upcoming shortage of natural resources, companies would take the crisis as a reason to change.

From my observation, most of the companies I have seen are still trying to continue as usual, hoping that the traditional growth will come back. The climate conference in Copenhagen also showed that we, as human beings, do not feel pressured enough to adapt; by nature we are optimists (or boiling frogs).

Still, there are interesting developments – I assume in the next few years we will see innovation coming, probably first from smaller companies, as they have the flexibility to react. During the European Customer Conference in Paris, I heard Bernard Charles talking about the concept of a Bill of Energy (the energy needed to create, maintain and demolish a product). As PLM consultants we already have a hard time explaining the various views on a BOM to our customers; still, I like the concept, as a Bill of Energy makes products comparable.
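To show why a Bill of Energy would make products comparable, here is a toy sketch of the idea as I understood it – the phase names and numbers are invented for illustration, not an official definition:

```python
# A toy illustration of the Bill of Energy concept: the total energy
# needed to create, maintain and demolish a product. All numbers are
# invented; the concept, not the data, is the point.

def bill_of_energy(phases: dict[str, float]) -> float:
    """Total lifecycle energy in kWh, summed over the phases."""
    return sum(phases.values())

product_a = {"create": 1200.0, "maintain": 300.0, "demolish": 150.0}
product_b = {"create": 900.0, "maintain": 700.0, "demolish": 100.0}

# Unlike a BOM, which has many different views, a single energy total
# makes two alternative products directly comparable.
for name, phases in [("product A", product_a), ("product B", product_b)]:
    print(f"{name}: {bill_of_energy(phases):.0f} kWh")
```

Whatever the exact phases turn out to be, the attraction is that the result is a single number per product, where a BOM always needs a context (engineering, manufacturing, service) to be interpreted.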

2010: the acceptance of the Bill of Energy.

Here I want to conclude my post for this year. Thank you all for reading and sharing your thoughts and comments with this community. My ultimate conclusion for 2009 is that it was a good PLM year for the mid-market, better than expected, but the changes are going slowly. Too slowly – we will see next year.


The title of this post came to mind when looking back on some of the activities I was involved in during the past two weeks. I was discussing with several customers the progress or status of their current PLM implementation. One of the trends was that, although the IT department did its best to provide a good infrastructure for project- or product-related information, the users always found a reason why they could not use the system.

I believe the biggest challenge for every organization implementing PDM, and later PLM, is to get all users aligned to store their information in a central location and to share it with others. Only in this manner can a company achieve the goal of having a single version of the truth.

By single version of the truth I mean: if I look in the PLM system, I find there all the data needed to explain to me the exact status of a product or a project.
If it is not in the PLM system, it does not exist!

How many companies can make that statement?

If your company has not implemented the single version of the truth yet, you might be throwing away money and even putting your company at risk in the long term. Why? Let's look at some undisclosed examples I learned about in the past few weeks:

  • A company ordered 16 pumps which, on arrival, were not the correct ones – 1 M Euro lost
  • During installation at a drilling site the equipment did not fit and had many clashes – 20 M Dollar lost, due to rework and penalties
  • 7000 K Euro lost due to a wrong calculation based on the wrong information
  • A major bid lost due to a high price estimate, caused by lack of communication between the estimator and the engineering department
  • 500 K Euro penalty for delivering the wrong information (and too late)

All the above examples – and I am sure this is just the tip of what is happening around the world – were related to the power & process industry, where of course high-capital projects run and the losses might look small relative to the size of the projects.

But what was the source of all this? Users.

Although the companies were using a PLM system, in one company a user decided that some of the data should not be in the system but in his drawer, to assure proper usage (according to his statement – otherwise, when the data is publicly available, people might misuse it). Or was it false job security? In the end you lose your job through this behavior.
People should bring value through collaboration, not by sitting on knowledge.

Another frequently heard complaint is that users decide the PLM system is too complex for them and that it takes too much time to enter data. And as engineers have never been bothered by the kind of strict data management ERP users are used to working with, their complaints are echoed to the PLM implementer. The PLM implementer can spend a lot of time customizing or adapting the system to the users' needs.
But will it be enough? It is always subjective, and from my experience, the more you customize, the higher the future risks. What about upgrades or changes in the process?
And can we say NO to the next wish of this almighty user?

Is the PLM system to blame?

The PLM system is often seen as the enemy of the data creator, as it forces a user into a certain pattern. Excel is much easier to use: some home-made macros and the user feels everything is under control (as long as he is around).

Open Source PLM somehow seems to address this challenge, as it does not create the feeling that PLM vendors only make their money from complex, unneeded functionality. Everything is under the customer's own control; they decide if the system is good enough.

PLM On Demand has an even harder job to convince the unwilling user; therefore they also position themselves as easy to use, friend of the user and enemy of the software developer. But in the end it is all about users committing to share, and therefore adapting themselves to change.

So without qualifying the different types of PLM systems, for me it is clear that:

The first step is for all users in a company to realize that working together towards a single version of the truth for all product- or project-related data brings huge benefits. Remember the money lost due to errors because another version of the data existed somewhere. This is where the most ROI for PLM is reported.

The next step is to realize that it is a change process, and that by being open-minded towards change, whether motivated or pushed by the management, the change will make everyone's work more balanced – not in the first three months, but in the longer term.

Conclusion: Creating the single version of the truth for project or product data is required in any modern organization to remain competitive and profitable. Reaching this goal might not be easy for every person or company, but the rewards are high when reaching this very basic goal.

In the end it is about human contribution – not what the computer says.

As a consultant working with mid-market companies, I enjoyed reading this post from Al Dean and its related comments and posts. Although I must say Al's statement:

PLM+ are looking to solve this by creating a rich application that engages the user, provides ease of implementation and ongoing maintenance (by allowing the user/admin, rather than costly consultant) and can be delivered over the web, in an on-demand manner (which saves hardware and infrastructure cost)

was a trigger to react, as I am a consultant.

The base of every PDM/PLM

First, I believe the base of PLM and PDM is to agree inside your company that you share and centralize product data. This means not only files, but also Bills of Materials, issues, and so on.

Sharing and centralizing product data seems like an easy mission, and this is what all PLM software provides as a base. I assume PLM+ does the same, only they store the data in the cloud, like Arena.
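As a minimal sketch of what sharing and centralizing means in data terms – the record structure and names below are hypothetical, not the actual model of PLM+ or Arena – one central record owns the files, the Bill of Materials and the issues, so everyone queries the same source:

```python
# A minimal sketch of centralized product data: one central record per
# product, holding files, Bill of Materials and issues, instead of
# copies scattered over mailboxes and Excel sheets. The structure and
# field names are hypothetical, for illustration only.

central_store: dict[str, dict] = {}

def register_product(name: str) -> None:
    """Create the single central record everyone reads and writes."""
    central_store[name] = {"files": [], "bom": [], "issues": []}

def add_bom_line(product: str, part: str, qty: int) -> None:
    # Engineering updates the BOM in the central record,
    # not in a private Excel copy.
    central_store[product]["bom"].append({"part": part, "qty": qty})

register_product("pump-16")
central_store["pump-16"]["files"].append("pump-16-assembly.step")
add_bom_line("pump-16", "impeller-A", 1)
central_store["pump-16"]["issues"].append("seal supplier change pending")

# Purchasing, sales and suppliers all query the same record,
# so there is one version of the truth.
print(central_store["pump-16"])
```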

Sharing data is not a natural process in all companies, as there is always the culture of sharing the minimum and keeping the rest to prove your own value – the bigger the company, the more this happens. This is human nature, and it differs case by case. Making people share data is an area where either the management has to push or, in very small companies, a power user. In larger companies, often an external consultant does this job, in the role of an 'outsider' who can moderate and explain the benefits for all, instead of the threats.

This is what consultants really do;  they do not install or administer systems.

PLM solutions can vary in the way they make sharing of data available. Some solutions are very rigid in the data model they offer, but most of the necessary entities and attributes are there. They are based on best practices and target the 80%. Often this is good enough, if the customer has no alternative and has the power to enforce the system as their platform for sharing data.

More flexible PLM solutions have an advantage, and at the same time this is their disadvantage. They can be extended beyond the 80% scenario, and both the implementer and the customer will be challenged to reach 100% satisfaction. However, we all know from the 80-20 rule that this is where it gets complicated:

80% of the project is done in 20% of the time; or, in other words, you spend 80% of your time (and budget) reaching the last 20%.

Once a common platform for sharing all product-related information has been reached, for me the real PLM starts. This is where a company would implement processes that streamline the product development or delivery process – it requires a cross-departmental change.

And this stage is often where a consultant comes in. It is very rare that, in mid-market companies, management reserves time and resources to come up with a strategic plan to implement PLM – I wrote about this in an older post. PLM requires a change in the way the company currently works.

So I am curious to learn how PLM+ and other On-Demand PLM software companies will try to address this step, as change is needed and someone has to push for it.

In my last three consecutive posts, I wrote about who decides on PLM in mid-market companies (a generalization from 15 years of experience). There I claim that the selection of a PLM system is subjective, very much based on personal relations with the mid-market company. Again, I wonder how PLM+ will address this in their business model, as there is a need for someone to push.

Open Source PLM software has somewhat similar challenges. You need the drive from inside the customer to agree on sharing product data and, next, to extend it. This is where the traditional PDM and PLM vendors push their business through direct contacts. Of course, Open Source PLM providers focus on extending the platform after the initial installation, with a consultative and service model.

 

Conclusion: As every PLM provider in the end needs revenue to make a living, I am looking forward to seeing where On-Demand PLM will go and find its place. What will be the business model that makes people buy and creates the change?
