
Everyone wants to be a game changer, yet in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.

Why? Read the following analogy.

1974

With my Dutch roots and passion for soccer, I saw the first example of game changing in 1974, in soccer: the game where 22 players kick a ball from side to side, and the Germans win in the last minute.

My passion and trauma started that year, when the Dutch national team changed soccer tactics by introducing totaalvoetbal.

The Dutch team, at that time coached by Rinus Michels and with star player Johan Cruyff, played this concept to perfection.

Defenders could play as forwards and the other way around. Combined with the offside trap, the Dutch team reached the World Cup final both in 1974 and 1978. Of course, it lost the final in both cases to the home team (Germany in '74, Argentina in '78, with some help from the referee, we believe).

This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept that quickly.


At the same time, a game changer for business was emerging in 1974: the PC.

In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not yet common, and the first mouse device and the Intel 8008 processor were just coming to the market.

This was disruptive innovation at that time, as we would realize 20 years later. The PC was a game changer for business.

2006

Johan Cruyff remained a game changer, and when he started to coach and influence the Barcelona team, it was his playing concept, tiki-taka, that brought the Spanish national team and Barcelona to the highest, almost unbeatable level in the world for the past eight years.

Instead of having strong and tall players to force yourself towards the goal, it was all about possession and control of the ball. As long as you have the ball, the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to bridge when trying to recapture it.

This was a game changer, hard to copy overnight, until the past two years. Now other national and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.

Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.

PLM, too, was supposed to be a game changer in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game?

PLM at that time connected departments and disciplines with each other in a digital manner, no matter where they were around the globe. And since the information was stored in centralized places, databases and file-sharing vaults, it created the illusion that everyone was working on the same sets of data.

The major successes of PLM in this approach come from efficiency through digitization of the data exchange between departments and digitization of processes. This is already a significant step forward, bringing enough benefits to justify a PLM implementation.

Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, the game change is all about different behavior to reach the goal; it is not about better tools (or shoes).

The PLM picture shows the ideal 2006 situation: each department forwards information to the next. But where was PLM supporting after sales/services in 2006? The connection between after sales/services and concept is, in most companies, not formalized or even existing. Yet exactly that connection should deliver the feedback from the market and the field needed to build better products.

The real game changer starts when people learn and understand how to share data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so that other people can find and use it.

People are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making it sharable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.

But our jobs are no longer safe, as we can see in the declining economies of Europe and the US. And the reason for that:

Data is changing the game

In recent years the discussion about BI (Business Intelligence) and Big Data emerged. There is more and more digital information available, and it has become impossible for companies to own all the data, or even to think about storing it themselves and sharing it across their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can (theoretically) be shared no matter where you are and no matter which device you are using, there is huge potential to change the game.

It is a game changer because it is not just about installing new tools and new software. There are two major mind shifts to make:

  • It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100 % digital, your customer or supplier may still require a printed and wet-signed document or drawing as legal confirmation of the transaction. Documents are comfortable containers to share, but they kill fast and accurate processing of the data inside them.
  • It is about sharing and combining data. It does not make sense to dump data into huge databases again. The value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, others by purchasing, and some come directly from the supplier (see the sketch below). Do not fall into the ERP trap of believing that everything needs to be in one system and controlled by one organization.
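
To make the sharing idea concrete, here is a minimal sketch in Python (with hypothetical part numbers, attribute names and values) of one shared part definition to which several disciplines contribute their own attributes, instead of each department keeping a private copy:

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    value: object
    owner: str  # discipline that maintains this attribute

@dataclass
class PartDefinition:
    part_number: str
    attributes: dict = field(default_factory=dict)

    def set_attribute(self, name, value, owner):
        # Only the owning discipline may update its own attributes;
        # every other discipline reads the shared definition.
        current = self.attributes.get(name)
        if current is not None and current.owner != owner:
            raise PermissionError(f"'{name}' is owned by {current.owner}")
        self.attributes[name] = Attribute(name, value, owner)

# One shared part definition, contributed to by several disciplines
part = PartDefinition("P-100234")
part.set_attribute("material", "AlSi10Mg", owner="engineering")
part.set_attribute("unit_cost", 12.40, owner="purchasing")
part.set_attribute("lead_time_days", 21, owner="supplier")
```

The point is not the implementation but the ownership model it illustrates: one shared definition, several contributing disciplines, and no duplicated copies per department or per system.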

Because of the availability of data, the world has become global and more transparent for companies. And what you see is that traditional companies in Europe and the US struggle with that. Their current practices are not tuned to a digital world, but to the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.

The emerging economies have two major advantages:

  • Not so much legacy; therefore, building a digital enterprise is easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
  • The average cost of labor is lower than in Europe and the US; therefore, even if they do not get it right the first time, there is enough margin to spend more resources to meet the objectives.

The diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it no longer has a 100 % classical PLM scope. It is also about social interaction, supplier execution and logistics. These are not classical PLM domains, and therefore I have mentioned in the past that the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different?

The big consultancy firms are all addressing this topic, not necessarily at the PLM level:

2012  Cap Gemini – The Digital advantage: …..

2013  Accenture – Dealing with digital technology’s disruptive impact on the workforce

2014  McKinsey – Why every leader should care about digitization and disruptive innovation

For CEOs it is important to understand that the new, upcoming generations are already thinking in data (generation Y and beyond). By nature, they are used to sharing data instead of owning data in many respects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce

This cannot be learned from an ivory tower. The easiest path is not to worry about this trend and to continue working as before, slowly losing business and margin year by year.

In many businesses people are fired for making big mistakes; unfortunately, doing nothing is most of the time not considered a big mistake, although it is the biggest one.

During the upcoming PI Conference in Berlin I will talk about this topic in more detail, and I look forward to meeting and discussing this trend with those of you who can participate.

The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the number of players and balls involved can be unlimited.

A final observation:
In my younger days I celebrated many soccer championships; still, I am not famous as a soccer player.

Why ?

Because the leagues I played in were always limited in scope: by age, locally, regionally, etc. Therefore it was easy to win within a certain scope, and there are millions of soccer champions besides me. For business, however, there are almost no borders.

Global competition will require real champions to make it work !!!

The last month I have not been able to publish much about my experiences, as I have been in the middle of several PLM selection processes for various industries. Now, in a quiet moment looking back, I understand it is difficult for a company to choose a PLM solution for the future.

I hope this post will bring some clarity and may lead to further discussion with other experts in the audience. I wrote about the do's and don'ts of PLM selection in 2010, and most of it is still valid; however, there is more. Some of the topics explained:

Do you really need PLM ?


This is where it starts. PLM is not Haarlemmerolie, an old Dutch medicine that has been sold as a cure for everything since the 17th century. The first step is knowing what you want to achieve and how you aim to achieve it. Just because a competitor has a PLM system installed does not mean they use it properly, or that your company should do the same. If you do not know why your company needs PLM, stop reading and start investigating.

….

If you are still reading, you are part of the happy few, as justifying the need for PLM is not easy. Numerous companies have purchased a PLM system just because they thought they needed PLM, or because someone was convinced that this software would bring PLM.

In most of these cases there was confusion with PDM. Simply stated: PDM is more a departmental tool (engineering, multidisciplinary), whereas PLM is a mix of software and infrastructure to connect all departments in a company and support the product through its entire lifecycle.

Implementing “real” PLM is a business change, as people have to start sharing data instead of pushing documents from department to department. And this business transformation is a journey. It is not a fun journey, nicely characterized in Ed Lopategui’s blog post, the PLM Trail.

Although I believe it is not always that dramatic, Ed set the expectations right. Be well prepared before you start.

Why do companies still want PLM when it is so difficult to implement?

The main reason is to remain competitive. If margins are under pressure, you can try to be more efficient and get better and faster tools. But by working in the old way, you can only become a little better.

Moving from a sequential, information-pushing approach towards an online, global information-sharing way of working is a change in business processes. It requires interaction between all stakeholders. Doing things differently requires courage, understanding and trust that you made the right choice. When it goes wrong, there are enough people around you pointing fingers at why it went wrong; hindsight is so easy.

Doing nothing and becoming less and less competitive is easier (the boiling frog again), as in that case the outside world will be blamed and there is nobody to point fingers at (although if you understand the issue, you should make the organization aware that its future is at stake).

Why is PLM so expensive?

Assuming you are still reading, and you and your management agree there is a need for PLM, a first investigation into possible solutions will reveal that PLM is not cheap.

When you calculate the overall investment required for PLM, management often gets discouraged by the estimated costs. Yes, the benefits are much higher, but to realize these benefits you need a clear understanding of your own business and a realistic idea of how the future should look. The benefits are not in efficiency. The main benefits come from capabilities that allow you to respond better and faster than by just optimizing your departments. I recently read a clarifying post addressing this issue: Why PLM should be on every Executive's agenda!

From my experience with PLM projects, it is surprising to learn that companies do not object to spending 5 to 20 times more money on an ERP implementation. It is related to the topic of management by results versus management by means.

PLM is not expensive compared to other enterprise systems. It can become expensive (like ERP implementations) if you lose control. Software vendors have a business in selling software modules, just as car dealers have a business in selling you all the comfort beyond the basics.

The same holds for implementation partners: they have a business in selling services to your company, and they need to find the balance between making money and delivering explainable value. Squeezing your implementation partner will cause a poor delivery. But giving them a blank check means that, at a certain moment, someone will stand up and shut down the money drain because the results are no longer justifiable. I often meet companies in this stage; the spirit is gone. It is all about the balance between costs and benefits.


This happens in all enterprise software projects, and the only cure is investing in your own people. Give your employees the time and the priority to work on a PLM project. People with knowledge of the business are essential, and you need IT resources to implement. Do not make the mistake of leaving the business uncommitted to the PLM implementation. Management and middle management often do not take the time to understand PLM, as they are too busy, or not educated or interested.

Make business owners accountable for the PLM implementation. You will see stress (it is not their daily job; they are busy), but in the longer term you will see understanding and readiness in the organization to achieve the expected results.

We are the largest – why select the largest ?

When your assignment is to select a new enterprise system, life could be easy for you: select a product or service from the largest vendor and your career is safe. Nobody gets blamed for selecting the largest vendor, although if you work for a small or mid-sized company, you might think twice.

Many vendors and implementers start their message with:
“…. Market leader in ABC, thought leader in XYZ, recognized by 123”

The only thing you should learn from this message is that the company has probably delivered a trustworthy solution in the past. Looking at the past, you get an impression of its readiness and robustness for the future. Many promising companies have been absorbed by larger ones and disappeared. As Clayton Christensen wrote in The Innovator's Dilemma:
"What goes up does not go down."
Meaning: these large companies focus on their largest clients and pay less attention to the base of the business pyramid (where the majority is), making them vulnerable to disruptive innovation.
Related to this issue, there is an interesting recent post (and its comments) by Oleg Shilovitsky: How many PLM vendors disappear in disruption predicted by Gartner.

My observation: the world of PLM is not in a moment of sudden disruption right now.

Still, when selecting a PLM vendor it is essential to know whether they have the scale to support you in the future and the vision to guide you there.

The future of PLM lies in managing data in a connected manner, not necessarily coming from a single database and not necessarily using only structured data. If your PLM vendor or implementer is pushing you towards plain document and file management, they are years behind and not the best choice for your future.

PLM is a big elephant

PLM is considered a big elephant, and I agree, if you try to address everything PLM can do in one shot. PLM has multiple directions to start from; I wrote about it in: PLM at risk – it does not have a single job.

PLM has a huge advantage compared to transactional systems like ERP and probably CRM: you can implement a PLM infrastructure and its functionality step by step in the organization, starting with the areas that are essential and produce clear benefits. That is the main reason PLM implementations can take two to three years; you give the organization time to learn, to adapt and to extend.

Do we lose our flexibility?

Nobody in an organization likes to be pushed into a corporate way of working, which by definition is not as enjoyable and as flexible as the way you currently work. This is still an area where PLM implementations can improve: provide the user with an environment that does not feel like a rigid system. You see this problem with old, traditional, large PLM implementations, for example at automotive OEMs. For them, it is almost impossible to switch to a new PLM implementation, as everything has been built and connected in such a proprietary way that moving to more standard systems and technologies is nearly impossible. Newer PLM implementations should learn from these lessons.

PLM vendor A says PLM vendor B will be out of business

One of the things I personally dislike is FUD (Fear, Uncertainty and Doubt). It has become common practice in politics, and I have seen PLM vendors and implementers use the same tactics. The problem with FUD is that it works: even if the message is not verifiable, the company looking for a PLM system might think there must be some truth in the statement.

My recommendation to a company that encounters FUD during a PLM selection process: be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain why they are the perfect solution; instead they tell you, indirectly, that they are the least bad.

Is the future in the cloud ?

I think there are two different worlds. There is the world of smaller businesses that do not want to invest in an IT infrastructure and will try anything that looks promising, often tool-oriented. This is one of my generalizations of how US businesses work (sorry for that). They will start working with cloud-based systems and not be scared off by performance, scalability or security concerns, as long as everything is easy and does not disturb the business too much.

Larger organizations, especially those domiciled in Europe, are not embracing cloud solutions at this moment. They think more in terms of private or on-premise environments and less in terms of cloud solutions, as security of information is still an issue. The NSA revelations prove there is no moral limit for gathering information in the name of security; combined with the fear of IP theft from Asia, I think European companies have a natural resistance to storing data outside their control.

For sure you will see cloud advocates, primarily coming from the US, claiming this is the future (and they are right), but there is still work to do and confidence to be built.

Conclusion:

PLM selection often focuses on checking hundreds of requirements coming from different departments; they want a dream system. I hope this post has convinced you that there are many other considerations relevant to a PLM selection that you should take into account. And yes, you still need requirements (and a vision).

Your thoughts ?

“Confused? You won’t be after this episode of Soap.”

Who does not remember this tagline from the first official Soap series starting in 1977 and released in the Netherlands in 1979?

Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, alien abduction, criminality, homosexuality and more.

Each episode ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.

For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.

What does it have to do with PLM?

Soap is about strange people doing weird things (I do not want to be more specific). Recently I noticed that this happens even in the PLM bloggers' world. Two of my favorite blogs demonstrated some of this weird behavior.

First, Steve Ammann, in his Zero Wait-State blog post "A PLM junkie at sea point-solutions versus comprehensive", mentioned sailing from Ventura, CA to Cabo San Lucas, Mexico on a 35-foot sailboat and thinking about PLM during his night shift. My favorite quote:

Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information. Yes only a PLM “junkie” would think about PLM on a sailing trip and maybe this is why the Mexican coast guard was suspicious.

Second, Oleg, in his doomsday blog post The End of PLM Communism, was thinking about PLM all weekend. My favorite quote:

I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.

It demonstrates that the best thinking is done outside office hours and in casual locations. Knowing this from my long weekend cycling tours, I know it is true.
I must confess that I have PLM thoughts while cycling.

Perhaps the best thinking happens outside an office?

I leave the follow up on this observation to my favorite Dutch psychologist Diederik Stapel, who apparently is out of office too.

Now back to serious PLM

Both posts touch the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post. He believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance in making the decision for either a point solution or an integrated solution.

And I am aligned with what Steve is proposing.

Oleg comes from a different background, and in his current position he believes more in a distributed or network approach. He looks at PLM vendors and implementations and their centralized approach through the eyes of someone who knows the former Soviet Union's way of thinking: "centralize and control".

The association with communism was probably not the best choice, judging by the comments. Still, the association makes you think: as the former Soviet Union no longer exists, what about former PLM implementations and the future? According to Oleg, PLM implementations should focus more on distributed systems (on the cloud?), working and interacting together, connecting data and processes.

And I am aligned with what Oleg is proposing.

Confused? You won't be after reading my recent experience.

I have been involved in the discussion around the best possible solution for an EPC contractor (Engineering Procurement Construction) in the Oil & Gas industry. The characteristics of their business are different from those of standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected because of their knowledge, their price, their price, their price, quality and time to deliver.

This means an EPC contractor focuses on execution, making sure they have the best tools for each discipline; this is the way they are organized and used to working. The downside of this approach is that everyone works on their own island and there is no knowledge capitalization or sharing of information. The result is that each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people ….. and they retire at a certain moment.

So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard, and it is relatively mature as a neutral standard for exchanging information between disciplines. The ideal world of best-in-class tools communicating with each other, or not?

Imagine there are six discipline tools, among them an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages, and a knowledge environment capitalizing on lessons learned, standards and best practices.

This results in 6 tools and 12 interfaces to a common service bus connecting them: 12 interfaces, because each application needs to send information to and receive information from the service bus. Each tool will also hold redundant data for its own execution.


What happens if a PLM provider could offer three of these tools on a common platform? This would result in 4 tools to install and only 8 interfaces, as the small calculation below illustrates. The functionality on the common PLM platform does not require data redundancy but shares common information, and will therefore provide better performance in cross-discipline scenarios.
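
A minimal sketch of that interface arithmetic, assuming (as above) one send and one receive interface per tool towards the service bus:

```python
def bus_interfaces(tool_count: int) -> int:
    # One send and one receive interface per tool towards the common service bus
    return tool_count * 2

print(bus_interfaces(6))  # 12 interfaces: six separate discipline tools
print(bus_interfaces(4))  # 8 interfaces: three tools merged onto one PLM platform
```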

In the ultimate world, all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100 % optimized system for a group of independent companies working together. Suppliers will not give up their environment and their own IP to embed it in a customer's ideal environment. So there is always a compromise to find between the best integrated platform (optimal performance, reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).

And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM you might even reach 100 % coverage for your own purposes, although modern society is more about connecting information where possible.

MY CONCLUSION after reading both posts:

  • Oleg tries to provoke, and like a soap, you might end up confused after each episode.
  • Steve in his post gives a common sense guidance, useful if you spend time on digesting it, not a soap.

Now I hope you are no longer confused, and I wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:

  • Will Aras survive 21-12-2012 and support the Next generation ?
  • Will Autodesk get off the cloud or have a coming out ?
  • Will Dassault get more Experienced ?
  • Will Oracle PLM customers understand it is not a database ?
  • Will PTC get out of the CAD jail  and receive $ 200 ?
  • Will SAP PLM be really 3D  and user friendly ?
  • Will Siemens PLM become a DIN or ISO standard ?

See the next episodes of my PLM blog in 2013


It is interesting to read management books and articles and reflect on their content in the context of PLM. In my previous post, How the brain blocks PLM acceptance, and in Stephen Porter's (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.

This summer holiday I was reading The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is an associate professor at the Harvard Business School, and he published this book back in 1997. Apparently not everyone has read it, and I recommend that you do if you are involved in the management of a PLM company.

Sustaining technology

Christensen states that there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing heavily in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till ….

Disruptive technology

The other type Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope and cannot serve the customer in the same way. The technology underperforms if applied to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than that of the sustaining technology, their paths can meet at a certain point. And although it comes from a different set of capabilities, thanks to its faster improvement the disruptive technology becomes the leading one, and the companies that introduced it become the new market leaders.

Why leading companies failed..

Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following these dynamics. Later he illustrates the concepts with examples from other industries where leading firms failed and ceased to exist because disruptive technologies overtook them and they were not able to follow that path.

Although leading companies have enough resources and skills, he illustrates that it is a kind of logical path: big companies will always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as they initially target a different, unclear market, and in addition it is not clear where the value of the disruptive technology will come from. So which manager wants to risk his or her career on something uncertain in an existing company?

Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major equity position in it. Let this company focus on its disruptive technology, and in case it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost sure to fail; quote:

Expecting achievement-driven employees in a large organization to devote a critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one's arms in an effort to fly.

As the book was written in 1997, it was not in the context of PLM. Now let's start with some questions.

Is ERP in the stage of sustaining technology?

Here I would say yes. ERP vendors are extending their functional reach to cover more than the core functionality for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies Christensen identifies four stages: customers select a product for its functionality; when other vendors offer the same functionality, reliability becomes the main differentiator; after reliability the next phase is convenience, and finally price.
From my personal observations, not through research, I would assume that ERP for the major vendors is in the phase between convenience and price. If we follow Christensen's analysis, for SAP and Oracle this means they should not try to develop disruptive technologies inside their organization, nor should they try to downscale their product for the mid-market or add a different business model. Quote:

"What goes up does not go down." Moving to a high-end market is possible (and usually the target); they will not go to small, poorly defined low-end markets.

How long SAP and Oracle remain market leaders will depend on disruptive technologies meeting the path of the ERP vendors and generating a new wave. I am not aware of any trends in that area, as I do not follow the world of ERP closely.

Is PLM in the stage of sustaining technology?

Here I would say no, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions about what a PLM system should provide as core technologies. This makes it hard to measure PLM along the lifecycle of a sustaining technology with the phases functionality, reliability, convenience and price.

Where the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability and convenience, others focus on convenience and price.

Some generalized thoughts crossed my mind:

  • DS and PTC somehow provoke their customers by launching new directions they believe the customer will benefit from. This makes it hard to call it purely sustaining technology.
  • Siemens claims to develop its products based on what customers are asking for. According to Christensen, they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
  • All three focus on the high end and should not aim for smaller markets with the same technology. This justifies within DS the existence of CATIA and SolidWorks, and within Siemens the existence of NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up for others.

 

Disruptive technologies for PLM

Although PLM is, in my opinion, not yet a sustaining technology, there are some disruptive technologies that might come into the picture for mainstream PLM.

First of all, there is the open source software model, introduced by Aras, which initially is not considered a serious threat by the classical PLM players ("big customers will never rely on open source"). However, the open source model allows product improvements to move faster than the mainstream, reaching at a certain point the same level of functionality, reliability and convenience. The risk for open source PLM is that it is customer driven, which according to Christensen is the major inhibitor of disruptive steps in the future.

Next there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk is operating from a comfort zone: it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and medium-sized market. Not too many barriers to entering the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM a bad label for its cloud-based product? Time will tell.

The management of Kenesto has obviously read the book. Although the initial concept came from PLM++ (a bad marketing name), they do not compete with mainstream PLM and aim their product at a different audience: business process automation. Then, if their product picks up in the engineering/product domain, it might enter the PLM domain in a disruptive manner (all according to the book: they would become market leaders).

Finally, there are Search Based Applications, which are also a disruptive technology for the PLM domain. Many companies struggle with the structured data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; validation and formalization are often done in the brains of experienced employees. Why can search-based technology not be used to create structured data and replace or support the experienced brain?

If I open my Facebook page, I see new content related to where I am and what I have been saying or searching for. Imagine an employee's desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from the structured system in the background, other data might be displayed based on logical search criteria; the way our brain works. Some startups are working in this direction, and Inforbix (congratulations Oleg & team) has already been acquired by Autodesk, as was Exalead by DS.

If they believe in the above concept, both companies should remain independent from the big parent company as long as possible, as according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.

Conclusion
This blog post was written during a relaxing holiday in Greece. The country is in a crisis; it needs disruptive politicians. They managed it 3500 years ago, and I noticed the environment is perfect for thinking, as you can see below.

Meanwhile, I am looking forward to your thoughts on PLM: in which state we are, and what the disruptive technologies are.


Sorry for the provocative title for a PLM blog, but otherwise you would not read my post till the end.

In the past months I have been working closely with several large companies (so not with a mid-market profile). And although they are all in different industries and have different business strategies, they still share these common questions and remarks:

  • How to handle more and more digital data and use it as valuable information inside the company or for their customers/consumers?
  • What to do with legacy data (approved in the previous century) and legacy people (matured and graduated in the previous century) that prevent them from changing?
  • We are dreaming of a new future, where information is always up to date and easy to access; will this ever happen?
  • They are in the automotive industry, manufacturing, infrastructure development and maintenance, plant engineering, construction and plant maintenance
  • They all want data to be managed with (almost) zero effort
  • And please, no revolution or change for the company

Although I have been focusing on the mid-market, it is these bigger enterprises that introduce new trends, and as you can see from the observations above, there is a need for change. But it also looks like the demands contradict each other.


I believe it is just about changing the game.

If you look at the picture to the left, you see one of the contradictions that lead to PLM.

Increasing product quality, reducing time to market and meanwhile reducing costs seemed to be a contradiction at that time too.

Change ?

Although PLM has not been implemented (yet) in every company that could benefit from it, it looks like the bigger enterprises are looking for more.

The P in PLM becomes vague: it is no longer only the product that is in focus, it is also the whole context around the product that might influence it, which they want to take into consideration.

The L in PLM remains: they still want to connect all information related to the lifecycle of their products or plants.

The M for Management has a bad association: companies believe that moving from their current state towards a managed data environment is a burden. Too much overhead is the excuse not to manage data, and their existing environments for managing data do not excel in user-friendliness. Therefore people jump to Excel.

Next

So if the P is no longer relevant and the M is a burden, what remains of PLM?

In early June I presented, at the Dassault Systèmes 3DExperience forum, the topic of digital Asset Lifecycle Management for owners/operators: one of the areas where I believe PLM systems can contribute a lot to increased business value and profitability (quality and revenue; see Using a PLM system for Asset Lifecycle Management).

Attending the keynote speech, it was clear that Dassault Systèmes no longer talks about PLM as its vision. Their future dream is a (3D) lifelike experience of the virtual world and, based on that virtual model, implementing the best solution according to various parameters: revenue, sustainability, safety and more. By managing the virtual world, you have the option to avoid costly real prototypes or damaging mistakes.

I believe it is an ambitious dream but it fits in the above observations. There is more beyond PLM.

In addition, I learned from talking with my peers (the corridor meetings) that Siemens and PTC are also moving towards a more industry- or process-oriented approach, trying to avoid the association with the generic PLM label.

Just as Autodesk and the mid-market start to endorse PLM, the big three are moving away from the acronym.

This reminds me of what happened in the eighties when 3D CAD was introduced. By the time the mid-market was able to move to mainstream 3D (the price/performance ratio changed dramatically), the major enterprises had started to focus on PDM and PLM. So it is logical that the mid-market is 10 to 15 years behind new developments; they cannot afford to experiment with new trends.

So let's see what the new trends are:

  • The management of structured and unstructured data in a single platform. We see the rise of Search Based Applications and business intelligence based on search and semantic algorithms. Using these capabilities integrated with a structured (PLM?) environment is the next big thing.
  • Apps instead of generic applications that support many roles. Generic applications introduce so much complexity in the interface that they become hard to use for a casual user. Most enterprise systems, but also advanced CAD or simulation tools with thousands of options, suffer from this complexity. Would it not be nice if you only had to work with a few dedicated apps, as we do in our private lives?
  • Dashboards (BI) that can be created on the fly, representing actual data and trends based on structured and unstructured data. It reminds me of a PLM/ERP discussion I had with a company, where the general manager kept describing the types of dashboards he wanted to see. He did not talk about PLM, ERP or other systems; he wanted online visibility.
  • Cloud services are coming. Not necessarily centralizing all data in the cloud to reduce cost, but look at SIRE and other cloud services that support a user with data and remote processing power at the moment it is required.
  • Visual navigation through a lightweight 3D model, providing information when required. This trend is not that recent, but so far it has not been integrated with other disciplines; the Google Maps approach for 3D.

So how likely are these trends to change enterprise systems like PLM, ERP or CRM? In the table below I indicate where each could apply:

(Table: new trends versus enterprise systems)

As you can see, the PLM row has all the reasons to introduce new technologies and change the paradigm. For that reason, combined with the observations I mentioned at the beginning, I am sure a new TLA (Three Letter Acronym) is coming.

The good news is that PLM is dynamic and on the move. The bad news for potential PLM users is that the confusion remains (too many different PLM definitions and approaches at the moment), so what will be the next thing after PLM?

Conclusion: The acronym PLM is not dead and is becoming mainstream. At the high end there is for sure a trend towards a wider and different perspective on what was initially called PLM. After EDM, TDM, PDM and PLM, we are waiting for the next TLA.


May 24th, 2008 was the date I published my first blog post as the Virtual Dutchman, aiming to share PLM-related topics for the mid-market.

I tried to stay away from technology and function/feature debates and, based on my day-to-day observations, to describe the human side of PLM: what people do and why. All from a personal perspective, and always open to discuss and learn more.

Looking back and reviewing my 86 posts and 233 comments so far, I would like to share a summary around some of the main topics in my blog.

PLM

In 2008, PLM awareness was much lower; at that time this was one of my reasons to start blogging. There was still a need to explain that PLM was a business strategy needed besides ERP and PDM.

PLM brings more efficiency and brings better-quality, innovative products to the market, thanks to better collaboration between teams and departments.

At that time the big three, Dassault Systèmes, Siemens and PTC, were all offering a very CAD-centric, complex approach to PLM. There was no real mid-market offering, although their marketing organizations tried to sell as if one existed. Express, Velocity, ProductPoint: where are these offerings now?

Now, in 2012, PLM awareness is established, as everyone is talking about (their interpretation of) PLM. And with Autodesk, a company that knows how to serve the mid-market, also acknowledging there is a need for PLM in its customer base, the term PLM has become widespread.

The new PLM providers focus on a disconnect between PDM and PLM, as in particular the handling of enterprise data outside the PDM scope is a white space for many mid-market companies that need to operate on a global platform.

PLM & ERP

In the relationship between PLM and ERP, I have not seen a big change in the past four years. The two dominant ERP-originated vendors, SAP and Oracle, were already paying attention to PLM in 2008 in their marketing and portfolio approach.

However, in my perception their PLM offerings have not moved much forward. SAP is selling ERP, and yes, there is a PLM module, and Oracle has PLM systems, but I have not seen a real targeted PLM campaign explaining the need for and value of PLM integrated with ERP.

Historically, ERP is the main IT system and gets all the management attention. PLM is considered more as something for engineering (and gets less focus and budget). Understanding PLM and how it connects to ERP remains a point of attention, and the crucial point of interaction is the manufacturing BOM and the place where it is defined. The two most-read posts on my blog are Where is the MBOM? and Bill of Materials for Dummies – ETO, indicating there is a lot of discussion around this topic.

I am happy to announce here that in October this year during PLM Innovation US, I will present and share my thoughts in more detail with the audience, hoping for good discussions

New trends

There are three new trends that have become clearer over the past four years.

The first one to mention is the rise of Search Based Applications (SBA). Where PLM systems require structured and controlled data, search-based applications assist the user by "discovering" data anywhere in the organization, often in legacy systems or possibly in modern communication tools.

I believe companies that develop an integrated concept of PLM and SBA can benefit the most. PLM and ERP vendors should think about combining these two approaches in an integrated offering. I wrote about this combined topic in my post: Social Media and PLM explained for Dummies

The second trend is the cloud. Where two to three years ago social media combined with PLM was the hype, a must for product innovation and collaboration, currently the cloud is in focus.

Mainly driven by and coming from the US, where the big marketing engine of Autodesk is making sure it is on the agenda of mid-market companies.

In Europe there is less of a hype at this moment; different countries and many languages to support, plus discussions around security, get the upper hand here.

For me, a cloud solution certainly lowers the threshold for mid-market companies to start implementing PLM. However, how do you make the change in your company? It is not only an IT offering. As in the similar discussion around open source PLM, there is still a need to provide the knowledge and the change push inside a company to implement PLM correctly. Who will provide these skills?

The third trend is the applicability of PLM systems outside the classical manufacturing industries.

I have been writing about the use of PLM systems for owners/operators and the civil/construction industry, where the PLM system becomes the place to store all plant-related information, connected to assets and with status handling. Currently I am participating in several projects in these new areas, and the results are promising.

People and Change

I believe PLM requires a change in an organization, not only from the IT perspective but, more importantly, in the way people work in the organization and in the new processes they require.

The change is in sharing information, making it visible and useful for others in order to be more efficient and better informed to make the right decisions much faster.

This is a global trend, and you cannot stay away from it. Keeping data locked within your reach might provide job security, but in the long term it kills all jobs in the company, as competitiveness is gone.

The major task here lies with the management that should be able to understand and execute a vision that is beyond their comfort zone. I wrote about this topic in my series around PLM 2.0

Modern companies with a new generation of workers will have fewer challenges with this change, and I will try to support the change with arguments and experiences from the field.

Audience

Since February this year, WordPress provides many more statistics, and of particular interest is the map below, indicating in which countries my blog is read. As you can see, there are only a few places left on earth where PLM is not studied. Good news!

(Map: blog readers per country)

Although most of my observations come from working in Europe, it is the US that provides the most readers (30 %) , followed by India (9 %) and on the third place the UK (6 %).

This might be related to the fact that I write my blog in English  (not in 100 % native English as someone commented once).

It makes me look forward to being in Atlanta in October for the PLM Innovation US conference, to meet face to face with many of my blog readers and share experiences.

Conclusion

Reading back my posts since 2008 demonstrated to me that the world of PLM is not a static environment. It is even so dynamic that some of the posts I wrote in the early days have become obsolete.

At the end of 2008 I predicted the future of PLM in 2050 – here we are on the right track.

There is still enough blogging to do without falling into repetition, and I am looking forward to your opinions, feedback and topics to discuss.

 

The trigger for this post was a discussion I had around the Autodesk 360 cloud-based PLM solution. To position this solution and to simplify the message for my conversation partner, Joe the plumber, I told him: "You can compare the solution with Excel online. As many small mid-market companies are working with metadata (not CAD files) in Excel, the simplified game changer with this cloud-based PLM offering is that the metadata is now in the cloud, much easier to access, and only a single version exists."

(Sorry, Autodesk, if I simplified it too much, but sometimes your conversation partner does not have an IT background; they are a plumber.)


Interestingly enough, Joe said: "But what is the difference with Google Docs or SharePoint, where I can centralize my Excel files too? And Google Docs is like a cloud solution, right?"

He was right, and I had to go more in depth to explain the difference. This part of the conversation was similar to discussions I have had in meetings with owners/operators in the civil and energy sectors, discussing the benefits of PLM practices for their industry.

I wrote about this in previous posts:

Using a PLM system for asset lifecycle management requires a vision

PLM practices for the engineering / construction industry

The trouble with dumb documents

Here it was even more a key point of the discussion that most of the legacy data is stored in dumb documents. And the main reason dumb documents are used is that the data needs to be available during the long lifecycle of the plant, application-independent if possible. So in the previous century this was paper, later scanned documents (TIFF, PDF) and currently mainly PDF. Most of the data is now digital, but where is the intelligence?

The challenge these companies have is that, although the information is now stored in digital files, the next step is how to deal with that information in an intelligent manner. A document or an Excel file is a collection of information, you might even call it knowledge, but to get access to the knowledge you first need to find it.

Did you ever try to find a specific document in Google Docs or SharePoint? The conclusion will be that the file name becomes very important, and perhaps some keywords?

Is search the solution ?

To overcome this problem, full-text search and search-based applications were developed, which allow us to index and search inside the documents (see the sketch below). A piece of cake for Google, and a niche for others to index not only standard documents but also more technical data (drawings, scans of P&IDs, etc.).
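
As a rough illustration of why indexing alone is not the full answer, here is a naive full-text index sketch in Python (the document names and contents are made up). It finds every document containing a word, but it cannot tell you which one is the right one:

```python
from collections import defaultdict

def build_index(documents):
    """Naive full-text index: word -> set of document names containing it."""
    index = defaultdict(set)
    for name, text in documents.items():
        for word in text.lower().split():
            index[word].add(name)
    return index

docs = {
    "pump datasheet rev A": "centrifugal pump 50 m3/h stainless steel",
    "pump datasheet rev C": "centrifugal pump 55 m3/h stainless steel",
    "maintenance report 2011": "pump seal replaced after bearing failure",
}
index = build_index(docs)
print(index["pump"])  # all three documents match: search finds the data,
                      # but cannot tell you which revision is the right one
```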

Does this solve the problem ?

Partly, as suddenly the user finds a lot more data. Search on Google for the words "right data" and you get 3,760,000,000 hits (or more). But what is the right data? The user can only decide what the right data is by understanding the context.

  • Is it the latest version ?
  • Is it reflecting the change we made at that functional position ?
  • What has changed ?


And here comes the need for more intelligent data. And this is typically where a PLM system provides the answer.

A PLM system is able to manage different types of information, not only documents. In the context of a plant or a building, the PLM system would also contain:

  • a functional definition / structure (linked to its requirements)
  • a logical definition / structure  (how is it supposed to be ?)
  • a physical definition / structure (what is physically there ?)
  • a location definition / structure  (where in the plant / building ?)

and this is all version-managed and related to the supporting documents and other types of information. This brings context to the documents and therefore exposes knowledge, as the sketch below illustrates.
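
A minimal sketch, with hypothetical tags and names, of how one managed item can live in several of these structures at once and give context to its documents:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Document:
    name: str
    revision: str

@dataclass
class PlantItem:
    # One managed object, placed in several structures at the same time
    tag: str                                 # functional position, e.g. "P-4711"
    logical: str                             # intended design, e.g. "centrifugal pump, 50 m3/h"
    physical: Optional[str] = None           # serial number of what is actually installed
    location: Optional[str] = None           # where in the plant / building
    documents: list = field(default_factory=list)

item = PlantItem(tag="P-4711", logical="centrifugal pump, 50 m3/h")
item.physical = "SN-88231"                   # the installed asset
item.location = "Unit 300 / pump skid 2"     # its place in the plant
item.documents.append(Document("datasheet P-4711", revision="C"))

# A document found through search now carries context: which functional
# position, which installed asset and which location it belongs to.
```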

As there is no automatic switch from dumb documents towards intelligent data, moving towards this vision will be a gradual process. I see a major role for search-based applications in supporting data discovery: find a lot of information, but then have the capability to capture the result (or generate a digest of it) and store it connected to your PLM system, where it is managed in the future and provides the context.

Conclusion: We understand that paper documents are out of date. Merely moving these documents to digital files stored in a central location, whether in SharePoint or cloud-based storage, is a step we will regret ten years from now, as intelligent data is not only what is inside the digital files, but also depends on its context.

Over the past three weeks I had time to observe some PLM vendors' marketing messages (Autodesk being the major newbie). Some of these messages led to discussions in blogs or (LinkedIn) forums. Always a good moment to smile and think about reality.

In addition, the sessions from PLM Innovation 2012 became available to the attendees (thanks MarketKey, good quality), and I had the chance to see the sessions I missed. On my wish list was "The future of PLM Business Models", moderated by Oleg, as, according to Oleg, some interesting viewpoints came up there. This relates to my post in which I mentioned the various definitions of PLM.

All the above inspired me to write this post, which made me realize we keep on pushing misconceptions around PLM into our customers’ minds, with the main goal of differentiating.

I will address the following four misconceptions. The last one is probably not a surprise, which is why it comes last. Still, it is sometimes taken for granted.

  1. PLM = PLM
  2. On the cloud = Open and Upgradeable
  3. Data = Process Support
  4. Marketing = Reality

1. PLM = PLM

It is interesting to observe that the definition of PLM becomes more and more a marketing term instead of a common definition which applies to all.

Let me try to formulate again a very generic definition that captures most of what PLM vendors aim to do.

PLM is about connecting and sharing the company’s intellectual property through the whole product lifecycle. This includes knowledge created at the concept phase going through the whole lifecycle till a product is serviced in the field or decommissioned.

Experiences from the field (services / customers / market input) serve again for the other lifecycle phases as input to deliver a better or innovative product.

Innovation is an iterative process. PLM is not only about storing data; it also covers the processes of managing that data, especially the change processes. Sharing data is not easy. It requires a different mindset: data is not only created for personal or departmental usage, it should also be found and extended by other roles in the organization. This all makes PLM a serious implementation, as aligning people is a business change, not an IT-driven approach.

This (too long) high-level PLM definition does not imply that you need a PLM system to do PLM. You might also have a collection of tools that together cover the complete set of PLM needs.

Oleg talks about DIY (Do It Yourself) PLM, and I have seen examples of Excel spreadsheets managing Excel spreadsheets and email archives. The challenge I see with this type of PLM implementation is that after several years it is extremely difficult for a company to change. Possible reasons: the initial gurus no longer work for the company, and new employees need years of experience to find and interpret the right data.

A quick and simple solution can become a burden in the long term if you analyze the possible risks.

Where in the early years of PLM it was mainly a Dassault Systèmes, Siemens and PTC driven approach with deep CAD integrations, in later years other companies like Aras and now Autodesk started to shift the focus from classical PLM towards managing enterprise metadata. SAP PLM offers a similar approach. Deep CAD integrations are the most complex parts of PLM, and by avoiding them you can claim your system is easier to implement, etc., etc.

A single version of the truth is a fancy PLM expression. It would be nice if this were also valid for the definition of PLM. The PLM Innovation 2012 session on the future of PLM business models demonstrated that the vendors in this panel discussion had completely different opinions about PLM. So how can people inside their company explain to the management and others why they need PLM, and which PLM they have in mind ?

2. On the cloud = Open and Upgradeable

During the panel discussion, Grant Rochelle from Autodesk mentioned the simplicity of their software and how easily it will be upgradeable in the future. He also referred to Salesforce.com as a proof point: they provide online updates of the software, without the customer having to do anything.

The above statement is true as long as you keep your business coverage simple and do not anticipate changes in the future. Let me share with you an analogy with SmarTeam and how it started in 1995.

At that time SmarTeam was insanely configurable. The Data Model Wizard contained several PDM templates, and within hours you could create a company-specific data model. A non-IT-skilled person could add attributes, data types, anything they wanted, and build the application – almost the same as Autodesk 360. The only difference: SmarTeam was not on the cloud, but it was running on Windows, a revolution at that time as all serious PDM systems were Unix-based.

The complexity came however when SmarTeam started to integrate deeply with CAD systems. These integrations created the need for a more standardized data model per CAD system. And as the SmarTeam R&D was not aware of each and every customer’s implementation, it became hard to define a common business logic in the data (and to remain easily upgradable).

I foresee similar issues with the new cloud-based PLM systems. They seem very easy to implement (add what you want – it is easy). As long as you do not integrate with other systems, it remains safe. Integrating with other and future systems requires either a common data definition (which most vendors do not like) or specific integrations, with the associated cost of upgrading.

In the beginning everything is always possible with a well-defined system. But be aware: looking back in history, every ten years a disruptive wave comes in, changing the scope and the upgradability.

And to challenge the cloud-based PLM vendors: in the generic definition of PLM that I shared above, PLM also integrates design data.

3. Data = Process Support

Another misconception, which originates from the beginning of PLM, is the idea that once your system supports a specific type of data, you also support the related process.

First example: items defined in ERP. When engineers started to use a PDM system and started to define a new item, there were challenges. I had many discussions with IT departments stating that they did not need or want items in PDM. ERP was the source for an item, and when a designer needed a new item, (s)he had to create it in ERP. So we have a single definition of the item.

Or the designer had to request a new item number from the ERP system. And please do not request numbers too often, as we do not want to waste them, was the message.

Ten years later this looks like a joke, as most companies have an integrated PDM/ERP process and understand that the initial definition of a new item comes from PDM, and at a certain stage the matured item is shared with (and completed by) the ERP system. It is clear that the most efficient manner to create a new item is through PLM, as the virtual definition (specs / CAD data) also resides there and information is handled in that context.
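A minimal sketch of such a status-based handover, assuming a simple release rule (status names and the rule are hypothetical, not from a specific PDM or ERP system):

```python
from enum import Enum

# Minimal sketch of a status-based PDM-to-ERP handover.
# Status names and the release rule are hypothetical.

class ItemStatus(Enum):
    IN_WORK = "in work"      # item created and maturing in PDM/PLM
    RELEASED = "released"    # design frozen, ready to share with ERP
    OBSOLETE = "obsolete"

class Item:
    def __init__(self, number, description):
        self.number = number
        self.description = description
        self.status = ItemStatus.IN_WORK

def share_with_erp(item, erp_items):
    """Push a released item to ERP, where it gets completed with
    manufacturing data (supplier, cost, lead time, ...)."""
    if item.status is not ItemStatus.RELEASED:
        raise ValueError(f"{item.number} is not released yet")
    erp_items[item.number] = {"description": item.description}

erp = {}
pump = Item("P-1024", "Centrifugal pump")
pump.status = ItemStatus.RELEASED   # the design matured in PLM
share_with_erp(pump, erp)           # only now does the item appear in ERP
```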

A second, more current example is the fact that compliance is often handled in ERP. It is correct that when you manufacture a product for a specific target market, you need to have the compliance information available.

However, would you do this in your ERP system, where you are late (almost at the end of the design lifecycle), or is it more logical to verify and check compliance at all times during your design stages ? The process works much more efficiently, and with a lower cost of change, when done in PLM, but most companies still see ERP as their primary IT system and PLM as an engineering tool.
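As a minimal sketch, assuming a simple restricted-substances list per target market (the lists and item data below are invented for illustration, not taken from any regulation), an early compliance check during design could look like this:

```python
# Hypothetical example: check a BOM against restricted substances for a target
# market while still in design, instead of discovering the issue later in ERP.

RESTRICTED = {"EU": {"lead", "cadmium"}, "US": {"lead"}}   # invented lists

bom = [
    {"item": "P-1024", "substances": {"steel", "cadmium"}},
    {"item": "V-2001", "substances": {"brass"}},
]

def non_compliant_items(bom, market):
    """Return the BOM items containing a substance restricted in the target market."""
    restricted = RESTRICTED.get(market, set())
    return [line["item"] for line in bom if line["substances"] & restricted]

print(non_compliant_items(bom, "EU"))   # ['P-1024'] – flagged while still in design
```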

Finally, on this topic, a remark to the simplified PLM vendors: the ability to store, for example, requirements in your system does not mean you support a complete requirements management process. It is also about the change and validation of requirements, which should be integrated with the relevant roles during product definition (often CAD) and validation. As long as the data is disconnected, there is not such a big advantage compared to Excel.

4. Marketing = Reality

In the Future of PLM Business Models session, Oleg showed a slide with the functional architectures of the major PLM vendors. In the diagram everything seems to be connected as a single system, but in reality this is usually not the case.

As certain components / technologies are acquired, they provide the process coverage, and only in the future can you imagine them working as an integrated whole. You cannot blame marketing for doing so, as their role is to position their products in the most appealing way so customers will buy them. Without marketing, perhaps no one would buy a PLM system once they understood the details :-)

Autodesk, as a newcomer in PLM, has a strong background in marketing. This is understandable: similar to Microsoft, their main revenue comes from selling a large volume of products, where the classical PLM vendors often combine products with services and business change – and therefore have a different price point.

When in the eighties Autodesk introduced AutoCAD, it was a simple, open 2D CAD environment, able to run on a PC. Autodesk’s statement at that time: “We provide 80 percent of the functionality for 20 % of the price”.
Does this sound familiar nowadays ?

As AutoCAD was a basic platform allowing customers and resellers to build their solutions on top of it, it became Autodesk’s mid-market success.

With Autodesk PLM 360, although the same logic seems to make sense, I believe the challenge is not in the flexible platform. The challenge lies in the future, when people want to do more complex things with the system, like integration with design or enterprise collaboration.

At that time you need people who can specify the change, guide the change and implement the change. And this is usually not a DIY job.

Autodesk is still learning to find the right PLM messages, I noticed recently. When attending the Autodesk PLM session during PLM Innovation 2012 (end of February), one of their launching customers, ElectronVault, presented their implementation – it took only two weeks !!! Incredible.

However, reading Rob Cohee’s blog post at the end of March, he mentions ElectronVault again. Quote:

ElectronVault was searching for something like this for over two years and after 6 weeks they have implemented Project Management, EBOM, MBOM, and starting on their APQP project. Six Weeks!!!

As you see, four weeks later the incredible two weeks have become six weeks, and again everything is implemented. Still incredible, and I am looking forward to meeting ElectronVault in the future, as I believe they are a typical young company that will go through all the maturity phases a company goes through: people, processes and tools (in this order). A tool-driven implementation is more likely to slow down in the long term.

Conclusion: Misconceptions are not new. History can teach us a lot about what we experience now. New technology and new concepts can be a breakthrough. However, implementing them in companies requires organizational change, and this has been the biggest challenge for the past 100 years.

It has been silent from my side for the past two months and more. I was extremely busy and sometimes surprised to see the amount of posts some of my colleagues could produce, with Oleg as the unbeaten number one. During this busy period I was able to observe some interesting trends, listed below:

Going Social

Social media and PLM is one of the favorite topics for both bloggers and some PLM vendors at this moment. New products for community-based collaboration or social engineering are promoted, combined with discussions and statements about how the new workforce (Generation Y) should get challenging jobs without having to do the ‘old boring stuff’.

True arguments to initiate a change in the way we work. And I agree, most current PLM systems are not ‘intelligent’ enough to support engineers in a friendly manner. However, is there an alternative at this moment ? Below is a commercial (in Dutch) making the point that older workers are still required for quality.

I discussed the relation between PLM and social media some time ago in my post Social Media and PLM explained for dummies. In addition, my observations from the field give me the feeling that in most companies management is still dominated by the older generation, and most of the time they decide on the tools that will be used. No Generation X and Y at this moment. Therefore I do not see a quick jump to social media integrated with PLM – yes, the vision is there – but the readiness of most companies is not yet there.

Cloud

PLM and cloud are also more and more mentioned by PLM vendors as a new solution, especially for the mid-market. And with an optimistic mind you can indeed believe that with a low investment (pay per use) mid-market companies can do their PLM online. But why are existing online PLM systems (Arena / PLMplus / and the major PLM vendors) not booming at this time ? I believe there are two key reasons for that:

  1. Implementing PLM is not equal to installing a system. PLM is a vision to be implemented using a system. And the difficulty is that a vision does not map one-to-one to the functions and features of a product vendor. There is a need for a driving force inside the company that will support the business change. Where are the consultants and advocates (with experience) for this type of solution ?
  2. There is still a reluctance to store intellectual property somewhere online in a cloud, without direct control and ownership of the data. Mid-market companies are not known for choosing solutions ahead of the masses. In this type of company, cloud-based CAD tools might be an entry point, but all product data – no way, they say.

 

 

PLM or ERP

Before even talking about new technologies or fundamentals for PLM, I see the biggest challenge for PLM is still to get recognition as the system for product knowledge (IP) and innovation. In too many companies ERP rules, and PLM is considered a way to capture CAD and engineering data. The main PLM vendors are not addressing this challenge – they neglect ERP (yes, we can connect). And ERP vendors like SAP and Oracle are not known for their PLM messages and strategy (yes, we have PLM). As ERP is historically often the major IT system, there is often the (wrong) opinion that everything should be built and based on one system.


In some publications I have seen the Swiss Army knife as an example of the multi-functional system with all tools embedded. My question remains – who wants to work only with a Swiss Army knife when doing professional work ?

I like to have the right tools to do my job

The most important topic on my blog over the past three years has been the manufacturing BOM – where should it be, and where is the MBOM currently ?

Sweden – a reality check

Last week I attended the DS PLM forum in Gothenburg to present the vision of using a PLM system as the backbone for plant information management for owners/operators and how ENOVIA supports this vision.

But I also learned that Sweden is (one of) the most innovative countries (I need to verify the criteria used, but can imagine there is a source of truth). What impressed me more were the presentations from Staffan Lundgren of Sony Ericsson, with the title “Living in a world of change – balancing stability and flexibility”, and Magnus Olsson of Volvo Cars, with the title “Driving operational excellence in a challenging business environment”. Both companies are well known for their robust image. From both speakers you could experience that they are not so worried about Generation Y; their success depends on a clear vision and a will to go there. And this basic drive is often missing – PLM is not a product you buy after which business continues as usual.

Conclusion

PLM vendors made a lot of noise the past months (events / product announcements), and customers might get the impression that technology and software (or the price of software) are the main criteria for successful PLM. Although these are not unimportant, I would focus on the vision and on assuring this vision is understood and accepted by the company.

Am I old fashioned ?

In many PLM communities you see discussions and statements that social media capabilities will have a significant impact on the way we perform product lifecycle management. In this post I will give my thoughts without going in depth into specific products. At the bottom of this post you will find some links to posts that contributed to my opinion (the usual suspects).

PLM

Let’s first establish a baseline: why do we want Product Lifecycle Management ? In general, PLM is a vision to bring products to the market faster, with better quality, while being innovative and more customer focused.

This vision can be implemented through best practices, like a standardized, global, staged New Product Introduction process, an enterprise-wide engineering change process, integration between different disciplines to work globally on a single repository for product definition, and many more depending on your industry.

Two notions pop up here: a single repository for product definition, and global. This is where the technology comes in. Thanks to improved worldwide (internet / WAN) connectivity, the capabilities are there to communicate efficiently and reliably around a single repository for product definition, all around the globe. The world became a global workplace and we are all connected. This improved connectivity enables PLM vendors (and others) to promote the “single version of the truth” concept.

Single Version of the truth

The idea behind it is that, if you store everything in a single database, you will always find the right information. This idea was developed at a time when the world was not yet global and people were thinking more locally.

Some of the major ERP vendors also push this concept. If you store all your data in their single platform, there is no redundancy of data, so you can always be confident about the content – that is the message. This concept is often embraced by IT departments, as having one single platform or one single system for all enterprise data sounds efficient. (I call it Swiss Army knife thinking – it does everything – but would you use it for professional work ?)

As long as we are talking about explicit data and local activities, this concept seems to prove itself. However, a lot of informal activities exist around the product development process, and these activities are not managed.

In 2009 I participated in a COE automotive panel, where one of the attendees in the audience stood up and accused the PLM vendors of making their life so complex. He said:

“If we have an issue on the shop floor with production, we just gather the right people and solve the issue – no need to fill in screens of data that PLM or ERP systems require. And if there was a customer with a problem, we send a service engineer and the problem is fixed”

Of course, if his company were a global company, it would be impossible to gather everyone around the shop floor, or to visit the customer site and solve it, in a reasonable time.

To avoid missing information, PLM and ERP systems try to collect as much information as possible, to have the best possible single version of the truth. Through immersive integrations and clever business logic, the user only has to enter a minimum of information, and only once. However, the users complain it is still too much and way too complex (and still not enough information is captured).

My conclusion so far: the single version of the truth is a concept to collect the majority of product-related data, while missing all the informal communication that is exchanged while talking (and later through email and more modern means of communication).

Extending the single version of the truth 

The single version of the truth is an important implementation concept for PLM, and it already presents a major challenge for companies. Imagine that all the information you produce is stored in such a manner that everyone authorized can see it, and in such a manner that someone else (and not only you) can understand it. This is an important change process, often overlooked by IT-driven PLM implementations.


PLM vendors focus on the tools, providing a platform to capture and share product data all around the product lifecycle. Not easy either, as development is often based on generic functionality, not optimized for a specific user role.

To understand the context of the shared data, you would like to hear or rewind the discussions people had at that time, as from these discussions a lot of implicit information can be retrieved. But no one wants to enter implicit information into a PLM system.

And then going global

Continuing with product development towards a global operation, and addressing the communication around product development, is the next logical step. As it is much more difficult to communicate directly with everyone around the world, it is obvious that social media come into the picture here.

Initially, people were using email to exchange ideas with other people around the world. This created new problems – sometimes huge attachments of unmanaged data, sometimes important information in the mailboxes of only a few people while relevant for many. Email works against the concept of a single version of the truth, as people were creating their own non-public archives of product discussions. Before the Web 2.0 revolution, PLM vendors provided email integrations or embedded email functionality in their PLM systems. Still, there are millions of email databases (Lotus Notes / Exchange) with product data, development data and more, not visible to many.

The 2.0 change

A trend we see with social media is that your ‘old’ email system becomes more of a notification system, telling you that you got a direct message through one of your social platforms (RSS feeds, LinkedIn, Facebook, blogs, etc.). Of course, if you are connected to all these platforms online all the time, you do not need a notification system anymore; you are connected to whom you want, when you want – a little bit back to the old days. If you enter a social environment, you are a participant in the communication going on – you can look around or participate.

The benefit of social collaboration is that it does not push you into a structure of managed data. And it provides you with online communication with everyone in your selected community. The downside with the well-known social media is that it is not clear, nor secure, what happens with the information shared on these platforms. Will it be sold to special interest groups ? Can it be found through a search engine ?

For product-related social collaboration we obviously need more secure communities to collaborate in. And as for this collaboration you do not always know who is inside your trusted company network, it is clear that cloud-based solutions come in as a logical technical infrastructure. Of course these communities must be as easily accessible as the popular social media and well integrated in the user’s environment.

 

Global single version of the truth ?

Combining the single version of the truth concept with the loose communication and content of social media brings us to the challenge for current PLM consultants, vendors and implementers. All the information collected in current social platforms is hard to find or interpret. The embedded search functions in these communities are not designed for clever searches – you need to know the context. And then the question pops up whether this is all the information that ever existed. You do not know, as you do not see the full picture.

Managing all this data in our PLM system in the old-fashioned way won’t work either. The capturing, classification and linking of data would be too much overhead for the reluctant user. We do not want to manage and justify each action we take (although these were the knowledge management concepts of the 90s).

Extended search

And this is where extended search comes in for PLM. The extended search is the glue between the single version of the truth and all the unmanaged data around the product in communities. Integrating this in a single environment is the challenge for PLM vendors (and less important for ERP vendors).

Why mainly for PLM vendors ? The main reason is that in product development a lot of iteration, knowledge building, searching and capturing of data takes place. The more you know and understand, the better you are able to make adequate decisions, understanding the right context during your development process. And for that you need the formal and informal information, globally available across disciplines, companies and countries.

I see two major requirements for these extended search capabilities. 

  1. First of all, the search engine should understand your context and skip irrelevant discussions and posts from your social media. I have seen how this could be done. In 2007, Yedda, an Israeli startup later bought by AOL, came with an intelligent question-and-answer engine. The concept behind the engine was that it mapped the context of a question to your profile and to the community you belonged to, and compared it with the profiles of other people in the community. The more you asked and the more you answered, the clearer your context became. In addition, your answers were rated by others, and the more thumbs up you got, the more value you provided to the community – saving your boss the appraisals.
  2. Search engines should provide you with a faceted search to drill down on the search results. As you do not know exactly what you are looking for, some keywords should do, and then, as a next step, based on your context and the search context, you should be able to drill down to the information you are looking for. This information should now give you a better understanding of the context of your product – is it versioned ? Is it the latest ? Leave this to PLM. A minimal sketch of such a faceted drill-down follows after this list.
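To illustrate the drill-down idea, here is a minimal faceted-search sketch in Python; the facet names and the result set are hypothetical, not taken from any particular search engine:

```python
from collections import Counter

# Hypothetical search results, each already tagged with a few facets.
results = [
    {"doc": "P-1024-rev-A.pdf", "discipline": "piping", "status": "superseded"},
    {"doc": "P-1024-rev-B.pdf", "discipline": "piping", "status": "released"},
    {"doc": "layout-area3.dwg", "discipline": "layout", "status": "released"},
]

def facets(results, keys=("discipline", "status")):
    """Count the values per facet, so the user sees how the result set can be narrowed."""
    return {key: Counter(r[key] for r in results) for key in keys}

def drill_down(results, **selected):
    """Keep only the results matching the selected facet values."""
    return [r for r in results if all(r[k] == v for k, v in selected.items())]

print(facets(results))
print(drill_down(results, discipline="piping", status="released"))
```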

 

My conclusion:

Classic PLM (single version of the truth) and social media capabilities (easy collaboration in communities), combined with an extended search engine, are the mandatory capabilities for a modern global PLM implementation, where both formal and informal data are managed in the context of a product.

Links for reference:

Tech-clarity white paper

Social Product Innovation

A hyper social organization for plm dummies

Atos Origin abandoning email
