Who does not remember this tagline from the first official Soap series, which started in 1977 and was released in the Netherlands in 1979?
Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, alien abduction, criminality, homosexuality and more.
Each episode ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.
For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.
What does this have to do with PLM?
Soap has to do with strange people doing weird things (I do not want to be more specific). Recently I noticed that this is happening even in the PLM bloggers' world. Two of my favorite blogs demonstrated something of this weird behavior.
First, Steve Ammann in his Zero Wait-State blog post A PLM junkie at sea – point-solutions versus comprehensive mentioned sailing from Ventura, CA to Cabo San Lucas, Mexico on a 35-foot sailboat and starting to think about PLM during his night shift. My favorite quote:
Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information.
Yes, only a PLM "junkie" would think about PLM on a sailing trip, and maybe this is why the Mexican coast guard was suspicious.
Second, Oleg in his doomsday blog post The End of PLM Communism was thinking about PLM all weekend. My favorite quote:
I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.
It demonstrates that the best thinking is done outside office hours and in casual locations. Knowing this from my long weekend cycling tours, I know it is true.
I must confess that I have PLM thoughts while cycling.
Perhaps the best thinking happens outside an office?
I leave the follow-up on this observation to my favorite Dutch psychologist Diederik Stapel, who apparently is out of office too.
Both posts touch on the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post: he believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance for deciding between a point solution and an integrated solution.
And I am aligned with what Steve is proposing.
Oleg is coming from a different background and in his current position he believes more in a distributed or network approach. He looks at PLM vendors/implementations and their centralized approach through the eyes of someone who knows the former Soviet Union way of thinking: “Centralize and control”.
The association with communism was probably not the best choice, as you can see when you read the comments. Still, it makes you think: as the former Soviet Union does not exist anymore, what about former PLM implementations and the future? According to Oleg, PLM implementations should focus more on distributed systems (on the cloud?), working and interacting together, connecting data and processes.
And I am aligned with what Oleg is proposing.
Confused? You won't be after reading my recent experience.
I have been involved in the discussion around the best possible solution for an EPC (Engineering Procurement Construction) contractor in the Oil & Gas industry. The characteristics of their business differ from those of standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected because of their knowledge, their price, their price, their price, quality and time to deliver.
This means an EPC contractor focuses on execution, making sure they have the best tools for each discipline, and this is the way they are organized and used to working. The downside of this approach is that everyone works on their own island and there is no knowledge capitalization or sharing of information. The result is that each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people….. and they retire at a certain moment.
So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard. This standard is relatively mature and can serve as the neutral standard for exchanging information between disciplines. The ideal world of best-in-class tools communicating with each other, or not?
Imagine there are six discipline tools, among them an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages, and finally a knowledge environment capitalizing lessons learned, standards and best practices.
This results in 6 tools and 12 interfaces to a common service bus connecting these tools: 12 interfaces because, per application, information needs to be sent to and received from the service bus. Each tool will hold redundant data for its own execution.
What happens if a PLM provider could offer three of these tools on a common platform? This would result in 4 tools to install and only 8 interfaces. The functionality in the common PLM system does not require data redundancy but shares common information, and therefore will provide better performance in a cross-discipline scenario.
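The interface arithmetic can be sketched in a few lines. This is an illustrative model only, assuming each tool needs exactly one send and one receive interface to the service bus:

```python
# Illustrative model of the interface count: every tool connected to a
# common service bus needs two interfaces (one to send, one to receive).
def bus_interfaces(tool_count: int) -> int:
    return 2 * tool_count

# Six best-in-class discipline tools on the bus:
assert bus_interfaces(6) == 12

# Consolidating three tools on one shared PLM platform leaves
# 3 standalone tools + 1 platform = 4 connected systems:
assert bus_interfaces(6 - 3 + 1) == 8
```

The point of the sketch: consolidation reduces the number of interfaces linearly, and on top of that it removes the redundant copies of data that each standalone tool keeps for its own execution.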
In the ultimate world all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100% optimized system for a group of independent companies working together. Suppliers will not give up their environment and own IP to embed it in a customer's ideal environment. So there is always a compromise to find between the best integrated platform (optimal performance, reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).
And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM you might even reach 100% coverage for your own purposes, although modern society is more about connecting information where possible.
MY CONCLUSION after reading both posts:
- Oleg tries to provoke, and like a soap, you might end up confused after each episode.
- Steve gives common-sense guidance in his post, useful if you spend time digesting it; not a soap.
Now I hope you are no longer confused, and I wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:
- Will Aras survive 21-12-2012 and support the Next generation?
- Will Autodesk get off the cloud or have a coming out?
- Will Dassault get more Experienced?
- Will Oracle PLM customers understand it is not a database?
- Will PTC get out of the CAD jail and receive $200?
- Will SAP PLM be really 3D and user friendly?
- Will Siemens PLM become a DIN or ISO standard?
See the next episodes of my PLM blog in 2013.
It is interesting to read management books and articles and reflect on their content in the context of PLM. In my previous post How the brain blocks PLM acceptance and in Stephen Porter's (not yet finished) series The PLM state: the 7 habits of highly effective PLM adoption, you can discover obvious points that we tend to forget in the scope of PLM, as we are so focused on our discipline.
This summer holiday I was reading The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen. Christensen is an associate professor at the Harvard Business School and published this book back in 1997. Apparently not everyone has read it, and I recommend reading it if you are involved in the management of a PLM company.
Christensen states there are two types of technologies. Leading companies support their customers and try to serve them better and better by investing a lot in improving their current products. Christensen calls this sustaining technology, as the aim is to improve existing products. Sustaining technologies require ever more effort to improve current product performance and capabilities, due to the chosen technology and solution concepts. These leading companies are all geared up around this delivery process, and resources are optimized to sustain leadership, till….
The other technology Christensen describes is disruptive technology, which initially is not considered competition for existing technologies, as it underperforms in the same scope and therefore cannot serve the customer in the same way. The technology underperforms if applied to the same market, but it has unique capabilities that make it fit for another market. Next, if the improvement path of the disruptive technology is faster than the improvement path of the sustaining technology, their paths may meet at a certain point. And although it comes from a different set of capabilities, thanks to the faster improvement process the disruptive technology becomes the leading one, and the companies that introduced it become the new market leaders.
Why leading companies failed
Christensen used the disk drive industry as an example, as there the change in technology was so fast that it was a perfect industry for following its dynamics. Later he illustrates the concepts with examples from other industries where the leading firms failed and ceased to exist because disruptive technologies overtook them and they were not able to follow that path.
Although the leading companies have enough resources and skills, he illustrates that there is a kind of logical path: big companies will always fail, as it is in their nature to focus on sustaining technology. Disruptive technologies do not get any attention, as in the beginning they target a different, unclear market, and in addition it is not clear where the value from the disruptive technology will come from. So which manager in an existing company wants to risk his or her career by focusing on something uncertain?
Christensen therefore advises these leading companies, if they expect certain technologies to become disruptive for their business, to start a separate company and take a major share position there. Let this company focus on its disruptive technology and, in case it is successful and crosses the path of the sustaining technology, embed it again in your organization. Any other approach is almost sure to fail. Quote:
Expecting achievement-driven employees in a large organization to devote a critical mass of resources, attention and energy to disruptive projects targeted at a small market is equivalent to flapping one's arms in an effort to fly
As the book was written in 1997, it was not in the context of PLM. Now let's start with some questions.
Is ERP in the stage of sustaining technology?
Here I would say yes. ERP vendors are extending their functional reach to cover more than the core functionality, for two reasons: they need continuous growth in revenue, and their customers ask for more functionality around the core. For sustaining technologies Christensen identifies four stages: customers select a product for functionality; when other vendors offer the same functionality, reliability becomes the main differentiator; after reliability the next phase is convenience, and finally price.
From my personal observations, not through research, I would assume ERP for the major vendors is in the phase between convenience and price. If we follow Christensen's analysis for SAP and Oracle, it means they should not try to develop disruptive technologies inside their organizations, nor should they try to downscale their product for the mid-market or add a different business model. Quote:
What goes up – does not go down. Moving to a high-end market is possible (and usually the target) – they will not go to small, poorly defined low-end markets
How long SAP and Oracle will remain market leaders depends on disruptive technologies that will meet the path of the ERP vendors and generate a new wave. I am not aware of any trends in that area, as I am not following the world of ERP closely.
Is PLM in the stage of sustaining technology?
Here I would say no, because I am not sure what to consider a clear definition of PLM. Different vendors have different opinions on what a PLM system should provide as core technology. This makes it hard to measure PLM along the lifecycle of a sustaining technology with its phases: functionality, reliability, convenience and price.
Where the three dominant PLM providers (DS/PTC/Siemens) battle in the areas of functionality, reliability and convenience, others focus on convenience and price.
Some generalized thoughts passed my mind:
- DS and PTC somehow provoke their customers by launching new directions they believe the customer will benefit from. This makes it hard to call their offering sustaining technology.
- Siemens claims to develop their products based on what customers are asking for. According to Christensen they are at risk in the long term, as customers keep you captive and do not lead you to disruptive technologies.
- All three focus on the high-end market and should not aim for smaller markets with the same technology. This justifies within DS the existence of CATIA and SolidWorks, and within Siemens the existence of NX and Solid Edge. Unifying them would mean the end of their mid-market revenue and open it up for others.
Disruptive technologies for PLM
Although PLM is not a sustaining technology in my opinion, there are some disruptive technologies that might come into the picture of mainstream PLM.
First of all there is the Open Source software model, introduced by Aras, which initially is not considered a serious threat by the classical PLM players – "big customers will never rely on open source". However, the Open Source model allows product improvements to move faster than the mainstream, reaching at a certain point the same level of functionality, reliability and convenience. The risk for Open Source PLM is that it is customer driven, which according to Christensen is the major inhibitor of disruptive steps in the future.
Next there is the cloud. Autodesk PLM and Kenesto are the two most visible companies in this domain related to PLM. Autodesk is operating from a comfort zone – it labels its product PLM, it does not try to match what the major PLM vendors do, and it comes from the small and mid-size market. Not too many barriers to enter the PLM mid-market in a disruptive manner. But does the mid-market need PLM? Is PLM a bad label for its cloud-based product? Time will tell.
The management of Kenesto obviously has read the book. Although the initial concept came from PLM++ (a bad marketing name), they chose not to compete with mainstream PLM and aim their product at a different audience – business process automation. Then, if their product picks up in the engineering/product domain, it might enter the PLM domain in a disruptive manner (all according to the book – they will become market leaders).
Finally there are Search Based Applications, also a disruptive technology for the PLM domain. Many companies struggle with the structured data approach a classical PLM system requires, and especially for mid-market companies this overhead is a burden. They are used to working in a cognitive manner; validation and formalization are often done in the brains of experienced employees. Why can't search-based technology be used to create structured data and replace or support the experienced brain?
If I open my Facebook page, I see new content related to where I am and what I have been saying or searching for. Imagine an employee's desktop that works similarly, where your data is immediately visible and related information is shown. Some of the data might come from a structured system in the background; other data might be displayed based on logical search criteria – the way our brain works. Some startups are working in this direction, and Inforbix (congratulations Oleg & team) has already been acquired by Autodesk, as Exalead was by DS.
For both companies, if they believe in the above concept, they should remain independent from the big parent company as long as possible, because according to Christensen they will not get the right focus and priorities if they are part of the sustaining mainstream technology.
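To make the idea of deriving structure from unstructured content concrete, here is a minimal sketch. Everything in it is an assumption for illustration – the attribute names, the patterns and the sample note are hypothetical, and this is not how Inforbix or Exalead actually work:

```python
import re

# Hypothetical patterns for attributes an engineering company might care about.
PRESSURE = re.compile(r"(\d+(?:\.\d+)?)\s*bar", re.IGNORECASE)
MATERIAL = re.compile(r"\b(stainless steel|carbon steel|titanium)\b", re.IGNORECASE)

def extract_attributes(text: str) -> dict:
    """Propose structured attributes from a free-text snippet, the way a
    search-based application might suggest metadata for human review."""
    attrs = {}
    match = PRESSURE.search(text)
    if match:
        attrs["design_pressure_bar"] = float(match.group(1))
    match = MATERIAL.search(text)
    if match:
        attrs["material"] = match.group(1).lower()
    return attrs

note = "Vessel V-101: stainless steel, design pressure 16 bar, per lessons learned."
print(extract_attributes(note))
# {'design_pressure_bar': 16.0, 'material': 'stainless steel'}
```

A real system would of course rely on indexing and linguistic analysis rather than hand-written patterns, but the principle is the same: the structured record is derived from, and validated against, the content experienced people already produce.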
This blog post was written during a relaxing holiday in Greece. The country is in a crisis; they need disruptive politicians. They had them 3500 years ago, and I noticed the environment is perfect for thinking, as you can see below.
Meanwhile I am looking forward to your thoughts on PLM: in which state are we, and what are the disruptive technologies?
I am just back from an exciting PLM Innovation 2012 conference. With a full program and around 250 participants, it was two intensive days of PLM interaction.
What I liked most is that the majority of the audience focused on PLM business-related topics. The mood of PLM has changed.
In this post, I will give an impression of the event as I experienced it, without going into the details of each session.
Several interesting sessions ran in parallel, so I could not attend them all, but MarketKey, the organizer of the conference, confirmed that all presentations were filmed and will become available online for participants. So more excitement to come.
First my overall impression: compared to last year's conference there was more focus on PLM business issues and less on PLM IT or architecture issues (or was it just my perception?).
Gerard Litjens (CIMdata Director European Operations) opened the conference, as CIMdata co-hosted it. In his overview he started with CIMdata's PLM definition – PLM is a strategic business approach. (Everyone has their own definition, as Oleg noticed too.) Next he presented what CIMdata sees as the hottest topics. No surprises here: extension of PLM to new industries, extending PDM towards PLM, integration of social media, cloud, open source, enterprise integration and compliance.
The next speaker, Thomas Schmidt (Vice President, Head of Operational Excellence and IS, ABB's Power Products Division), challenged the audience with his keynote speech PLM: Necessary but not sufficient. With this title it seemed that the force was against him (thanks Oleg for sharing).
Thomas explained that the challenge for ABB is being a global company while at the same time acting as a 'local' company everywhere around the world. In this perspective he placed PLM as part of a bigger framework to support operational excellence, and presented some major benefits of a platform approach. I believe the Q&A session was an excellent way to connect Thomas's initial statements to the PLM-focused audience.
Marc Halpern from Gartner gave his vision on PLM. Marc also started with the Gartner definition of PLM, which characterizes PLM as a discipline. Gartner identified five major trends: software everywhere in products; use of social media for product development and innovation; analytics tools supporting the whole product lifecycle – after sales, service; connecting to the customer; and opportunities to deliver existing products through services (media content, transportation).
Next I attended the Autodesk session, A PLM journey using the cloud, where I was eager to learn their approach towards PLM. Autodesk (Mike Lieberman) let Linda Maepa, COO of Electron Vault in the USA, explain the benefits of the Autodesk PLM 360 solution. Electron Vault, a young high-tech company, implemented the solution within two weeks. And here I got disconnected, also when the suggestion was raised that you do not need time to specify the requirements for the system (old-fashioned stuff).
I suddenly fell into a trance and saw a TV advert for a new washing powder with numerous features (program management, new product introduction, …..) that washed whiter than all the others, and a happy woman telling it to the world. I believe that if Autodesk wants to be taken seriously in the PLM world, it should also work with existing customers and manage the change in these organizations. Usually it already takes more than two weeks to get them aligned and agreed on the requirements. Unfortunately I did not have time during the breaks to meet Autodesk at their booth, as I would have loved to continue the discussion about reality; my experience and focus are on mid-market companies. Waiting for a next opportunity.
After Autodesk, in my own session I presented the main drivers for making the case for PLM. I also started with my favorite PLM definition (a collection of best practices – 2PLM) and explained that PLM starts with the management's vision and targets for the future. Is it about efficiency, quality, time to market, knowledge capture, or a more challenging task: creating the platform for innovation?
Next I followed the Energy track, where I listened to Charles Gagnon from Hydro Quebec, who gave an interesting lecture called Implementing Open Innovation and Co-Development.
At first glance this is a sensitive topic. Innovation is all about creating new intellectual property, and there is the fear that, when working with partners, the IP might leave the company. Charles explained how this process of collaborative innovation was started and monitored. At the end he reported that they measured a significant gain in perceived R&D value when working with external partners. And they did not use a PLM system to manage innovation (to be investigated how they could survive).
After the lunch I continued with Jonas Hagner from WinWinD, a young manufacturer of windmills targeted to operate in extreme climate conditions (a niche market). They are implementing PLM and ERP in parallel; they did not have to suffer from years of ERP before PLM and therefore could have a more balanced discussion around part information availability, part numbers and more. Still, I believe they have the challenge of connecting the services of the windmills back to their R&D organization in an efficient manner, to close the full PLM circle.
Karer Consulting, together with Siemens Energy, presented how they designed and started to implement the interface between their PLM system (Teamcenter) and their ERP system (SAP). What was disappointing to see was that the interface between Teamcenter and SAP was relatively complex (bi-directional, with engineering activities on both sides). Almost 1½ years of development went into this interface, one of the main reasons being that SAP was there first and they start the engineering order in SAP.
Apparently, two years later Siemens Energy could not implement a clear, distinct separation between PLM and ERP anymore and will now have to live with this complex interface. In the past I have written several times about this complexity, which companies seem to accept for political or historical reasons. A sad story for PLM – Where is the MBOM?
The day finished with a closing keynote from Peter Bilello, explaining what a successful PLM implementation could look like. Many wise statements that everyone should follow if you want to come to a successful implementation (and define correctly what success is).
Thanks to Autodesk we had a nice evening reception, discussing and evaluating the first day with peers.
Day 2 started for me with an interesting lecture from Peter Fassbender, Head of the Design Center, Fiat Latin America, describing how in Brazil the Fiat Mio experiment used modern social media techniques like crowdsourcing, communities and user involvement to guide the innovation and development of a potential car. A unique experiment, demonstrating that this type of project can influence the brand reputation positively (if managed correctly), and for me an example of what PLM could bring if R&D is connected to the outside world.
Christian Verstraete, Chief Technologist Cloud Strategy at HP, gave an inspiring session about the open frontiers of innovation. The speed of business has increased dramatically over the past 30 years (you need to be from an older generation to be aware of this: the definition of response time has changed due to new technologies). Christian pushed everyone to think out of the box and to be innovative, which made me wonder how long companies will keep building standard, boring products. Will we keep on innovating at this amazing pace, as we did in the past 30 years?
Graeme Hackland, IT/IS Director of the UK-based Lotus F1 team, presented the challenges an F1 team has to face every year due to changing regulations. I visited Lotus F1 last year and was impressed by the fact that over 500 engineers are all working around one car per year, optimizing it mainly for aerodynamics, but also assuring it performs throughout the year. Thousands of short interactions and changes to be implemented a.s.a.p. challenge the organization to collaborate in an optimal manner. And of course this is where PLM contributes. All the F1 fans could have continued to dream and listen to Graeme's stories, but Jeremie Labbe from Processia brought us back to earth by explaining how Processia assisted Lotus F1 in a PLM value assessment as a next step.
Meanwhile I had some side discussions on various PLM topics, and went back to the sessions to see David Sherburne, Director of Global R&D Effectiveness at Carestream Health, present his case (open source PLM) and his analysis of why an open source PLM model (based on Aras) is very appealing in their situation. Indeed, the perceived business value and significantly lower operational costs for the software are appealing for his organization and will surely influence the other PLM vendors in their pricing models.
Pierfrancesco Manenti from IDC Manufacturing Insights gave a clear presentation indicating the future direction for PLM: managing operational complexity, not product complexity. As you could expect from IDC Manufacturing Insights, everything was well founded on surveys in the manufacturing industry, clearly indicating that there is still a lot to do before companies efficiently share and work around a common product development and operational platform. New technologies (the four IT forces: mobility, cloud, social business and big data analytics) will help them improve.
The closing keynote came from Jason Spyromilio, who was director of the European Southern Observatory's Very Large Telescope (http://www.eso.org), and he gave us insights into designing (and building) the biggest eye on the sky. The precision challenges for such a huge telescope mirror, built in the high mountains of Chile in an earthquake-sensitive area, demonstrate that all participants are required to contribute their IQ in order to realize such a challenge.
Conclusion: this PLM Innovation 2012 event doubled the 2011 event in all dimensions. Thanks to the sponsors, the organization and the high-quality lectures; I expect next year we could double again – in participants, in content and in innovation. It shows PLM is alive. But coming back to the title of this post: I saw some interesting innovation concepts – now how to enable them with PLM?
Note: looking at the pictures in this post you will notice PLM is everywhere. I published this post on February 29th, a unique day which happens only every 4 years. In May this year my blog will be 4 years old.
This week I was happy to participate in the PLM INNOVATION 2011 conference in London. It was an energizer, which, compared to some other PLM conferences, makes the difference. The key to its success, in my opinion, was that there was no vendor dominance, and that participants were mainly discussing their PLM implementation experiences, not products.
Additionally, as each session was approximately 30 minutes long, speakers were forced to focus on their main highlights instead of going into details. Between the sessions there was significant time to network or to set up prescheduled meetings with other participants. This formula made it an energizing event for me, as every half hour you moved on to the next experience.
In parallel, I enjoyed and experienced the power of modern media. Led by Oleg, a kind of parallel conference took place on Twitter around the hashtag #plminnovation2011. There I met and communicated with people at the conference (and outside), and felt sorry I was not equipped with all the modern media (iPhone/iPad-type equipment) to interact more intensively during these days.
Now some short comments and interpretations on the sessions I was able to attend.
Peter Bilello, president of CIMdata, opened the conference in the way we are used to from CIMdata: explaining the areas and values of PLM, the statistics around markets, the major vendors and the positive trends for the near future. Interesting was the discussion around the positioning of PLM and ERP functionality and the coverage of these functionalities by PLM and ERP vendors.
Jean-Yves Mondon, EADS' head of PLM Harmonization (Phenix program), illustrated with extracts from an interview with their CEO Louis Gallois how EADS relies on PLM as critical for their business and wants to set standards for PLM in order to have the most efficient interoperability of tools and processes coming from multiple vendors.
Due to my own session and some one-to-one meetings, I missed a few parallel sessions in the morning and attended Oleg Shilovitsky's session on the future of engineering software. Oleg discussed several trends, and one I also see as imminent is the fact that the PLM world is changing from databases towards networks. It is not about capturing all data inside one single system, but about being able to find the right information through a network of information carriers.
This also suits the new generation of workers (generation Y), who have learned to live in this type of environment and collect information through their social networks.
The panel discussion, with three questions for the panelists, could have been a little better if the panelists had had time to prepare some answers, although some of the improvisations were good. I guess the audience chose Graham McCall's response to the question "What will be the next biggest disappointment?" as the best: he mentioned the next 'big world-changing' product launch from a PLM vendor.
Then I followed the afternoon session from Infor, called Intelligent PLM for Manufacturing. The problem I had with this session (and I often have this with vendor sessions) was that Venkat Rajaj did exactly what most vendors do wrong. They create their own niche definition – Product Lifecycle Intelligence (is there no intelligence in PLM?) – claim to be the third software company (where are they on CIMdata's charts?), and then present a lot of details on product functions and features. Although the presentation was smooth and well delivered, the content did not stick.
A delight that day was the session from Dr. Harminder Singh, associate fellow at Warwick Business School, about managing the cultural change of PLM. Harminder does not come from the world of software or PLM, and his outsider perspective created a particular atmosphere for those in the audience who consider cultural change an important part of PLM. Here we had a session inspired by a theme, not by a product or concept. I was happy to have a longer discussion with Harminder that day, as I also believe PLM has to do with culture change – it is not only technology and management push, as we would say. Looking forward to following up here.
The next day we started with an excellent session from Nick Sale from Tata Technologies. With a Nano in the lobby of the conference, he presented all the innovation and rationalization related to the Nano car, and one of his messages was that we should not underestimate the power of innovation coming from India. An excellent sponsor presentation, as the focus was on the content.
In the parallel track I was impressed by how Philips Healthcare implemented their PLM architecture with three layers. Gert-Jan Laurenssen explained they have an authoring layer, where they do global collaboration within one discipline; a PDM layer, where they manage the interdisciplinary collaboration, which in the case of Healthcare is of course a mix of mechanical, electrical and software; and above these two layers they connect to the layer of transactional systems that need the product definition data. Impressive was their implementation speed, surely due to some of the guidelines Gert-Jan gave – see Oleg's picture of his slide here. Unfortunately I did not have the time for a deeper discussion with Gert-Jan, as I am curious about the culture change and the amount of resources they have in this project. An interesting observation was that the project was driven by IT managers and engineering managers, confirming the trend that PLM is becoming more and more business-focused instead of IT-focused.
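To make the layering tangible – this is purely my own toy sketch with invented names, not the actual Philips data model – such a three-layer separation could look like this: each discipline collaborates in its own authoring vault, the PDM layer combines the released items into one interdisciplinary product definition, and the transactional systems consume that definition.

```python
# Hypothetical sketch of a three-layer PLM architecture:
# authoring layer per discipline, PDM layer for interdisciplinary
# collaboration, transactional systems as consumers of the definition.

AUTHORING_LAYER = {            # discipline-internal, global collaboration
    "mechanical": "MCAD vault",
    "electrical": "ECAD vault",
    "software":   "source repository",
}

TRANSACTIONAL_LAYER = ["ERP", "MES", "Service"]   # consumers of the definition


def release_to_transactional(discipline_items: dict) -> dict:
    """Collect released items from each authoring vault into one
    interdisciplinary product definition (the PDM layer's job) that
    the transactional systems can consume."""
    definition = {d: sorted(items) for d, items in discipline_items.items()}
    return {"layer": "PDM", "content": definition,
            "published_to": TRANSACTIONAL_LAYER}


record = release_to_transactional({
    "mechanical": ["housing-001"],
    "electrical": ["pcb-014"],
    "software":   ["fw-2.3"],
})
```

The point of the sketch is only the separation of concerns: no transactional system ever talks to an authoring tool directly; everything passes through the PDM layer.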
In his session, called Trends and Maximizing PLM Investments, Peter Thorne from Cambashi presented an interesting visual historical review of engineering software investments, using Google Earth as the presentation layer. It was impressive to see the trends visualized this way – and scary to see that Europe is not really a major area of investment and growth.
Keith Connolly explained in his session how S&C Electric integrated their PLM environment with ERP. Everything sounded easy and rational, but as I have known the guys from S&C for a long time, I know it is the result of having a clear vision and working for many years towards implementing that vision.
Leon Lauritsen from Minerva gave a presentation on Open Source PLM and did an excellent job explaining where Open Source PLM could or should become attractive. Unfortunately his presentation quickly went in the direction of Open Source PLM equals Aras, and he continued with a demo of Aras capabilities. I would have preferred a longer presentation on the Open Source PLM business model instead of spending time looking at a product.
I believe Aras has huge potential, for sure in the mid-market and perhaps beyond, but I keep coming back to the experience I also had with SmarTeam: an open and easy-to-install PLM system with a lot of features is a risk in the hands of IT people with no focus on business. Without proper vision and guidance (coming from ????? ) it will again become an IT project – cheaper to the outside world (as internal investments are often not so visible), but achieving the real PLM goals depends on how you implement.
After lunch we really reached the speed of light with David Widgren, who gave us insight into data management at CERN. Their challenge – essentially a single 'product', the accelerators and all their equipment, plus a long lifecycle (20 years of development before becoming operational) that outlives all technologies and data formats – requires them to think all the time about pragmatic data storage and migration. In parallel, as the consumers of the data are not familiar with the complexity of IT systems, they build many specific interfaces for specific roles, providing the relevant information in a single environment. Knowing that a lot of European funds go there, David is a good ambassador for CERN, explaining in a comical manner that he works at the coolest place on Earth.
The last session I could attend was Roger Tempest's on data management. Roger is a co-founder of the PLMIG (PLM Interest Group), which strives for openness, standards and interoperability for PLM systems. I was disappointed by this session, as I was not able to connect with the content. Roger seemed to be presenting his axioms; I had the feeling he would come down from the stage with his ten commandments. I would be interested to understand where these definitions came from. Are they a common understanding, or just another set of definitions coming from another direction – and what is the value or message for existing customers using particular PLM software?
I missed the closing keynote session from John Unsworth from Bentley. I learned later that this was also an interesting session, but I cannot comment on it.
An inspiring event, both due to its organization and agenda and thanks to the attendees, who made it a real PLM-centric event. Cannot wait for 2012!
Like many people, I find that the meditative mood of the dark Christmas days and the various 2009 reviews give you a push to look back and reflect. What happened, and what did not happen, in 2009?
And what might happen in 2010?
Here are my thoughts related to:
ERP-related PLM vendors
Here I think mainly about Oracle and SAP. They have already identified PLM as an important component of a full enterprise solution and are further pushing their one-stop-shop approach. Where Oracle's offering is based on a set of acquired and to-be-integrated systems, SAP has been extending its offering with more focus on its own development.
There might be real PLM knowledge in the Oracle organization as a result of the acquisitions, but is it easily accessible to you? Is it reflected in the company's strategy?
With SAP I am even more in doubt; there you are more likely to find people with ERP blood who have learned the PLM talk. Maybe for that reason I saw mostly Oracle as a PLM option in my environment and very few SAP opportunities for real PLM.
I assume that in 2010 Oracle will push stronger and SAP will try harder.
CAD-related PLM vendors
In this group you find the major players PTC, Siemens and Dassault Systèmes. Autodesk could be there too, but they refuse to do PLM and remain focused on design collaboration. All these PLM vendors are striving to get the PLM message to the mid-market. They have solutions for the enterprise, but to my feeling most of the enterprises in the traditional, well-known PLM markets, like automotive and aerospace, are in a kind of standstill due to the economic crisis and the upcoming environmental crisis.
It is certain that business will not be as usual anymore, but where will the sustainable future lead? Here I believe answers will come from innovation and from small mid-market companies. The bigger enterprises need time to react, so it will take time before we see new PLM activities in this area.
Therefore all PLM vendors move in directions outside engineering, like apparel, life sciences and consumer packaged goods. These industries do not rely on 3D CAD, but can still benefit from the key building blocks of PLM, like lifecycle management, program and portfolio management and quality/compliance management. The challenge for the PLM vendors, I believe, is: will these CAD-focused organizations be able to learn and adapt to other industries fast enough? And where does 3D fit – although Dassault has a unique vision here.
For the mid-market, the PLM vendors offer more OOTB (Out Of The Box) solutions, mostly based on limited capabilities or on commonly available Microsoft components like SharePoint and SQL Server. This is not so strange, as according to my observation most smaller mid-market companies have not really made or understood the difference internally between document management and product data management – including the fact that a Bill of Materials should not be managed in Excel.
I assume that in 2010 the CAD-related PLM vendors will initially focus on the bigger enterprises and the new industries; the smaller mid-market companies require a different approach.
PLM-only vendors
This is an area which I expect to disappear in the future, although it is also the area where interesting developments start to happen. We see open source PLM software coming up, with Aras leading, and we see companies offering PLM on-demand software, with Arena as the first company to sell this concept.
The fact that the traditional PLM-only vendors disappeared from this area (Eigner bought by Agile, Agile bought by Oracle, MatrixOne bought by Dassault Systèmes) indicates that the classical way of selling PLM-only was not profitable enough.
Either PLM needs to be integrated into company-wide business processes (which I believe), or there will be PLM-only vendors that find a business model to stay alive.
Here I hope to see more clarity in 2010
Smaller mid-market companies
What I have seen in the past year is that, despite the economic crisis, PLM investments by these companies remained active. Maybe not in purchasing many more licenses or implementing new PLM features; the main investments were around optimizing or slightly extending the PLM base. Maybe because there was time to sit still and analyze what could be changed, or maybe it was planned all along but, due to work pressure, never executed. Anyway, there was a lot of activity in this area – not less than in 2008.
An interesting challenge for these mid-market companies will be to remain attractive for the new generation, who are not used to the classical, structured way of working that most of the current workforce is used to.
Social networking, social PLM – I have seen the thoughts, discussions and benefits; I am still trying to see where it will become reality.
2010 is another chance.
Sustainability and going green
This is an area where I am a little disappointed, although this is perhaps not justified. I would have expected that, with the lessons learned around energy and the upcoming shortage of natural resources, companies would take the crisis as a reason to change.
To my observation, most of the companies I have seen are still trying to continue as usual, hoping that the traditional growth will come back. The climate conference in Copenhagen also showed that we as human beings do not feel pressured enough to adapt; by nature we are optimists (or boiling frogs).
Still, there are interesting developments – I assume in the next few years we will see innovation coming, probably first from smaller companies, as they have the flexibility to react. During the European Customer Conference in Paris, I heard Bernard Charlès talking about the concept of a Bill of Energy (the energy needed to create, maintain and demolish a product). As PLM consultants we already have a hard time explaining to our customers the various views on a BOM; still, I like the concept, as a Bill of Energy makes products comparable.
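To make the Bill of Energy idea concrete – this is purely my own interpretation, with invented names and numbers, analogous to a BOM cost rollup – the energy to create, maintain and demolish could be rolled up through the product structure just like cost:

```python
# A toy Bill of Energy: like a BOM cost rollup, but summing lifecycle
# energy (create + maintain + demolish, here in MJ) through the structure.
# All item names and figures are illustrative, not from any real product.

def energy_rollup(item: dict) -> float:
    """Total lifecycle energy of an item: its own create/maintain/demolish
    energy plus that of all its children, each multiplied by quantity."""
    own = item.get("create", 0) + item.get("maintain", 0) + item.get("demolish", 0)
    children = sum(child["qty"] * energy_rollup(child)
                   for child in item.get("children", []))
    return own + children

pump = {
    "name": "pump", "create": 500, "maintain": 200, "demolish": 50,
    "children": [
        {"name": "motor", "qty": 1, "create": 300, "maintain": 100, "demolish": 20},
        {"name": "seal",  "qty": 2, "create": 10,  "maintain": 5,   "demolish": 1},
    ],
}

print(energy_rollup(pump))  # 750 + 420 + 2*16 = 1202 MJ
```

Because the rollup produces a single comparable number per product, two pump designs could be compared on total lifecycle energy the way they are today compared on cost – which is exactly what makes the concept attractive.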
2010 the acceptance of Bill Of Energy
Here I want to conclude my post for this year. Thank you all for reading and sharing your thoughts and comments with this community. My ultimate conclusion for 2009 is that it was a good PLM year for the mid-market, better than expected, but the changes are going slowly. Too slowly – we will see next year.
The title of this post came to mind when looking back on some of the activities I was involved in during the past two weeks. I was discussing with several customers the progress or status of their current PLM implementation. One of the trends was that, despite the IT department doing its best to provide a good infrastructure for project- or product-related information, the users always found a reason why they could not use the system.
I believe the biggest challenge for every organization implementing PDM, and later PLM, is to get all users aligned to store their information in a central location and to share it with others. Only in this manner can a company achieve the goal of having a single version of the truth.
By single version of the truth I mean: if I look in the PLM system, I find there all the data needed to tell me the exact status of a product or a project.
If it is not in the PLM system, it does not exist!
How many companies can make that statement?
If your company has not implemented the single version of the truth yet, you might be throwing away money and even putting your company at risk in the long term. Why? Let's look at some undisclosed examples I learned about in the past few weeks:
- A company ordered 16 pumps which, on arrival, were not the correct ones – 1 M Euro lost
- During installation at a drilling site the equipment did not fit and had many clashes – 20 M Dollar lost due to rework and penalties
- 7000 K Euro lost due to a wrong calculation based on the wrong information
- A major bid lost due to a high price estimate, caused by a lack of communication between the estimator and the engineering department
- 500 K Euro penalty for delivering the wrong information (and too late)
All the above examples – and I am sure this is just the tip of what is happening around the world – were related to the power & process industry, where of course high-capital projects run and the losses might look small relative to the size of the projects.
But what was the source of all this? Users.
Although the companies were using a PLM system, in one company a user decided that some of the data should not be in the system but in his drawer, to assure proper usage (according to his statement; otherwise, when the data is publicly available, people might misuse it). Or was it false job security? In the end you lose your job through this kind of behavior.
People should bring value through collaboration, not by sitting on knowledge.
Another frequently heard complaint is that users find the PLM system too complex and feel it takes too much time to enter data. And as engineers have never been bothered by the kind of strict data management that ERP users are used to working with, their complaints are echoed to the PLM implementer, who can then spend a lot of time customizing or adapting the system to the users' needs.
But will it be enough? It is always subjective, and from my experience, the more you customize, the higher the future risks. What about upgrades or changes in the process?
And can we say NO to the next wish of this almighty user?
Is the PLM system to blame?
The PLM system is often seen as the enemy of the data creator, as it forces the user into a certain pattern. Excel is much easier to use: some home-made macros and the user feels everything is under control (as long as he is around).
Open Source PLM somehow seems to address this challenge, as it does not create the feeling that PLM vendors only make their money from complex, unneeded functionality. Everything is under the customer's own control; they decide if the system is good enough.
On-demand PLM has an even harder job convincing the unwilling user, and therefore these vendors position themselves as easy to use, a friend of the user and an enemy of the software developer. But in the end it is all about users committing to share, and therefore adapting themselves to change.
So without qualifying the different types of PLM systems, for me it is clear that:
The first step is for all users in a company to realize that working together towards a single version of the truth for all product- or project-related data brings huge benefits. Remember the money lost due to errors because another version of the data existed somewhere. This is where most of the ROI for PLM is reported.
The next step is to realize that this is a change process and that, by being open-minded towards change – whether motivated or pushed by the management – the change will make everyone's work more balanced; not in the first three months, but in the longer term.
Conclusion: Creating the single version of the truth for project or product data is required in any modern organization to remain competitive and profitable. Reaching this goal might not be easy for every person or company, but the rewards are high when you reach this very basic goal.
At the end it is about human contribution – not what the computer says:
As a consultant working with mid-market companies, I enjoyed reading this post from Al Dean and the related comments and posts. Although I must say that Al's statement:
PLM+ are looking to solve this by creating a rich application that engages the user, provides ease of implementation and ongoing maintenance (by allowing the user/admin, rather than costly consultant) and can be delivered over the web, in an on-demand manner (which saves hardware and infrastructure cost)
was a trigger to react, as I am a consultant.
The base of every PDM/PLM
First, I believe the base of PLM and PDM is to agree inside your company that you share and centralize product data. This means not only files, but also Bills of Materials, issues, etc.
Sharing and centralizing product data seems like an easy mission, and this is what all PLM software provides as a base. I assume PLM+ does the same, only they store the data in the cloud, like Arena.
Sharing data is not a natural process in all companies, as there is always the culture of sharing the minimum and keeping the rest to prove your own value – the bigger the company, the more this happens. This is human nature, and it differs case by case. Making people share data is an area where either the management has to push or, in very small companies, a power user. In larger companies an external consultant often does this job, in the role of an 'outsider' who can moderate and explain the benefits for all, instead of the threats.
This is what consultants really do; they do not install or administer systems.
PLM solutions vary in how they make sharing of data available. Some solutions are very rigid in the data model they offer, but most of the necessary entities and attributes are there. They are based on best practices and target the 80 %. Often this is good enough, if the customer has no alternative and has the power to enforce the system as their platform for sharing data.
More flexible PLM solutions have an advantage which is at the same time their disadvantage. They can be extended beyond the 80 % scenario, and both the implementer and the customer will be challenged to reach 100 % satisfaction. However, we all know from the 80-20 rule that this is where it gets complicated:
80 % of the project is done in 20 % of the time – or, in other words, you spend 80 % of your time (and budget) reaching the last 20 %.
Once a common platform for sharing all product-related information has been established, for me the real PLM starts. This is where a company would implement processes that streamline the product development or delivery process – and that requires a cross-departmental change.
And this is often the stage where a consultant comes in. It is very rare that in mid-market companies the management reserves time and resources to come up with a strategic plan to implement PLM – I wrote about this in an older post. PLM requires a change in the way the company currently works.
So I am curious to learn how PLM+ and other on-demand PLM software companies will address this step, as change is needed and someone has to push for it.
In my last three consecutive posts, I wrote about who decides on PLM in mid-market companies (a generalization from 15 years of experience). There I claim that the selection of a PLM system is subjective, very much based on personal relations with the mid-market company. Again, how will PLM+ address this in their business model, as there is a need for someone to push?
Open Source PLM software has somewhat similar challenges. You need the drive from inside the customer to agree on sharing product data and then to extend. This is where the traditional PDM and PLM vendors push their business through direct contacts. Open Source PLM providers, of course, focus on the phase after the initial installation of the platform, extending it with a consultative and service model.
Conclusion: As every PLM provider in the end needs revenue for a living, I am looking forward to seeing where on-demand PLM will go and find its place. What will be the business model that makes people buy and creates the change?
I realized that time flies when you are busy, and I had promised to publish the conclusion of my previous post: More on who decides for PLM in a mid-market company. In my two previous posts, I described the difficulties companies have in selecting the right PLM system. So far I discussed the two extremes: the silent approach, where a possible bottom-up approach was discussed, and, as its opposite, the 'academic' approach.
Now it is time to get the answers on the academic approach.
These were the questions to be answered in the previous post:
- How much time has passed since the management decided PLM was good for their organization?
- How independent is the consultancy firm?
- Did they consider open source PLM as a solution?
- What was the ranking of the PLM vendors?
How much time has passed since the management decided PLM was good for their organization?
The whole process of selecting a PLM system often takes more than one or two years, from the first activities until the final decision to start. I believe this is unavoidable, as especially in mid-market companies the business values that PLM can bring are not always discussed and realized at the strategic level.
However, I believe that in recent years PLM has been recognized by analysts, by software vendors and by many young companies as a necessity for innovation and, in the long term, for remaining competitive. And this is not only in the classical domains where PLM started – automotive / aerospace / industrial equipment. PLM value is everywhere, in different industries, even apparel for example.
For companies that are now in the decision process, I believe 2009 and early 2010 are the years to decide, because a recovery of the economy might put the focus back on execution instead of strategy, and they might lose the management focus for PLM. And as I wrote in a previous post, the companies who made the best pit stop will benefit the most.
For companies still in doubt: It is now or never
How independent is the consultancy firm?
It is clear that truly independent consultancy firms do not exist – even if a consultant wants to be independent, there are three challenges to meet:
- How can a consultant evaluate or judge PLM systems they have not seen?
- How much experience does the consultant have in your business?
- How much work is required in the project for the consultant?
As you can imagine, reviewing the above challenges, you will realize that consultants usually specialize in the systems where their expertise is required – as they also want to make a living. Consultants cannot afford to be an academic institute because, coming back to the previous point, all consultancy work will in the end be paid for by the customer.
So to conclude on this point: to be cost-effective, a company should already do a pre-selection based on systems and possible implementation partners that fit naturally with their type of business, and then evaluate how consultancy can be obtained.
What you will find out is that the major 'expensive' packages have loads of consultants on offer, and the further you move into the mid-market environment, the rarer consultants become. For software from PLM vendors you will usually find a reseller network with people close to your offices who can support you. For Open Source software you will need to find consultancy services through the vendor's software delivery program.
Anyway, remember: 50 % of the success of a PLM implementation is based on the right implementation approach and partner, not on the PLM functions and features.
Did they consider open source PLM as a solution?
No, because the consultant was not familiar with it and discouraged the company from looking at it. In general, Open Source PLM and PLM on-demand are interesting trends to follow and should not be neglected. However, the focus and approach for this type of solution is different. I will not generalize at this moment, as I also have no clear picture of where Open Source PLM or PLM on-demand would be a big differentiator. I will try to evaluate and report on it in future posts.
Comments from Open Source PLM vendors or on-demand PLM vendors are welcome to complete the PLM selection picture.
What was the ranking of the PLM vendors?
Ranking was done by the management, the selection team and the design department. These were the results, plus their major comments:
Management:
1. The slide-show PLM provider – they liked the business pitch
2. The CAD supplier with PLM features and gadgets – good guys, we know them
3. The PLM provider who showed everything – too much handling of data, too complex
Selection team:
1. The PLM provider who showed everything – they really did it
2. The CAD supplier with PLM features and gadgets – we understand where they are going
3. The slide-show PLM provider – do they really have a solution?
Design department:
1. The CAD supplier with PLM features and gadgets – they know what we want
2. The slide-show PLM provider – could be a good solution too
3. The PLM provider who showed everything – too complex, it will limit our productivity
The reason to drop the CAD supplier was the fear that this provider did not know everything about PLM. Both management and users felt the PLM provider that showed everything was too complex – the opposite of the project team, whose members had become very familiar with PLM capabilities after two years of investigation and many demos and trade shows.
Conclusion: Selecting PLM, even in an academic manner, is a subjective process. As in general the customer does not know exactly what he needs, and the PLM provider often shows too much detail, the real journey starts at implementation time. And at this stage you need an experienced implementation partner who can match and communicate the expectations.
Last week I once more saw a post where free PLM software was offered and, combined with the open source aura, presented as THE solution for companies that want to implement PLM during this economic downturn. I believe this is a big mistake, for the following reasons:
WYPIWYG (What You Pay Is What You Get)
I have learned that the WYPIWYG rule usually applies in the software world. Free software is nice, but there is no guarantee that missing or broken functionality will be fixed. So when a company implements the free PLM software, what do you do if you feel something important for your business is missing? You can ask the software provider to implement it for you – but will it be done? Probably only when it is easy to achieve; there is no commitment, as the software is free.
To assure you it can be done, the software vendor will say it is open source software, so you can change it if you want. But who is going to make the change? The mid-market company that thought it had selected an economical solution is not an IT company – so whom to hire? The open source software development company? This is exactly what their business model is based on: they have the expertise with their software, so they are probably the best ones to adapt it – not for free, of course. They learn from it, but the customer pays.
Conclusion: there is no such thing as a free lunch.
This does not mean that all open source software is bad. Linux has shown that for an operating system it makes sense. Operating systems are 100 % within the scope of IT. PLM is something different. PLM systems indeed need to provide an IT backbone to assure global data collaboration and replication. However, PLM is much more focused on business process change and NOT on IT.
PLM requires people with business skills and not software developers
From my experience, PLM projects fail when no business-knowledgeable people are available. This did not only happen with free PLM software or open source software. Some years ago, ERP vendors started to provide free PLM software to their customers to keep PLM companies at a distance. Like free PLM software, it looked nice business-wise: the software is free when you buy their ERP system. But who is going to implement it?
This free PLM software availability has changed for ERP vendors in the past years. ERP vendors also see PLM as a growth market for their business, so they have started to invest in PLM, providing PLM consultancy and no longer giving PLM functionality away for free. However, from one of the projects I was involved in, it is clear to me that PLM and ERP are complementary approaches. Interestingly, none of the PLM vendors focus on ERP, while ERP vendors apparently believe they can master PLM. I won't say it is impossible, but I believe that if there is no real PLM vision at the top level of an ERP company, you cannot expect the competitive focus to exist.
Are CAD vendors providing PLM?
Some CAD vendors have an embedded data management solution to manage their own data. This is usually more a PDM system, and often even the term PDM (Product Data Management) is too much for it. These systems manage their own CAD data but have no foundation for a multi-discipline engineering BOM. For me, this is the base for PDM, as most companies have several disciplines working with different tools around the same product. So CAD data management is not the base for PDM, and for sure not for PLM.
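To illustrate the difference – a hypothetical sketch of my own, not any vendor's data model – a multi-discipline engineering BOM node references design items from several authoring tools, whereas a CAD-embedded data manager only knows the files of its own discipline:

```python
# Hypothetical multi-discipline Engineering BOM node: one product part
# aggregating design items from mechanical, electrical and software tools.
# A CAD-only data manager would cover just one of these disciplines.

from dataclasses import dataclass, field

@dataclass
class DesignItem:
    discipline: str   # e.g. "mechanical", "electrical", "software"
    tool: str         # authoring tool that owns the native data
    reference: str    # identifier of the native file or document

@dataclass
class EBOMItem:
    part_number: str
    description: str
    design_items: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def disciplines(self) -> set:
        """All disciplines contributing to this part and its children."""
        found = {d.discipline for d in self.design_items}
        for child in self.children:
            found |= child.disciplines()
        return found

controller = EBOMItem("PN-100", "motor controller", design_items=[
    DesignItem("mechanical", "MCAD", "housing.asm"),
    DesignItem("electrical", "ECAD", "board.brd"),
    DesignItem("software", "IDE", "firmware/main.c"),
])

print(controller.disciplines())  # a set with all three disciplines
```

The point is that the part, not any single CAD file, is the anchor: each discipline's native data hangs off the same EBOM item, which is what a CAD-embedded vault cannot provide.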
PLM vendors bring real PLM value!
For me it is clear, having worked with different vendors in the past – an ERP vendor and several PDM and PLM vendors – that in order to bring committed value to a customer, you first of all need people with PLM skills: the ones who can differentiate between business process adaptation and software development. In order to implement PLM successfully, companies need to change the way they were working (read many of my previous posts about this – in particular this one). Software developers tend not to take this approach; instead they adapt or extend the software to support the old way of working.
Finally, paying for PLM software guarantees that the development of this software continues, driven by business needs and best practices. A PLM software vendor has the drive to improve in order to stay in business, both through software capabilities and even more by providing industry best practices.
Therefore my conclusion is that free PLM software does not help mid-market companies.
Feel free to react, as I believe this is an important topic in this market.