Last week I started a small series of posts related to the topic PLM 2.0. I was hoping for more comments and discussion about the term PLM 2.0, although I must say I was glad Oleg picked it up in his posts: PLM 2.0 born to die? and Will JT-open enable future of PLM 2.0?
Oleg, as a full-time blogger, of course had the time to draw conclusions that will take me another two weeks – hoping the discussion evolves in the meantime. Where Oleg's focus is on technology and openness (which are important points), I will also explain that PLM 2.0 is a change in the way of doing business, but that will be in next week's post.
This week I will focus on the current challenges and pitfalls in PLM. And we all know that when somebody talks about challenges, there might be problems.
| Last week | What is PLM 2.0? |
| This week | Challenges in current PLM |
| Next | Change in business |
| Final post | Why PLM 2.0 – conclusions |
The Challenges in current PLM
First I want to state that there are several definitions of PLM in the world, coming from different types of organizations – I have listed two vendor-independent definitions here:
In industry, product lifecycle management (PLM) is the process of managing the entire lifecycle of a product from its conception, through design and manufacture, to service and disposal. PLM integrates people, data, processes and business systems and provides a product information backbone for companies and their extended enterprise.
Product Lifecycle Management (PLM) is the business activity of managing a company’s products all the way across the lifecycle in the most effective way. The objective of PLM is to improve company revenues and income by maximizing the value of the product portfolio
And there are more definitions. Just recently, I noticed a post on the PlanetPTC blog from Aibhe Coughlan, where she promoted a definition of PLM published on the Concurrent Engineering blog. Here I immediately got a little irritated reading the first words: “PLM is software designed to enhance process efficiencies … and more …”
I do not believe PLM is software. Yes, there is software used to automate or implement PLM practices, but this definition neglects the culture and process sides of PLM. And Oleg was faster – read his more extensive comment here.
(I am not paid by Oleg to promote his blog, but we seem to have similar interests)
Back to the classical definitions
The Wiki definition gives the impression that you need an infrastructure to manage (store) all product data in order to serve as the information backbone for the extended enterprise. It becomes more of an IT-project, often sponsored by the IT-department, with the main goal of providing information services to the company in a standardized manner.
This type of PLM implementation tends to be the same as implementing an ERP system or another major IT-system. In this type of top-down implementation, the classical best practices for project management should be followed. This means:
- A clear vision
- Management sponsorship
- A steering committee
- A skilled project leader and team
- Committed resources
- Power user involvement
- Communication
- …… and more …
These PLM projects are promoted by PLM vendors and consultants as the best way to implement PLM. And there is a lot of positive to say about this approach. For many big companies, implementing cPDM or PLM was a major step forward. Most of the ROI stories are based on this type of implementation and have been the showcases at PLM events. It is true that data quality increases, and with it efficiency and product quality. Without PLM these companies would not have reached the competitiveness they have now.
But sometimes these projects go to extremes when satisfying users or IT guidelines.
To avoid the implementation of a ‘new IT-system’, companies often follow the strategy: we already have an ERP-system, so let’s customize or extend it to store the additional data and run workflow processes in this system.
In a recent webinar, I heard a speaker say that in their company the following automation strategy was defined together with IT:
- First they will see if the needed PLM functionality exists in their ERP system or is part of the portfolio of their ERP provider. If the functionality is there (this means the ERP vendor has the capability to store metadata and a factsheet mentioning the right name), there is no looking outside.
- If the functionality is not there, there will be a discussion with the ERP vendor or implementer to build it on top of their ERP system.
I have seen implementations where the company has developed complete custom user interfaces in order to get user acceptance (the users would not accept the standard graphical interface). At that time, no one raised the flag about future maintenance and evolution of these custom environments. The mood was: we kept it simple – one single system.
I believe this closes the door for real PLM, as storing data in a system does not mean you will use it in an efficient and optimized manner. How will you anticipate changes in business if you are just doing more with the same system?
And mid-market companies ?
The top-down approach described before is the fear of many mid-market companies, as they remember how painful their first ERP implementation was. And now with PLM it is even more unclear. PLM aims to involve the engineering department, which so far has not worked in a very procedural manner. Informal and ad-hoc communication combined with personal skills within this department was often the key for success.
And now an unfriendly system is brought in, with little usability, pushing these creative people to enter data without seeing any benefits. The downstream organization benefits, but this will only be noticed later. And for the engineering department it takes extra effort to change a work methodology focused on innovation. However, in the mid-market the target of a PLM project is generally a Return on Investment (ROI) within a very short timeframe (1-2 years). Investing in usability should be even more important for these companies, as there is less top-down pressure to accept the new PLM system.
And flexibility ?
In the past years we have seen that business is changing – there is a shift in global collaboration and manufacturing – and from recent history we can learn that those big enterprise projects of the past became a threat. Instead of enabling new concepts or new technology, the implementation became more and more vendor-monolithic, as other capabilities and applications no longer fit. This goes against the concept of openness and being flexible for the future. I believe that if PLM becomes as rigid as ERP, it blocks companies from innovating – the challenge for big companies is to find the balance between stability and flexibility (this was the title of Sony Ericsson’s presentation at the PLM forum in Sweden this year).
And again, mid-market companies do not have the budget or resources to invest in similar projects. They have less of a drive to optimize themselves in the manner big companies do, as flexibility is often their trademark (and their capability to innovate). So PLM for the mid-market will not work in the classical way.
This is one of the reasons why a mid-market PLM standard has not been found (yet?). On the other hand, many mid-market companies are dealing with PLM practices, although often closer to PDM and CAD data management. And mid-market companies do not change their organization easily – there is more a departmental approach, thereby avoiding a change in business.
To summarize the biggest challenges in current PLM described in this post:
- PLM is considered complex to implement
- PLM is a huge IT-project
- PLM requires change and structuring – but what about flexibility
- Where is the PLM value and ROI – user acceptance
- PLM for the mid-market – does it exist ?
Conclusion: I have been writing about the PLM challenges in the past, see the links below if you are interested in more details on a specific topic.
In 2008, I thought that Out-of-the-Box PLM systems and standard functionalities could bring a solution for the mid-market, perhaps through future solutions based on the cloud. However, I learned that if you want to do real PLM in a modern manner, you need to change the way you do business – and this I will explain in my upcoming post.
Related links:
Recently I have been reading various interesting articles. It started with Why Amazon can’t Make a Kindle in the USA by Steve Denning, and from there I followed several interesting links.
Most of the articles were business-driven, not focused on technology. However, what caught my attention was the similarity of the issues raised in these articles – as if they were about PLM.
At the end it is a plea/cry for change, to be more competitive in the future. With the current economic standstill, I believe there is a need and an opportunity for this change in PLM too. I am not pointing to the regime changes all around the world, but somehow they are all connected to this new wave of globalization and openness to information.
And as my domain is PLM, I took PLM 2.0 as the vehicle to describe the change currently happening in the PLM world. Although PLM 2.0 is a term invented by Dassault Systèmes, I will use it as the placeholder to describe the changes in PLM.
In four posts I will guide you through the thought process in the upcoming weeks:
| This week | What is PLM 2.0? |
| Next | Challenges in current PLM |
| Next | Change in business |
| Final post | Why PLM 2.0 – conclusions |
I hope you will stay with me when going through these four steps and look forward to your immediate feedback.
What is PLM 2.0 ?
In 2006, Dassault Systèmes announced PLM 2.0 as the new generation of PLM, implemented on their V6 platform. If you go to the 3DS website, you will see the following definition of PLM 2.0.
Look for the header PLM 2.0: PLM Online for All
In the DS definition you will find several keywords that will help us further to understand the PLM 2.0 capabilities:
A typical Dassault Systèmes viewpoint: they come from the world of 3D CAD and virtualization, and the company’s vision is built around ‘lifelike’ – and life is mostly in 3D.
3D as the interface to all product-related information is a paradigm shift for companies that were used to displaying only metadata on boring tabular screens, where you navigate on numbers and text. The other major CAD-related PLM vendors can of course follow this paradigm too, as 3D visualization of information is familiar to them. However, for an ERP-based PLM system, 3D is something far out of reach (at this moment).
This is what I believe is a crucial keyword for all future PLM implementations: it builds upon the Business Information concepts that came into fashion 8 years ago. Online means direct access to the actual data. No information conversion, no need for import or export, but sharing and filtering. What you are allowed to see is actual data with an actual status. Imagine the impact working online would have on your organization: evaluation of trends and Key Performance Indicators directly available – with the interpretation of course still to be done by experts.
Intellectual Property – a topic that should be on every company’s agenda. The reason a company exists now and will exist in the future is based on how it manages its unique knowledge. This knowledge can be based on how certain processes are done, which components are chosen, which quality steps are critical, and more. Working in a global collaboration environment challenges the company to keep its IP hidden from others, certainly when you talk about online data. Losing its IP makes a company vulnerable for the future – read about DELL in the referenced blog post from Steve Denning.
This is currently the platform for change, as technologies now enable people and companies to implement applications in a different manner. Not only on-premises: solutions can be online, Software as a Service or cloud-based, and through standardized programming interfaces companies can implement end-to-end business processes without a huge, monolithic impact. Web 2.0 also provides the platform for communities.
The concept of communities opens new perspectives for collaboration. In general, people in a community have a common interest or task, and they share thoughts and deliverables with the community across company borders. This is the power of the community and of the collective intelligence built inside it. Without company borders, global participation should give people a better perspective on their market and their business.
The vision is there – now ….
All the above keywords are capabilities for the future and in the world of PLM you see that every PLM vendor / implementer is struggling with them. How to implement them consistently across their offering is the major challenge for the upcoming years, assuming PLM 2.0 is considered as the next step.
If you look at the PLM vendors besides Dassault Systèmes, you see that Siemens and PTC are the closest to following the PLM 2.0 approach, without mentioning the term PLM 2.0. Other vendors even refuse to talk about PLM, but they already share similar components – Autodesk, for example.
It is interesting to see that the ERP-based PLM vendors do not follow this trend in their communication; they are still consolidating and completing their ‘classical’ PLM components.
But the classical PLM vendors struggle with the change in paradigm too.
- What to do with the current, huge and structured implementations?
- Does PLM 2.0 have the same demands, or can it be different?
Here you see opportunities for newcomers in this market, as online collaboration, intellectual property creation/handling and communities can be implemented in different manners, with different types of implementation demands.
So far my introduction to PLM 2.0. Browsing the web, I did not find many other viewpoints on this specific terminology, so I am curious about your thoughts and complementary comments on this topic.
In my next post I will zoom in on the challenges of PLM and relate them to the PLM 2.0 vision.
My take on PLM (classical) and PLM 2.0
Referenced in this context – not directly mentioned:
- IBM visionary presentation from 2006 – Michael Neukirchen
- The future of PLM – Martin Ohly (global PLM blog)
- PLM 2.0 technology or facelift – Oleg Shilovitsky
- Social Media and PLM explained for Dummies – Jos Voskuil
- Going Social With Product Development – Jim Brown
During this summer holiday, I was looking back on recent implementations and sales efforts related to PLM. Some had particular challenges regarding the PLM implementation and the relation with the IT-department. The role of the IT-department was crucial – but was it always in a positive manner? Judge for yourself.
First this statement:
In many mid-market companies the choice for PLM is not that clear.
Let me explain what I mean by a typical mid-market company – it is not based on size or turnover. For me, a mid-market company is a company that does not allocate the resources for an overall strategy department, and whose IT-department is limited to a team of people mainly focused on keeping the company operational – ERP first.
The impact of this situation is twofold:
- On one hand, new business initiatives will mostly come from departments: sales, marketing, engineering, production, service or IT. Companywide business initiatives are not likely to come from a separate department, as each department is working on its own issues.
- On the other hand, IT often has a tendency to ‘standardize’ on certain environments. Some quotes:

“We love/hate Microsoft”
“SharePoint is our standard”
“If it is not Linux it is not reliable”
“Our ERP provider has also a PLM module, so this is going to be the standard”
And in the end, this standardization is often the business killer.
So where does PLM come from in a mid-market company ?
Example 1: The IT-department in company XYZ was of the opinion that there was a need to provide a company infrastructure for document management – people complained about not being able to find the right information. Related to the CAD system in use, it often became a kind of PDM implementation with extended document management. The IT-department provided the infrastructure (we need Oracle / SQL / DB2 – based on their standards), and engineering was allowed to define their PDM environment on top of that infrastructure.
As most of the people involved in this project were very familiar with computers, the implemented system was highly customized, driven by the specific actions the engineers wanted and by what IT envisioned users would require. The overall thought was that other users would automatically become enthusiastic when seeing this implementation.
On the contrary: the regular users refused to work with the new PDM system – too complex, it took too much time to fill in information, and with heavy customization some users became afraid of the system. One mistake was hard to undo and could trigger a chain reaction of events further down in the organization. They preferred the traditional method of sending documents or Excel files to the other departments and getting face-to-face feedback. Of course, in case of missing information or a mistake this could easily be clarified too.
Conclusion from all the PLM pessimists: PLM is too complex, PLM is hard to implement.
My intermediate conclusion: Good will to improve the company’s business is important, however you need business people to define and lead the implementation.
Example 2: IT in company ABC developed a custom PLM infrastructure for their users, and everyone was happy, till … business changed. Several years ago the users had decided that the standard PLM software was not good enough – some details were not supported, and the standard PLM system could do too much – so IT generously decided to build a complete, nice user environment for their company.
Everybody was happy for three years, till recently, when due to acquisitions and outsourced contracting (engineering and manufacturing), the IT-department had to hire more people to support more and more custom connections and data exchanges. Now, in an overheated state, they are looking for ways to use standard PLM software instead. However, IT does not want to write off the previous investments that easily, the users are not aware of the problems caused by the changing business, and the future PLM decision is again driven by IT and not by business.
Internal conclusion: The IT-department was very helpful for the end users, who appreciated the simple, to-the-point interface – whispering: therefore no change process ever took place to anticipate upcoming strategic changes. The result: a kind of dead end.
My intermediate conclusion: If you are a mid-market company and you are not in software development, stay out of it. Custom development is always temporary and people-dependent (and those people can and will leave at some time).
Just two examples out of many, typical for mid-market companies. I think larger enterprises sometimes demonstrate the same problems too. Good IT-people and a good IT-department are crucial for every company. The challenge is to keep the balance between business and IT. The risk is that, due to the lack of business strategy resources, the IT-department sets the business standard.
Conclusion: PLM is about business change and PLM is not an IT-tool. However a PLM implementation requires good and intensive support from IT. The challenge for every company is that the IT-department often has the most skilled people for a company-wide implementation, however the business drivers and strategy should come from outside.
Your thoughts ???
Recently I noticed two different discussions. One on LinkedIn in the CMPIC® Configuration Management Trends group, where Chris Jennings started with the following statement:
Product Lifecycle Management (PLM) vs CM
An interesting debate has started up here about PLM vs CM. Not surprisingly it is revealing a variety of opinions on what each mean. So I’m wondering what sort of reaction I might get from this erudite community if I made a potentially provocative statement like …
“Actually, PLM and CM are one and the same thing”?
It became a very active discussion and it was interesting to see that some of the respondents saw PLM as the tool to implement CM. Later the discussion moved more towards system engineering, with a focus on requirements management. Of course requirements management is key for CM, you could say CM starts with the capturing of requirements.
There was some discussion about the real definition of PLM, and this triggered my post. Is the definition of PLM secured in a book – and if so, in which book? Historically we have learned that when the truth comes from one book, there is discussion.
But initially in the early days of the PLM, requirements management was not part of the focus for PLM vendors. Yes, requirements and specifications existed in their terminology but were not fully integrated. They focused more on the ‘middle part’ of the product lifecycle – digital mockup and virtual manufacturing planning. Only a few years later PLM vendors started to address requirements management (and systems engineering) as part of their portfolio – either by acquisitions of products or by adding it natively.
For me it demonstrates that PLM and CM are not the same. CM initially had a wider scope than early PLM systems supported, although in various definitions of PLM you will see that CM is a key component of the PLM practices.
Still, PLM and CM have a lot in common – I wrote about this a year ago in my post PLM, CM and ALM; not sexy! – and both are fighting to get enough management support and investment. In the CMPIC group there is another open discussion with the title: What crazy CM quotes have you heard? You could easily use these quotes for the current PLM opinion too. Read them (if you have access) and have fun.
But the same week another post caught my interest: Oleg’s post about Inforbix and Product Data Management. I am aware that other vendors are also working on concepts to provide end users with data without requiring the effort of data management. Alcove9 and Exalead are products with a similar scope – and my excuses to all companies not mentioned here.
What you see is the trend to make PLM simpler by trying to avoid the CM practices that are often labeled as “non-value add”, “bureaucracy” and more negative terms. I will be curious to learn how CM practices will be adhered to by these “New Generation of PDM” vendors, as I believe you need CM to proactively manage your products.
What is your opinion about CM and PLM – can modern PLM change the way CM is done ?
In the past two weeks I had some interesting observations related to the core of PLM. Reading posts and some in-depth discussions with customers lead to the statements below:
Single version of truth ?
First I am going back to the intent of PLM – companies that implement PLM are not just looking for a system where they can store information in a single database. Often the ‘single version of the truth’ story is translated into technology. To illustrate this statement: some weeks ago I was explaining to a medical device company how, in PLM practices, the interaction of requirements, integrated with regulatory compliance verification, speeds up the product development process, as deviations are discovered early in the development stage. The astonishing answer from the customer was: “Yes, we already store this information in our well-known ERP system – so no need for PLM to handle this.”
For this person the conclusion was that once data is stored in a system, it is managed. However what the company never tried was to track each requirement individually (and its possible change) during the engineering process and have a direct connection to regulatory demands.
In that area Excel, people’s knowledge and stored documents were used to collaborate – of course with the late discovery of errors and several extra iterations as a result. As long as this company does not understand that the PLM system is not yet another tool to store data, but an enabler to work differently and more efficiently, these tool-based statements will not bring them further. But nobody gets fired for selecting a well-known ERP system, while trying to change the way people work is a risk, so often the first option is chosen.
And the more conservative the company culture, the more likely this will happen.
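As an aside, the traceability this company never tried – tracking each requirement individually, with a direct connection to regulatory demands – can be sketched in a few lines of code. This is a hypothetical illustration only; the class, field names and the standard reference are invented for the example, not taken from any PLM product:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: names below are invented, not from any real PLM system.
@dataclass
class Requirement:
    rid: str
    text: str
    revision: int = 1
    linked_regulations: list = field(default_factory=list)  # regulation ids
    verified_against: dict = field(default_factory=dict)    # regulation id -> verified revision

    def change(self, new_text):
        # Any change bumps the revision; earlier verifications become stale.
        self.text = new_text
        self.revision += 1

    def unverified(self):
        # Regulations whose verification no longer matches the current revision.
        return [reg for reg in self.linked_regulations
                if self.verified_against.get(reg) != self.revision]

req = Requirement("REQ-042", "Device alarm must sound within 2 s",
                  linked_regulations=["IEC-60601"])
req.verified_against["IEC-60601"] = 1   # verified at revision 1
req.change("Device alarm must sound within 1 s")
print(req.unverified())                 # → ['IEC-60601'] – re-verification needed
```

The point of the sketch is not the code itself, but that a change to one requirement immediately flags which regulatory verifications are stale – something a plain data store cannot tell you.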
Tools do not make a change
In a meeting last week I met a VP of a business group of a real global company. I am stressing the word real, as there are many global companies that actually have one main location where the IP and influence come from – as compared to the real global companies where the knowledge and IP of the company is invented all around the world and spread from there. Although the discussion was on the current status and quality of the tools in use, during the breaks we concluded that, even though the discussion is about tools, the hardest part of implementing PLM in their company is to master the change in ways of working and to motivate the users.
In several blog posts from Oleg (and others) I see the hope that new user interfaces and new ways of handling user data can provide a breakthrough here. I partly agree – in the eighties/nineties we had the single-window terminal screens, which were easy to understand (no multi-tasking, no multiple windows). Slowly the current workforce got used to windows (still no multi-tasking), and the new generation (generation Y) is less and less single-tasking and has different ways of solving issues. New interfaces can contribute to the acceptance of a tool, but if in the end we are still doing the same – storing data in a central system without changing the way we work – nothing has improved.
MBOM in PLM
Another interesting statement from this VP was that they are in the process of bringing all engineering data from the different disciplines into their R&D / PLM environment. Originally it was the ERP system that was used to combine all data coming from the different disciplines. However, the disadvantage was that the product definition resided partly in ERP (there is no concept of a single ERP, as manufacturing differs so much globally) and partly in PLM. Their future plan is therefore to extend the coverage of PLM towards the whole preparation for manufacturing – my favorite topic too: see Where is the MBOM?
Conclusion so far
In day-to-day relations, customers, PLM vendors and implementers talk about functions and features to implement, and about where and how data is stored. The major driver should be the concept of changing the way we work, to be more efficient, more clever and with higher quality. This is not reached by storing data, but by having the right data available at the right moment. And this moment changes when implementing PLM:
- PLM Customers: Make sure that a change in doing business is the target of your PLM implementation – do not look for tools only – and check with your implementer and vendor which experience they have.
- PLM Implementers: Schedule time and activities during the implementation to understand the business change and to help the customer adapt. It requires a different type of skill, but one just as important.
- PLM Vendors: You have a hard time – as everyone is talking about the tools, you do not want to talk about the changes PLM implies. A pity, but most customers do not want to hear this side during their PLM selection process.
In the past months, I have talked and worked with various companies about the topic of Asset Lifecycle Management (ALM) based on a PLM system. Conceptually it is a very strong approach, yet so far only a few companies have implemented it, as PLM systems have not been used much outside the classical engineering world.
Why use a PLM system?
Using a PLM system for managing all asset-related information (asset parameters, inventory, documents, locations, lifecycle status) in a single system assures the owner/operator that a ‘single version of the truth’ starts to exist. See also one of my older posts about ALM to understand the details.
The beauty lies in the fact that this single version of the truth concept combines the world of as-built for operators and the world of as-defined / as-planned for preparing changes. Instead of individual silos the ALM system provides all information, of course filtered in such a way that a user only sees information related to the user’s role in the system.
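To make the filtering idea concrete, here is a minimal sketch of role-based views over a single data store. All names (roles, lifecycle states, records) are invented for illustration and do not come from any specific ALM/PLM system:

```python
# Hypothetical sketch of role-based filtering over one shared data store;
# the roles, fields and records below are invented for illustration.
ASSET_RECORDS = [
    {"tag": "P-101", "state": "as-built", "doc": "pump datasheet"},
    {"tag": "P-101", "state": "as-planned", "doc": "upgrade proposal"},
]

ROLE_VIEWS = {
    "operator": {"as-built"},                # operators see the plant as it runs
    "engineer": {"as-built", "as-planned"},  # engineers also prepare changes
}

def visible(role):
    # One store, many views: filter per role instead of duplicating the data.
    return [r for r in ASSET_RECORDS if r["state"] in ROLE_VIEWS[role]]

print([r["doc"] for r in visible("operator")])  # → ['pump datasheet']
```

The design point is that both worlds live in the same store; the operator and the engineer simply see different filtered views of it, instead of maintaining separate silos.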
The challenge for PLM vendors is to keep the implementation simple, as PLM in its core industries was initially about managing complexity. Now the target is to keep it extremely simple and easy to use for the various user roles, meanwhile staying away from heavy customizations, to deliver the best Return on Investment.
Having a single version of the truth provides the company with a lot of benefits to enhance operations. Imagine you find information and from its status you immediately know whether it is the latest version and whether other versions exist. In the current owner/operator world, information is often stored and duplicated in many different systems, and finding the information in one system does not mean it is the right information. I am sure the upcoming event from IDC Manufacturing Insights will also contribute to these findings.
It is clear that historically this situation has been created by the non-intelligent interaction with the EPC contractors building or changing the plant. The EPC contractors use intelligent engineering software, like AVEVA, Bentley, Autodesk and others, but during hand-over we still provide dumb documents: paper-based, TIFF, PDF or some vendor-specific formats that will become unreadable in the upcoming years. This is often considered the only way to secure data for the long term, as neutral standards like ISO 15926 still require additional vision and knowledge from the owner/operator to implement.
Now back to the discussions…
Many discussions with potential customers went in the same direction:
“How do we get the management excited and motivated to invest in this vision? The concept is excellent, but applying it to our organization would lead to extra work and costs without immediate visibility of the benefits!”
This is an argument I partly discussed in one of my previous posts: PLM, CM and ALM not sexy. And this seems to be the major issue in western Europe and the US. Business is monitored and measured for the short term, at most with a plan for the next 4-5 years. Nobody is rewarded for a long-term vision, and when something severe happens, the current person in power takes the blame or excuses himself.
As a Dutch inhabitant, I am still proud of what our former Dutch government decided and did after the flooding of 1953. The Dutch invested a lot of money and brainpower in securing the inhabitants behind the coastline, in a project called the Delta Works. This was an example of vision instead of shareholder value. After the project was finished in the eighties, the risk of severe flooding was gone, and the lessons learned brought the Dutch the knowledge to support other nations at risk of flooding. I am happy that in 1953 the government was not in the mood to optimize their bonus (an unknown word at that time).
Back to Asset Lifecycle Management ….
Using a PLM system for asset lifecycle management provides economic benefits through fewer errors during execution (working on the right information), less human effort in understanding the information (lower labor costs) and a lower total cost of ownership (fewer systems for IT to maintain and connect).
But these benefits are nothing compared to risk containment. What happens if something goes really wrong?
If you are a nuclear plant owner, you are in global trouble. A chemical plant owner or oil company may be in regional trouble, but they will also suffer from the damage done to their brand name globally. Other types of plant owners might get away with less, depending on the damage they can potentially ‘embank’.
The emerging visionaries
For that reason, it is enlightening to see that some companies in Asia think differently. There the management understands that they have the opportunity to build their future in a more clever way. Instead of copying the old way EPC contractors and plant owners work together, they start from a single-version-of-the-truth concept, pushing their contractors to work with them in a more integrated and clever manner. Instead of becoming boiling frogs, they avoid falling into the same trap as many owners/operators in European and US-based companies: “Why change the way we work? It does not seem to be so bad.”
It requires a long-term vision, something that will bring extra benefits in the long-term future: more efficient management of their assets, including risk containment, and therefore being more competitive. If European and US-based companies want to remain dominant in this industry, they will need to show their vision too.
Tomorrow I am attending the European Chemical Manufacturing Masters conference in Berlin, where I hope to learn and discuss this vision with the participants. I will keep you updated if I find the vision …
This week I was happy to participate in the PLM INNOVATION 2011 conference in London. It was an energizer, which, compared to some other PLM conferences, makes the difference. The key to its success, in my opinion, was that there was no vendor dominance, and that participants were mainly discussing their PLM implementation experiences, not products.
In addition, as each session was approximately 30 minutes long, speakers were forced to focus on their main highlights instead of going into details. Between the sessions there was ample time to network or hold prescheduled meetings with other participants. This formula made it an energizing event for me, as every half hour you moved on to the next experience.
In parallel, I enjoyed and experienced the power of modern media. Led by Oleg, a kind of parallel conference took place on Twitter around the hashtag #plminnovation2011. There I met and communicated with people in the conference (and outside it), and I felt sorry I was not equipped with all the modern media (iPhone/iPad-type equipment) to interact more intensively during these days.
Now some short comments/interpretations on the sessions I was able to attend:
Peter Bilello, president of CIMdata, opened the conference in the way we have come to expect from CIMdata, explaining the areas and values of PLM, the statistics around markets, the major vendors and positive trends for the near future. Interesting was the discussion around the positioning of PLM and ERP functionality and the coverage of these functionalities by PLM and ERP vendors.
Jean-Yves Mondon, EADS’ head of PLM Harmonization (Phenix program), illustrated with extracts from an interview with their CEO Louis Gallois how EADS relies on PLM as critical for its business and wants to set standards for PLM in order to achieve the most efficient interoperability of tools and processes coming from multiple vendors.
Due to my own session and some one-to-one meetings, I missed a few parallel sessions in the morning and attended Oleg Shilovitsky’s session on the future of engineering software. Oleg discussed several trends, and one of the trends I also see as imminent is the fact that the PLM world is changing from databases towards networks. It is not about capturing all data inside one single system, but about being able to find the right information through a network of information carriers.
This also fits very well with the new generation of workers (Generation Y), who have learned to live in this type of environment and collect information through their social networks.
The panel discussion, with three questions for the panelists, could have been a little better if the panelists had had time to prepare some answers, although some of the improvisations were good. I guess the audience chose Graham McCall’s response to the question “What will be the next biggest disappointment?” as the best: he mentioned the next ‘big world-changing’ product launch from a PLM vendor.
Then I followed the afternoon session from Infor, called Intelligent PLM for Manufacturing. The problem I had with this session (and I often have this with vendor sessions) was that Venkat Rajaj did exactly what most vendors do wrong. They create their own niche definition – Product Lifecycle Intelligence (is there no intelligence in PLM?) – claim to be the third-largest software company (where are they on CIMdata’s charts?) and then present a lot of details on product functions and features. Although the presentation was smooth and well delivered, the content did not stick.
A delight that day was the session from Dr. Harminder Singh, associate fellow at Warwick Business School, about managing the cultural change of PLM. Harminder does not come from the world of software or PLM, and his outsider perspective created a particular atmosphere for those in the audience who consider cultural change an important part of PLM. Here we had a session inspired by a theme, not by a product or concept. I was happy to have a longer discussion with Harminder that day, as I also believe PLM has to do with culture change – it is not only technology and management push, as we would say. Looking forward to following up here.
The next day we started with an excellent session from Nick Sale from Tata Technologies. With a Nano in the lobby of the conference, he presented all the innovation and rationalization related to the Nano car, and one of his messages was that we should not underestimate the power of innovation coming from India. An excellent sponsor presentation, as the focus was on the content.
In the parallel track I was impressed by how Philips Healthcare implemented their PLM architecture with three layers.
Gert-Jan Laurenssen explained they have an authoring layer, where they do global collaboration within one discipline; a PDM layer, where they manage the interdisciplinary collaboration, which in the case of Healthcare is of course a mix of mechanical, electrical and software; and above these two layers they connect to the layer of transactional systems that need the product definition data. Their implementation speed was impressive, surely due to some of the guidelines Gert-Jan shared – see Oleg’s picture of his slide here. Unfortunately I did not have the time to discuss further with Gert-Jan, as I am curious about the culture change and the amount of resources they have in this project. An interesting observation was that the project was driven by IT managers and engineering managers, confirming the trend that PLM is becoming more and more business-focused instead of IT-focused.
Peter Thorne from Cambashi brought, in his session called Trends and Maximizing PLM Investments, an interesting visual historical review of engineering software investments, using Google Earth as the presentation layer. It was impressive to see the trends visualized this way, and scary to see that Europe is not really a major area of investment and growth.
Keith Connolly explained in his session how S&C Electric integrated their PLM environment with ERP. Everything sounded so easy and rational, but as I have known the guys from S&C for a long time, I know it is the result of having a clear vision and working for many years towards implementing it.
Leon Lauritsen from Minerva gave a presentation on Open Source PLM, and he did an excellent job explaining where Open Source PLM could or should become attractive. Unfortunately his presentation quickly went in the direction of Open Source PLM equals Aras, and he continued with a demo of Aras capabilities. I would have preferred a longer presentation on the Open Source PLM business model instead of spending time looking at a product.
I believe Aras has huge potential, for sure in the mid-market and perhaps beyond, but I keep coming back to the experience I also had with SmarTeam: an open and easy-to-install PLM system with a lot of features is a risk in the hands of IT people with no focus on business. Without proper vision and guidance (coming from ????? ) it will again become an IT project, cheaper to the outside world (as internal investments are often not so visible), but achieving the real PLM goals depends on how you implement.
After lunch we really reached the speed of light with David Widgren, who gave us insight into data management at CERN. Their challenge – somehow a single ‘product’, the accelerators and all their equipment, plus a long lifecycle (20 years of development before becoming operational), outliving all technologies and data formats – requires them to think constantly about pragmatic data storage and migration. In parallel, as the consumers of the data are not familiar with the complexity of IT systems, they build lots of specific interfaces for specific roles to provide the relevant information in a single environment. Knowing that a lot of European funds go there, David is a good ambassador for CERN, explaining in a comic manner that he works at the coolest place on Earth.
The last session I could attend was Roger Tempest’s, around data management. Roger is a co-founder of the PLMIG (PLM Interest Group), which strives for openness, standards and interoperability for PLM systems. I was disappointed by this session, as I was not able to connect to the content. Roger seemed to be presenting his axioms; I had the feeling he would come down from the stage with his ten commandments. I would be interested to understand where these definitions came from. Are they a common understanding, or just another set of definitions coming from yet another direction, and what is the value or message for existing customers using particular PLM software?
I missed the closing keynote session from John Unsworth from Bentley. I learned later that this was also an interesting session, but I cannot comment on it.
My conclusion:
An inspiring event, both due to its organization and agenda, and thanks to the attendees who made it a real PLM-centric event. I cannot wait for 2012.