Early this year, Dassault Systèmes (DS) announced its future strategy, called 3DExperience, and only now, after its two major events, the 3DExperience forums in the US and Europe, has the discussion started around what 3DExperience actually means.
In February, I thought 3DExperience was just a new marketing approach from DS to differentiate itself from other vendors. A little more 3D; PLM has a bad connotation, and as some of the newcomers have redefined what PLM is, it made sense to be different again.
One of my fellow European bloggers, Yoann Maingon, started a discussion with his provoking blog post: Different Marketing Strategies And Naming in PLM. Read the post, and especially the comments from Jim Brown and Joe Barkai, who bring perspective to it. The post also got traction in some closed LinkedIn PLM groups, and it was interesting to observe that the different interpretations of PLM somehow created the same feeling I have with religion.
Stick with the book and the definition of PLM and complete your portfolio, was the message. But which book, and what is PLM? Even if I wrote the book "The Truth about PLM", who would consider it the authority?
We have learned from religion that concepts based on a book can lead to wars.
I am sure PLM will not go in that direction. Still, it remains important not to focus on the definition of PLM; in the end you want your customers (current and future) to be more efficient, more innovative and more profitable. And to achieve that, you need to look at the whole process, from market interaction through delivery to the customer.
For that there are many tasks to perform in a company. During your sales process, you need to make sure you address the demands of the customer or market in the best possible manner, so you can differentiate yourself from others. That differentiation can be based on your track record (best in class since 1845), your price (always the cheapest, as you manage your processes efficiently) or the experience you offer (the price plus the good feeling it gives justifies the decision).
What has become clear in the past ten years is that we are in a global, changing market, and especially traditional companies struggle to make a future-oriented change. Customer loyalty was in the past based on the fact that you were in the same region, later the same country, but now everyone is shopping or sourcing around the world. Traditional markets and businesses are no longer secure.
So companies have to change, and in my opinion one of the most important changes they have to go through is managing all information in their company (and from outside) in a shared manner. Products can no longer be defined without taking into account feedback and interaction from the market. Trends (positive or negative) related to your company or products need to be followed, as they can kill or hype a product or your company.
To realize this change, a company needs to start working differently, and this ultimately leads to the need for different tools to support the changing processes. Here I see PLM systems coming into the picture, and here there are two approaches: will you select a single vendor with the richest PLM platform, or will you integrate a set of best-in-class applications? As we saw in the Tech4PD discussion, there is no ultimate answer here.
I see the 3DExperience strategy from DS in this light. The classical scope of PLM tools and practices does not provide a base for current and future markets. The solution is bigger than tools; it is the focus on the total experience (I could not find a better name either).
It is a way to become attractive to your customers: not focusing only on the product, but also on the way you can influence potential customers to choose your product or service above others. DS calls this the new era of 3DExperience; others will market it differently.
In a consumer market we select products based on experience. Has anyone ever tried to justify the purchase of an iPad as an affordable device they need for their work?
It is the experience.
There is one thing I dislike about the 3DExperience approach: blogging becomes expensive, as typing the word 3DExperience – a mix of numbers and characters – slows down my efficiency. I prefer 3DE or 3DX as the most efficient set of keystrokes for this TLA.
In my opinion, PLM is not dead at all for DS. They just market the bigger picture differently, to stand apart from the classical PLM platforms. All PLM vendors have their unique marketing approach. Companies need to define their next step to remain in business, and they are afraid of the old PLM, due to the horror stories – complexity, expense, etc.
Is it selling experiences, or is it perhaps making sure a new generation of workers will be motivated to work for your company? It remains a mix of classical PLM functionality, but big data, social media and more interactive and friendly interfaces are also expected.
Finally, one observation from the 3DExperience forum in Brussels, where I believe they could have done a better job. Usually when customers and prospects go to this kind of event, they want to hear that they have chosen the right software provider. So it should be a mix of assuring them they are not alone (many others have chosen our solutions) and exciting them with the vendor's future vision. Here, the message that all companies need to sell experiences in the future or otherwise become a commodity created a bad mood. Fear does not push people to change; it paralyzes them.
Conclusion: Dassault Systèmes' new 3DExperience is understandable as a way to introduce a bigger picture than PLM alone. Whether every company needs THE EXPERIENCE approach remains to be seen. In addition, I believe DS still needs to work on more understandable examples where the 3DE approach is a differentiator. For sure there is PLM inside.
A week after the PLM Innovation conference in the US, I finally have time to write down my impressions. It was the first time this event was organized in the US, after successful events in Europe over the past years. For me it was a pleasure to meet some of my PLM friends in person, as most of my activities are in Europe.
With an audience of approximately 300 people, there were a lot of interesting sessions. Some of them ran in parallel, but as all sessions were recorded, I will soon catch up with the ones I missed.
My overall impression of the event: loud and positive, which is perhaps a typical difference between the US and Old Europe.
Here are some impressions from the sessions that caught my attention.
Kevin Fowler, Chief Architect Commercial Airplanes Processes and Tools at The Boeing Company, presented the PLM journey Boeing Commercial Airplanes (BCA) went through. Their evolution path is very similar to the one Siemens and Dassault Systèmes went through (driven by Boeing's challenges).
Impressive was the number of parts that need to be managed per aircraft (up to a billion), with all their related information. Interesting to see that the number of parts for the 787 has decreased strongly.
After PLM Generation 1, based on Teamcenter, and Generation 2, based on Dassault, Kevin demonstrated that functionality and cost of ownership increased due to more complexity, while usability evidently decreased.
This will be a serious point of attention for Generation 3, the PLM system BCA will be selecting for 2015 and beyond: usability has to increase.
As we were among all the PLM vendors and customers, the breaks were filled with discussion about which PLM vendor would be the preferred next partner. I had a discussion about PLM vision and visibility with one of the SAP partners (DSC Software Inc.). He is convinced that SAP provides one of the best PLM platforms. I am not convinced, as I still see SAP as a company that wants to do everything, starting from ERP, and as long as their management and websites do not reflect a PLM spirit. In 2015 I might be proven wrong in my impression that PLM, usability and SAP are not connected.
Note: browse to this SAP PLM rapid-deployment solution page and view the Step by Step guide. There the heading becomes SAP CRM rapid-deployment solution. A missing link, marketing, or do they know the difference between PLM and CRM?
Next, Nathan Hartman from Purdue University described his view on future PLM, which will be model-based, and he presented how PLM tools could work together, describing a generic architecture and interfaces. This is somehow the way the big PLM vendors describe their platforms too, only in their case in a more proprietary environment.
Nathan gave an interesting anecdote related to data sharing. As an example, he mentioned a 3D model built by one student; he then asked another student to make modifications to it. This was already a challenge: even working with the same software led to knowledge issues when trying to understand how the model was built. It demonstrates that PLM data sharing is not only about having the right format and application; the underlying knowledge also needs to be exposed.
Monica Schnitger, as a business analyst, presented her thoughts on PLM justification. Where in Munich I presented the Making the case for PLM session, Monica focused on a set of basic questions that you, as a company, need to ask, and on how you can justify a PLM investment. It is not for the big kids anymore, and you can find her presentation here (with yet another PLM definition).
I liked the approach of keeping things simple, as sometimes people make PLM too complex (also because it serves their own businesses). Monica argued that a company should define its own reasons for why and how to do PLM. Here I have a slightly different approach. Often mid-market companies do not want PLM; they have pains they want to get rid of, or problems they want to solve. Starting from the pain, and with guidance from a consultant, companies will understand which PLM practices they could use and how these fit into a bigger picture, instead of using plasters to cover the pain.
Beth Lange, Chief Scientific Officer at Mary Kay, presented how her organization, operating from the US (Texas), manages a portfolio of skin care products sold around the world by an independent local sales force. To do this successfully and meet all the local regulatory requirements, they implemented a PLM system in which a central repository of global information is managed.
The challenge for Mary Kay is that, as a company originally focused on skin care products with an indirect sales force, where the sales person sometimes has no IT skills, this project was also a big cultural change. Beth explained that the support from Kalypso was indeed crucial to manage the change, something I believe is always crucial in a global PLM project, where the ideal implementation is so different from the current, mostly isolated practices.
As regulatory compliance is an important topic for skin care products, Beth explained that due to the compliance rules for China, where they have to expose their whole IP, the only way to protect that IP was to put a patent on everything, even on changes.
Would NPI mean New Patent Introduction in the CPG market?
Ron Watson, Director, PLM COE and IT Architecture at Xylem Inc., presented their global PLM approach. As the company is relatively young (2011) but a collection of businesses from all around the world, they face the challenge of operating as a single company and sharing the synergy.
Ron introduced PDLM (Product Data Lifecycle Management) and explained that the first focus was on getting all data under control and making it the single source for all product data in a digital format, preferably with a minimum of translation needed.
Here you see that Xylem has chosen an integrated platform and not best-of-breed applications. Once the product data is under control, the focus can shift to standardizing processes across the whole company, something other companies that have followed this approach confirm brings huge benefits.
As it was a PTC case study, Graham Birch, senior director of Product Management at PTC, did the closing part, unfortunately by demoing some pieces of the software. A pity, as I believe people are not impressed by seeing data they recognize on a screen. Only when there is a new paradigm to demonstrate, related to usability, would I be interested.
And as if they had read my mind, Daniel Armour from Joy Global demonstrated the value and attractiveness of 3D visualization tools in their organization. Joy Global is a manufacturer of some of the biggest mining equipment, and he demonstrated how 3D visualization can be used in the sales and marketing process, but also during training and the analysis of work scenarios.
His demonstration showed again that 3D as a communication layer is attractive and appeals to the user (serious gaming in some cases).
As it was an SAP case, I was surprised to hear the words of Brian Soaper, explaining the power of 3D for SAP users and how they will benefit from better understanding, higher usability, etc. It was as if a 3D CAD/PLM vendor was talking. Was this a dream?
I woke up from this dream when someone in the audience asked Daniel how they keep the visualizations up to date: is there some kind of version management? Daniel mentioned there currently is not, but that you could build a database to perform check-in/check-out of the data. Apparently all the 3D we had seen is not connected to the single database SAP always promotes.
Peter Bilello, CIMdata's president, had a closing session titled: Evaluating the tangible benefits from PLM can prove complex, which indeed is true. Peter's presentation was partly similar to the one he gave earlier this year in Munich, and some people in the audience mentioned that it is often the same story, that many of the issues Peter presented are somehow known facts. This is exactly what I like about CIMdata: PLM does not change per conference or per new IT hype. If you want to understand PLM, you need to stick to the purpose and meaning of PLM. And these known facts are apparently not so well known; the majority of PLM projects are executed or led by people who decided to reinvent the wheel, as inventing the wheel seems cheaper than renting one, and this again leads to issues later that every experienced consultant could foresee.
The evening, with a champagne reception on a paddle boat touring the lake and a dinner at the lakeside, concluded the first day.
The combination of presentations, scheduled network meetings and enough networking time made it a successful first day.
The next day I started with a BOM management think tank, where the target was to arrive at some common practices and a shared understanding of BOM management. As the number of participants was large and the time was short, we only had a chance to scratch the surface of the cases brought in.
What was clear to me from this session is that most of the reported challenges arose because the tools were already in place, and only afterwards did the PLM team (mostly engineering people) have to struggle to turn them into a consistent process. They do not get real help from PLM vendors or implementers, as their focus is on selling more tools and services.
What is missing for these organizations is a PLM for Dummies methodology guide, one that is business-centric instead of technology-centric. For sure there are people who have published PLM books, but either they are not found or not considered relevant. And as nothing comes for free, these companies try to reinvent the wheel. PLM is serious business.
The first keynote speech of the second day was by Dantar Oosterwal, Partner and President of the Milwaukee Consulting Group, who inspired us with Lean and PLM: Operational Excellence, all related to his experiences at Harley-Davidson.
It was interesting how he described the process of focusing on throughput to get market results. There are various parameters with which you can influence market share: a price strategy, increased marketing; but the biggest impact on Harley-Davidson's sales results came from innovation, with more model variants offering more choice to potential customers. By measuring and analyzing the throughput of the organization, an optimal tuning could be found.
Dantar also shared an interesting anecdote about an engineer who had to study the impact of ethanol as a fuel for a certain engine. After some time, the engineer came back with the answer: yes, we can. He answered the question but left no knowledge behind. When a similar question about performance was asked of a supplier, the supplier came back with an answer plus graphs explaining what the answer was based upon. That answer created knowledge, as it could be reused for similar questions. It is a good example of how companies should focus on collecting knowledge in their PLM environment instead of just answers to questions.
The second keynote speaker came from the world's biggest brand: Christopher Boudard, PLM Director at The Coca-Cola Company. With its multiple brands and global operations, it is a challenge to work towards a single PLM platform. He explained that at this stage they are still busy loading data into the system, where a lot of time is spent on data cleansing, as the system only has value when the data is clean and accurate.
And it requires a lot of motivation from the PLM team to keep executive management involved in, and sponsoring, a project that takes five years to consolidate the data, and only then, through the right processes, ensures that the data remains correct.
Christopher demonstrated in a passionate manner that leadership is crucial for such a project to be successful and implemented. For me as a European, it was interesting to see that the PLM Director of the world's biggest brand is a French citizen inspiring the management of such an American company.
Monica Schnitger conducted an interesting session about the state-of-the-state of multi-platform PLM.
If this title puzzles you: it was a debate between the PLM vendors (Aras, Autodesk, Dassault Systèmes, PTC, SAP and Siemens) about openness, interoperability, cloud and open source.
After Monica's first question about the openness of each vendor's system, it was clear there are no problems to expect in the future: all systems were extremely open, according to the respondents, and I somewhat lost my attention for the debate, as I had the feeling I was listening to an election debate. Monica did her best to keep the discussion unbiased; however, when some people want to make a specific point and use every question to jump to it, it becomes an irritation.
Chad Jackson, this time dressed up as the guy who always gets killed in the first five minutes of a Star Trek episode, shared with us the early findings of the 2012 State of PLM survey. Tech4PD followers – and who is not a follower – understood he had lost the bet of the second episode.
Chad, let me know if this picture needs to be removed, as it could kill your future career.
The preliminary findings Chad shared with us were that manufacturing and service are significantly interested consumers of PLM data, but do not consider it their own data, to which they also have to contribute. The fact that the data is available makes them use it, yet these departments do not actively participate in PLM. Somehow this confirms the observation that PLM is still considered an engineering tool, not an enterprise-wide platform.
As the initial group of participants (n = 100) is small and not randomly selected from the overall population, the question remains what the state of PLM in 2012 really is. I assume Chad will come back to that at a later time.
The last plenary sessions, David Karamian from Flextronics and Michael Grieves with A Virtual Perfect Future, had the ungrateful position of being the final two speakers of the event. I will have to review David's presentation again, as it was not easy to digest and to recall a week later what his highlights were. Michael's presentation was easier to digest, and I also believe that with the new upcoming concepts and technology, the virtual perfect future is within reach.
Looking back on a successful event, where I met many of my PLM peers from across the ocean, I will take the upcoming weeks to review the sessions I missed. Final good news for all PLM mind sharers: CIMdata and MarketKey announced the coordination of their upcoming events next year – more content and more attendees guaranteed.
The past three weeks I had time to observe some PLM vendors' marketing messages (Autodesk being the major newbie). Some of these messages led to discussions in blogs or (LinkedIn) forums. Always a good moment to smile and think about reality.
In addition, the sessions from PLM Innovation 2012 became available to the attendees (thanks MarketKey – good quality), and I had the chance to see the sessions I had missed. On my wish list was The Future of PLM Business Models, moderated by Oleg, as according to Oleg some interesting viewpoints came up there. This relates to my post where I mentioned the various definitions of PLM.
All the above inspired me to write this post, which made me realize we keep pushing misconceptions around PLM into our customers' minds, with the main goal to differentiate.
I will address the following four misconceptions. The last one is probably not a surprise, therefore it comes in the last position. Still, it is sometimes taken for granted.
- PLM = PLM
- On the cloud = Open and Upgradeable
- Data = Process Support
- Marketing = Reality
1. PLM = PLM
It is interesting to observe that the definition of PLM is becoming more and more a marketing term instead of a common definition that applies to all.
Let me try to formulate once more a very generic definition that captures most of what PLM vendors aim to do.
PLM is about connecting and sharing the company's intellectual property through the whole product lifecycle. This includes knowledge created in the concept phase, going through the whole lifecycle until a product is serviced in the field or decommissioned.
Experiences from the field (services / customers / market input) serve again as input for the other lifecycle phases, to deliver a better or more innovative product.
Innovation is an iterative process. And it is not only about storing data; PLM also covers the processes of managing that data, especially the change processes. Sharing data is not easy. It requires a different mindset: data is not only created for personal or departmental usage, but should also be findable and extendable by other roles in the organization. This all makes it a serious implementation, as aligning people is a business change, not an IT-driven approach.
Based on this (too long) high-level PLM definition, it does not follow that you cannot do PLM without a PLM system. You might also have a collection of tools that together provide complete coverage of the PLM needs.
Oleg talks about DIY (Do It Yourself) PLM, and I have seen examples of Excel spreadsheets managing Excel spreadsheets and email archives. The challenge I see with this type of PLM implementation is that after several years it becomes extremely difficult for a company to change. Possible reasons: the initial gurus no longer work for the company, and new employees need years of experience to find and interpret the right data.
A quick and simple solution can become a burden in the long term if you analyze the possible risks.
Where in the early years PLM was mainly a Dassault Systèmes, Siemens and PTC driven approach with deep CAD integrations, in later years other companies like Aras, and now Autodesk, started to shift the focus from classical PLM to managing enterprise metadata, a similar approach to what SAP PLM is offering. Deep CAD integrations are the most complex parts of PLM, and by avoiding them you can claim your system is easier to implement, etc., etc.
A single version of the truth is a fancy PLM expression. It would be nice if this were also valid for the definition of PLM. The PLM Innovation 2012 session on the future of PLM business models demonstrated that the vendors in the panel discussion had completely different opinions about PLM. So how can people inside a company explain to their management and others why they need PLM, and which PLM they have in mind?
2. On the cloud = Open and Upgradeable
During the panel discussion, Grant Rochelle from Autodesk mentioned the simplicity of their software and how easily it will be upgradeable in the future. He also referred to Salesforce.com as a proof point: they provide online updates of the software without the customer having to do anything.
The above statement is true as long as you keep your business coverage simple and do not anticipate changes in the future. Let me share an analogy with SmarTeam and how it started in 1995.
At that time, SmarTeam was insanely configurable. The Data Model Wizard contained several PDM templates, and within hours you could create a company-specific data model. A non-IT-skilled person could add attributes and data types, anything they wanted, and build the application, almost the same as Autodesk 360. The only difference: SmarTeam was not on the cloud, but it ran on Windows, a revolution at that time, as all serious PDM systems were Unix-based.
The complexity came, however, when SmarTeam started to integrate deeply with CAD systems. These integrations created the need for a more standardized data model per CAD system. And as SmarTeam R&D was not aware of each and every customer's implementation, it became hard to define a common business logic in the data (and to remain easily upgradeable).
I foresee similar issues with the new cloud-based PLM systems. They seem very easy to implement (add what you want – it is easy), and as long as you do not integrate with other systems it remains safe. Integrating with other and future systems requires either a common data definition (which most vendors do not like) or specific integrations, at the cost of upgradability.
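To make this tension concrete, here is a minimal sketch in Python – with invented class and attribute names, not any vendor's actual API – of what happens when a freely configurable data model meets an integration that expects one agreed definition:

```python
# A freely configurable data model, as the SmarTeam Data Model Wizard
# (or a cloud PLM tenant) allows: every customer defines its own attributes.
# All names here are hypothetical.

class Item:
    def __init__(self, **attributes):
        # No fixed schema: each customer stores whatever fields it defined.
        self.attributes = dict(attributes)

# Customer A and customer B each configured their own data model.
item_a = Item(part_number="PN-1001", weight_kg=2.5)
item_b = Item(artikel_nr="A-778", gewicht=2.5)  # same concepts, other names

def export_to_cad_integration(item):
    # A CAD (or ERP) integration needs ONE agreed definition.
    # It works for customer A and breaks for customer B.
    return {
        "part_number": item.attributes["part_number"],  # KeyError for item_b
        "weight_kg": item.attributes["weight_kg"],
    }

print(export_to_cad_integration(item_a))    # works fine
# print(export_to_cad_integration(item_b))  # KeyError: 'part_number'
```

Either the vendor enforces a common data definition up front, or every integration becomes customer-specific and has to be revisited at every upgrade – exactly the SmarTeam lesson.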
In the beginning, everything is always possible with a well-defined system. But be aware: looking back in history, every ten years a disruptive wave comes in, changing the scope and the upgradability.
And to challenge the cloud-based PLM vendors: in the generic definition of PLM I shared above, PLM also integrates design data.
3. Data = Process Support
Another misconception, originating from the early days of PLM, is the idea that once your system supports specific data, you also support the process.
First example: items defined in ERP. When engineers started to use a PDM system and to define new items there, challenges arose. I had many discussions with IT departments that did not need or want items in PDM. ERP was the source for an item, and when a designer needed a new item, (s)he had to create it in ERP, so there would be a single definition of the item.
Or the designer had to request a new item number from the ERP system. And please do not request numbers too often, as we do not want to waste them, was the message.
Ten years later this looks like a joke, as most companies have an integrated PDM/ERP process and understand that the initial definition of a new item comes from PDM; at a certain maturity stage, the item is shared with (and completed by) the ERP system. It is clear that the most efficient way to create a new item is through PLM, as the virtual definition (specs / CAD data) also resides there and the information is handled in that context. A minimal sketch of this handover follows below.
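Here is that integrated process as a small Python sketch – the lifecycle states, field names and handover rule are hypothetical, just to illustrate that the item is born in PLM and only shared with ERP once it matures:

```python
from dataclasses import dataclass

# Hypothetical lifecycle states; the handover rule is the point, not the names.
RELEASE_STATES = ["concept", "in_work", "released"]

@dataclass
class PlmItem:
    item_id: str
    description: str
    cad_reference: str      # the virtual definition (specs / CAD) lives in PLM
    state: str = "concept"

erp_items = {}              # stand-in for the ERP item master

def promote(item: PlmItem) -> None:
    """Move the item to its next lifecycle state; share it with ERP on release."""
    item.state = RELEASE_STATES[RELEASE_STATES.index(item.state) + 1]
    if item.state == "released":
        # Handover: ERP receives the matured item and completes it with
        # logistics data (cost, supplier, lead time, ...).
        erp_items[item.item_id] = {"description": item.description}

item = PlmItem("10-0042", "Bracket, stainless", "bracket_v3.CATPart")
promote(item)               # concept -> in_work: still PLM-only
promote(item)               # in_work -> released: now visible in ERP
print(erp_items)            # {'10-0042': {'description': 'Bracket, stainless'}}
```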
A second, more current example is the fact that compliance is often handled in ERP. It is correct that when you manufacture a product for a specific target market, you need to have the compliance information available.
However, would you do this in your ERP system, where you are late (almost at the end of the design lifecycle), or is it more logical to verify and check compliance at all times during your design stages? The process works much more efficiently, and with a lower cost of change, when done in PLM, but most companies still see ERP as their primary IT system and PLM as an engineering tool.
Finally on this topic, a remark to the simplified PLM vendors. Having the ability to store, for example, requirements in your system does not mean you support a complete requirements management process. It is also about the change and validation of requirements, which should be integrated for the relevant roles during product definition (often CAD) and validation. As long as the data is disconnected, there is not such a big advantage compared to Excel; the sketch below shows the difference.
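To illustrate, a minimal Python sketch (invented names, no particular vendor) of the difference between storing a requirement and supporting the process: the requirement carries links to the design objects that fulfil it, and a change automatically invalidates earlier validation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    req_id: str
    text: str
    revision: int = 1
    fulfilled_by: List[str] = field(default_factory=list)  # links to CAD/parts
    validated: bool = False

    def change(self, new_text: str) -> None:
        # A change bumps the revision and invalidates earlier validation,
        # flagging the linked design objects for re-verification.
        self.text = new_text
        self.revision += 1
        self.validated = False

req = Requirement("REQ-007", "Max operating temperature 85 C",
                  fulfilled_by=["housing_v2.CATPart"])
req.validated = True
req.change("Max operating temperature 95 C")
print(req.revision, req.validated)  # 2 False -> re-validation is required
```

In a disconnected spreadsheet, the text would simply be overwritten, and nobody downstream would know the validation had become stale.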
4. Marketing = Reality
In The Future of PLM Business Models session, Oleg showed a slide with the functional architectures of the major PLM vendors. In the diagrams everything seems connected as a single system, but in reality this is usually not the case.
As certain components / technologies are acquired, they provide the process coverage, and only in the future can you imagine it all working in an integrated way. You cannot blame marketing for doing so, as their role is to position their products in the most appealing way so that customers will buy them. Without marketing, perhaps no one would buy a PLM system once they understood the details.
Autodesk, as a newcomer in PLM, has a strong background in marketing. This is understandable as, similar to Microsoft, their main revenue comes from selling a large volume of products, whereas the classical PLM vendors often combine software with services and business change, and therefore have a different price point.
When Autodesk introduced AutoCAD in the eighties, it was a simple, open 2D CAD environment able to run on a PC. Autodesk's statement at that time: "We provide 80 percent of the functionality for 20 percent of the price."
Does this sound familiar nowadays?
As AutoCAD was a basic platform allowing customers and resellers to build their own solutions on top of it, it became Autodesk's mid-market success.
The challenge with Autodesk PLM 360 is that, although the same logic seems to make sense, I believe the difficulty is not in the flexible platform. The difficulty comes in the future, when people want to do more complex things with the system, like integrations with design or enterprise collaboration.
At that point you need people who can specify the change, guide the change and implement the change. And this is usually not a DIY job.
Autodesk is still learning to find the right PLM messages, I noticed recently. When attending the Autodesk PLM session during PLM Innovation 2012 (end of February), one of their launching customers, ElectronVault, presented their implementation – it took only two weeks!!! Incredible.
However, reading Rob Cohee's blog post at the end of March, he mentions ElectronVault again. Quote:
ElectronVault was searching for something like this for over two years and after 6 weeks they have implemented Project Management, EBOM, MBOM, and starting on their APQP project. Six Weeks!!!
As you see, four weeks later the incredible two weeks have become six weeks, and again everything is implemented. Still incredible, and I look forward to meeting ElectronVault in the future, as I believe they are a typical young company that will go through all the maturity phases a company goes through: people, processes and tools (in this order). A tool-driven implementation is more likely to slow down in the long term.
Conclusion: Misconceptions are not new. History can teach us a lot about what we experience now. New technology and new concepts can be a breakthrough; however, implementing them at companies requires organizational change, and that has been the biggest challenge of the past 100 years.