Three weeks ago, we published our first PLM Global Green Alliance interview discussing the relationship between PLM and Sustainability with the main vendors. We talked with Darren West from SAP.
You can find the interview here: PLM and Sustainability: talking with SAP.
When we published the interview, it was also the moment a Russian dictator started the invasion of Ukraine, making it difficult for me to focus on our sustainability mission, as I have friends in both countries.
Now, three weeks later, with even more horrifying news coming from Ukraine, my thoughts are with the heroic people there, who are resisting and fighting for their existence. And not only in Ukraine: people suffering under other totalitarian regimes are fighting the same unfair battle.
Meanwhile, another battle that concerns us all might stall if the conflict in Ukraine continues. This decade requires us to focus on the transition towards a sustainable planet, with the focus on reducing carbon emissions. It is clear from the latest IPCC report, Impacts, Adaptation and Vulnerability, that we need to act.
Autodesk
Therefore, I am happy we can continue our discussion on PLM and Sustainability, this time with Autodesk. In the conversation with SAP, we discovered that SAP's strength lies in measuring the environmental impact of materials and production processes. However, most (environmental) impact-related decisions are already made during the engineering & design phase.
Autodesk is a well-known software company in the Design & Manufacturing industry and the AEC (Architecture, Engineering and Construction) industry.
Autodesk was open to sharing its sustainability activities with us. We spoke with Zoé Bezpalko, Autodesk's Sustainability Strategy Manager for the Design & Manufacturing industries, and Jon den Hartog, Product Manager for Autodesk's PDM and PLM solutions – the right people to talk to for our PLM Global Green Alliance.
Watch the 30-minute recording below, learn more about Autodesk's sustainability goals and offerings, and get motivated to (re)act.
The slides shown in this presentation can be downloaded HERE.
What we have learned
The interview showed that Autodesk is actively working on a sustainable future, both by acting internally and, even more importantly, by helping its customers to have a positive impact, using technologies like generative design and more environmentally friendly building projects. We talked about the renovation project of our famous Dutch Afsluitdijk.
The second observation is that Autodesk is working on empowering the designer to make better decisions regarding material usage or reuse. Life Cycle Assessment will become a required skill for engineers. As we discussed, this bottom-up user empowerment should be combined with a company strategy.
Want to learn more?
As you can see from the image shown in the recording, there is a lot to learn about Autodesk Forge. Click on the image for your favorite link, or open the PDF connected to the recording to support your sustainability plans.
And there is the link to the Autodesk sustainability hub: Autodesk.com/sustainability
Conclusion
It was a motivating session to see Autodesk acting on sustainability and encouraging its customers to act.
Companies and consumers need to be motivated and supported to pursue more sustainable products and activities. We look forward to coming back to Autodesk in a second round with the PLM vendors to discover and discuss progress.
This post is based on a mix of interactions I had over the last two weeks in my network, mainly on LinkedIn. First, I enjoyed the discussion that started around Yoann Maingon's post: Thoughts about PLM Business models. Yoann is quite seasoned in PLM, as you can see from his LinkedIn profile, and we have had interesting discussions in the past, recently about Ganister PLM, a new PLM system he is developing based on a flexible graph database.
Perhaps in that context, Yoann was exploring the various business models. Do you pay for the software (and maintenance), do you pay through a subscription, and what about a modular approach versus a full license for all the functionality? All these questions made me think about the various business models I have encountered and how hard it is for a customer to choose the optimal solution. And is there space for a new type of PLM? Is there space for free PLM? Some of my thoughts here:
PLM vendors need to be profitable
One of the most essential points to consider is that whatever PLM solution you aim to buy, make sure your PLM vendor has a profitable business model. Once you have started with a PLM solution, it is your company's IP that will be stored in this environment, and you do not want to change your PLM system every few years. Switching PLM systems would only be affordable if PLM systems stored their data in a standard format – I will share a more in-depth link under PLM and standards.
For the moment, you cannot state that PLM vendors endorse standards. None of the real PLM vendors have a standardized data model; perhaps the closest to standards is Eurostep, which has based its ShareAspace solution on top of the PLCS (ISO 10303) standard. However, ShareAspace is positioned more as a type of middleware, connecting OEMs/Owners/Operators and their suppliers to benefit from standardized connectivity.
Coming back to the statement: a PLM vendor being profitable is the first guarantee for the future of your company's data. The major PLM vendors are now profitable, as during a consolidation phase that started 15 years ago, many non-profitable PLM vendors disappeared. MatrixOne, Agile, and Eigner & Partner are the best-known companies that were bought for either their technology or their market share. In that context, you might also look at Onshape.
Would it be profitable as a separate company, or would investors give up? To survive, you need to be profitable, so giving software away for free is not a good sign (see the PLM for free paragraph), as a company needs continuity.
PLM startups
In the past 10 years, I have seen and evaluated several new PLM companies. None of them really changed the PLM paradigm; most of them were still focusing on being engineering collaboration tools. Several of these companies state in their vision that they are going to be the "Excel killer." We all know Excel has the best user interface and capabilities to manipulate a collection of metadata.
Very popular is the BOM in Excel, extracted from the CAD system (no need for an "expensive" PDM or PLM), or the BOM used to share with suppliers and stakeholders (ERP is too rigid, and purchasing does not work with PDM).
The challenge I see here is that these startups do not bring real new value. The cost of manipulating Excel files is a hidden cost, and companies relying on Excel communication are the type of companies that do not have a strategic point of view. This is typical for small and medium businesses, where execution ("let's do it") gets all the attention.
PLM startups often collect investors' money because they promise to kill Excel, but is Excel the real problem? Modern PLM is about data sharing, which is an attitude change, not necessarily a technology change from Excel tables to (cloud) shared tables. However, will one of these "new Excel killer" PLMs be disruptive? I don't think so.
PLM disruption?
A week ago, I read an interview with Clayton Christensen (thanks, Hakan Karden), which I shared on LinkedIn. Clayton Christensen is the father of the Disruptive Innovation theory, and I have cited him several times in my blogs. His theory is, in my opinion, fundamental to understanding how traditional businesses can be disrupted. The interview took place shortly before he died, at the age of 67, due to complications caused by leukemia.
A favorite part of this interview is where he restates what Disruptive Innovation really is, as we often talk about disruption without understanding the context, just echoing other people:
Christensen: Disruptive innovation describes a process by which a product or service powered by a technology enabler initially takes root in simple applications at the low end of a market — typically by being less expensive and more accessible — and then relentlessly moves upmarket, eventually displacing established competitors. Disruptive innovations are not breakthrough innovations or “ambitious upstarts” that dramatically alter how business is done but, rather, consist of products and services that are simple, accessible, and affordable. These products and services often appear modest at their outset but over time have the potential to transform an industry.
Many PLM startups dream of and position themselves as the new disruptor. Will they succeed? I do not believe so if they only focus on replacing Excel; a different paradigm is needed. Voice control and analysis, perhaps ("Hey PLM, if I change Part XYZ, what will be affected?")?
This would be disruptive and open new options. I think PLM startups should focus here if they want my investment money.
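Such a "what will be affected" question is essentially a where-used traversal over a product graph – the kind of structure a graph database, like the one behind Ganister, makes natural. Below is a minimal sketch in Python of that traversal; the part names and helper functions are hypothetical, just to illustrate the idea.

```python
from collections import defaultdict, deque

# Minimal product graph: edges point from a part to the items that
# depend on it (parent assemblies, drawings, documents).
where_used = defaultdict(list)

def add_usage(part: str, used_in: str) -> None:
    """Record that 'part' is used in 'used_in'."""
    where_used[part].append(used_in)

def impact_of_change(part: str) -> list:
    """Walk the where-used graph to find everything affected by a change."""
    affected, queue, seen = [], deque([part]), {part}
    while queue:
        for parent in where_used[queue.popleft()]:
            if parent not in seen:
                seen.add(parent)
                affected.append(parent)
                queue.append(parent)
    return affected

add_usage("Part XYZ", "Sub-assembly A")
add_usage("Sub-assembly A", "Top-assembly T")
add_usage("Part XYZ", "Drawing D-001")

print(impact_of_change("Part XYZ"))
# ['Sub-assembly A', 'Drawing D-001', 'Top-assembly T']
```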
PLM for free?
There are voices saying PLM should be free, in analogy to software management and collaboration tools. There are so many open-source software management tools, so why not use them for PLM? I think there are two issues here:
- PLM data is not like software data. A lot of PLM data is based on design models (3D CAD/simulation), which is different from software. Designs are often not as modular as software, for various reasons. Companies want their products to be modular, but do they have the time and resources to reinvent their existing products? For software, these costs are much lower, as it is only a brain exercise. For hardware, the impact is significant, which brings me to the second point.
- The cost of change for hardware is entirely different compared to software. Changing software does not have an impact on existing stock or suppliers and can therefore be implemented once tested for its purpose. A hardware change impacts the existing production process. Do we first use up the old parts before introducing the change, or do we accept the cost of scrap? Are our supply chain and production tools ready to deliver continuity for the new version? Hardware changes are costly, and you want to avoid them. Software changes are cheap; therefore, design your products to be configurable through software (for example, Tesla's software controlling which features are enabled).
Now imagine that, with enough funding, you could provide a PLM for free. For ease of deployment, this would very likely be a cloud offering: easy and scalable. However, all your IP is in that cloud too. Let's imagine the cloud is safer than on-premise, so it does not matter in which country your data is hosted (does it?).
Next, after five years, the "free" PLM provider starts asking for a small service fee, as the promised ROI of the model has not delivered enough value and the shareholders become anxious. Of course, you do not like to pay the fee. However, where is your data, and what happens when you do not pay?
If the PLM provider switches you off, you are without your IP. If you ask the PLM provider to hand over your data, what will you get? A blob of XML files – anything you can use?
In general, this is a challenge for all cloud solutions.
- What if you want to stop your subscription?
- What is the allowed exit strategy?
Here I believe customers should ask for clarity, and perhaps these questions will lead to a renewed understanding that we need standards.
PLM and standards
We had a vivid discussion in the blogging community in September last year. You can read more related to this topic in my post: PLM and the need for standards, which describes the aspects of lock-in and the need for openness.
Finally, a remark related to the PLM acronym. Another interesting discussion started around Joe Barkai's post: Why I do not do PLM. Read the comments and the various viewpoints on PLM there. It is clear that the word PLM unites us all; however, the interpretations differ.
If someone in the street asks me what my profession is, I never mention that I do PLM. I say: "I assist mainly manufacturing companies in redesigning their business processes using best practices and modern digital technologies." The focus is on the business value, not on the ultimate definition of PLM.
Conclusion
There are many business aspects related to PLM to consider. Yoann Maingon's post started the thinking process, and we ended up with the definition of PLM. It all illustrates that being involved in PLM is never a boring journey. I am curious to learn about your journey and where we meet.
To avoid software geeks getting curious about the title: in this context, ALM means Asset Lifecycle Management. In 2008, I was active for SmarTeam, promoting PLM concepts relevant to Asset Lifecycle Management. The focus was on PLM being complementary to asset operation management (EAM – Enterprise Asset Management, and MRO – Maintenance, Repair and Overhaul).
This topic has become topical for me again in the past two months, having discussed and seen (at PDT) the concepts of a model-based approach for assets and constructions. PLM, ALM, and BIM converge conceptually. Every year, I give a one-day update from the field for students doing a master's in PLM & BIM on top of their engineering/architectural background. Five years ago, there was no mention of BIM; now the ratio of BIM-oriented students has become significant. For me, it is always great to see young students willing to learn PLM or BIM on top of their own skillset. Read more about this particular master class in French by clicking on the logo to the left.
In 2012, I started to explain PLM benefits to EPC companies (Engineering, Procurement, Construction), targeting a more profitable and efficient delivery of their constructions (oil platforms, plants, buildings, infrastructure). The simplified reasoning behind using PLM was related to more efficient, higher-quality multidisciplinary collaboration, reducing costly fixes during construction, and smoothing the intensive process of data handover.
In the process industry, standards like ISO 15926 (process industry) and ISO 19650 (BIM – mainly in the UK) became more and more crucial. At that time, it was difficult to convince companies to focus on the horizontally integrated process instead of dedicated, disconnected tools. Meanwhile, this has changed, thanks to the Digital Twin hype. Let's have a look.
PLM and ALM
The initial value of using PLM concepts complementary to MRO systems came from the fact that MRO systems mainly focus on plant operations. You could compare these systems with ERP systems for manufacturing companies, focusing on execution and continuous operation. Scheduled maintenance and inspections are also driven by the MRO system. Typical MRO systems are Maximo and SAP PM. PLM could deliver configuration management, linking the design intent to the physical implementation, and therefore provide higher data quality, visibility, and traceability of the asset history.
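To make that configuration-management link concrete, here is a minimal sketch (hypothetical names and fields, Python just for illustration) of how the as-designed definition owned by engineering connects to the physical asset the MRO system operates:

```python
from dataclasses import dataclass, field

@dataclass
class DesignTag:               # as-designed, owned by engineering (PLM)
    tag_id: str                # functional position, e.g. "P-101"
    revision: str              # approved design revision
    specification: str

@dataclass
class PhysicalAsset:           # as-installed, operated via the MRO system
    serial_number: str
    installed_at: str          # functional position it fulfills
    design_revision: str       # which design revision it implements
    history: list = field(default_factory=list)

def check_configuration(tag: DesignTag, asset: PhysicalAsset) -> bool:
    """Flag mismatches between design intent and the installed asset."""
    return tag.tag_id == asset.installed_at and tag.revision == asset.design_revision

pump_design = DesignTag("P-101", "B", "centrifugal pump, 75 kW")
pump = PhysicalAsset("SN-4711", "P-101", "A", history=["installed 2009"])
print(check_configuration(pump_design, pump))  # False: the field lags design revision B
```

This is exactly the visibility an MRO system alone rarely provides: which design revision is actually running in the field.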
In 2010, I shared these concepts in two posts: Asset Lifecycle Management using a PLM-system and PLM for Asset Lifecycle Management and Asset Development, based on lessons learned with some (nuclear) plant owners/operators. They started to discover the need for configuration management to ensure data quality for operations. In 2010-2014, the business case for using PLM complementary to MRO was data quality and therefore reduced downtime when executing large maintenance programs (dependencies between the individual projects were not visible without PLM).
In MRO-systems, like in ERP-systems, the data for execution is based on information coming from various engineering sources – specifications, PFDs, P&IDs. Questions owner/operators ask themselves are:
- What are the designed operational settings?
- Are the asset parameters currently running as designed?
- What is the optimized maintenance period?
- Can we stretch maintenance intervals?
- Can we reduce inspections?
- Can we reduce downtime for maintenance and overhaul?
- What about predictive maintenance?
Most of these questions are answered by experts who use their tacit knowledge and experience to give the best answers so far. And when the answers were wrong, they were accepted as new learning points. Next time we won't make this mistake, and the experts become even more knowledgeable.
Now, these questions could be answered if you can model your asset in a virtual environment. In the virtual world, you would use simulation models, logical models, and 3D models to describe the asset. This is where Model-Based Systems Engineering practices are used. However, these models need to be calibrated based on reality, and that is where IoT and asset operation monitoring come in, connecting physical behavior with virtually predicted behavior. You can read more about this relationship in my post: Will MBSE be the new PLM instead of IoT?
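As a simple illustration of that calibration loop, here is a minimal sketch (a hypothetical one-parameter toy model and made-up sensor values) of how measured IoT data can tune a virtual model until it tracks the physical asset:

```python
import statistics

def predicted_temperature(load: float, ambient: float, k: float = 0.8) -> float:
    """Toy virtual model: temperature rise proportional to load."""
    return ambient + k * load

# (load, ambient, measured temperature) triples from IoT sensors
measurements = [
    (10.0, 20.0, 29.1),
    (20.0, 20.0, 38.5),
    (30.0, 20.0, 47.2),
]

# Calibrate the coefficient k so the virtual model matches physical behavior.
k_calibrated = statistics.mean(
    (measured - ambient) / load for load, ambient, measured in measurements
)

print(f"calibrated k = {k_calibrated:.3f}")
for load, ambient, measured in measurements:
    twin = predicted_temperature(load, ambient, k_calibrated)
    print(f"load {load}: measured {measured}, twin predicts {twin:.1f}")
```

Real calibration uses proper parameter estimation over many signals, but the principle is the same: the physical measurements continuously correct the virtual prediction.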
PLM and BIM
In 2014, I started to discuss PLM concepts with EPC companies (Engineering, Procurement, and Construction), mainly in the Oil & Gas industry. Here, excellent asset development tools (AVEVA, Intergraph, Bentley) are the standard, and as the purpose of an EPC company is to deliver a plant or platform, each software tool has its purpose, and there is no lifecycle strategy. The value PLM could bring was providing a program overview (complementary to Primavera), standardization, multidisciplinary coordination, and visibility across projects to capture knowledge.
Most of the time, the EPC companies did not see the value of optimizing themselves, as this was accepted in the process, even though the cost of poor quality (fixing during construction/commissioning) was absurd (10-20% of the project budget). Cultural change – think longer instead of fixing later – was hard to explain. In the end, the EPC was not responsible for operations, so why bother that much?
My blog posts PLM for all Industries and 2014 – the year that the construction industry did not discover PLM illustrate the challenge at that time. None of the EPCs and construction companies had the vision that improving collaboration based on information continuity (not yet data-driven) could bring significant benefits, despite their relatively low profit margins (1-3% is considered excellent). Breaking the silos was too hard.
Two recent trends, however, have changed the status quo.
First of all, more and more, the owner/operator does not want to be responsible for the maintenance and operations of the asset. The typical EPC companies have now become DBO companies (Design, Build and Operate), which requires lifecycle thinking, as most of the costs of an asset occur during its maintenance and operation phase.
Advanced thinking (read: (Model-Based) Systems Engineering) can help these companies shift their focus towards a more sustainable design of the asset for the future and get rewarded for that. In the old EPC model, the target was "just" to deliver as specified.
A second significant trend is the availability of cloud infrastructure for the construction world. A cloud infrastructure does not require considerable investment from the stakeholders in a construction project. By introducing BIM in a common data environment (CDE), an infrastructure comparable to PLM is created, and the Maintenance-and-Operate stakeholder will likely be eager to have the full virtual definition there for the future.
Read more about BIM and CDE for example, here: CDE – strategic BIM process tool.
Of course, the technology and standards for collaboration are there. Now it is up to the stakeholders involved to develop new collaboration skills (learn or hire) and implement them through new ways of working. A learning process can never be pushed by a big bang, so make sure your company operates in two modes while learning.
As I mentioned, the Maintenance-and-Operate stakeholders, or in traditional cases the Owner/Operators, are incredibly interested in a well-defined virtual model of the asset. This allows them to analyze and simulate fixes and enhancements for the future with an optimal result. Again, we are talking about a digital twin of the asset here.
Conclusion
Even though the digital twin is at the top of the Gartner hype cycle, it has already become a vital principle to implement, in particular for substantial, critical assets. For these precious assets, minor inefficiencies in data continuity can still be afforded while learning. From the moment companies have established digital continuity between their virtual and physical assets, the Digital Twin concept can also become profitable (and required) for other industries, in particular when these companies want to deliver their products as a service.
Note: I have been talking a lot this year about the challenges of digital transformation applied to PLM in particular. During PI PLMx London 2020, on February 3 and 4, I will lead a Think Tank session related to the challenge of connecting your PLM transformation to your executives' vision (and budget). See you there?
As a genuine Dutchman, I was able to spend time last month in the Netherlands, where I attended two interesting events: BIMopen 2015, where I was invited to speak about what BIM could learn from PLM (see the Dutch review here), and a second event, Where engineering meets supply chain, organized by two startup companies located in Yes!Delft, an incubator working close to the Technical University of Delft (Dutch announcement here).
Two different worlds, and, as I realized later, they potentially have the same future. So let's see what happened.
BIMopen 2015
BIMopen 2015 had the theme From Design to Operations, and the idea of the conference was to bring together construction companies (the builders) and facility managers (the operators) to discuss the business value they see in BIM.
First, I have to mention that BIM is a confusing TLA, like PLM: there are many interpretations of what BIM means. For me, when I talk about BIM, I mean Building Information Management. In a narrower meaning, BIM is often considered a Building Information Model – a model that contains all multidisciplinary information. The latter definition does not deal with typical lifecycle operations, like change management, planning, and execution.
The BIMopen conference started with Ellen Joyce Dijkema from BDO consultants, who addressed the cost of failure and the concepts of lean thinking. The high cost of failure is known and accepted in the construction industry, where at the end of the year profitability can be 1% of turnover (with a margin of +/- 3% – so being profitable is hard).
Lean thinking requires a cultural change, which according to Ellen Joyce is an enormous challenge: according to a study by Prof. Dr. A. Cozijnsen, there is only a 19% chance that cultural change will be successful, compared to a 40% chance of success for new technology and a 30% chance for new work processes.
It is clear that changing culture is difficult, and in the construction industry it might be even harder. I had the feeling a large part of the audience did not grasp the opportunity or could not find a way to apply it to their own world.
My presentation about what BIM could learn from PLM made a similar point. Construction companies have to spend more time on upfront thinking instead of fixing it later (costly). In addition, thinking about the whole lifecycle of a construction, including operations, can bring substantial revenue for the owner or operator of a construction. Where traditional manufacturing companies take the entire lifecycle into account, this is still not understood in the construction industry.
This point was illustrated by the fact that there was only one person in the audience whose primary interest was to learn what BIM could contribute to his job as facility manager, and halfway through the conference, he was still not convinced BIM had any value for him.
A significant challenge for the construction industry is that there is no end-to-end ownership of data; a single company responsible for all the relevant and needed data does not exist. Ownership of data can result in legal responsibility in the end (if you know what to ask for), and in a risk-shifting business like construction, companies try to avoid responsibility for anything that is not directly related to their primary activities.
Some larger companies at the conference, like Ballast Nedam and HFB, talked about the need for a centralized database to collect all the data related to a construction (project). They were building these systems themselves, probably because they were not aware of PLM systems, or they did not see past the initial complexity of a PLM system and therefore decided a standard system would not be enough.
I believe this is short-term thinking: with a custom system you get quick results and user acceptance (it works the way the user asks for), but custom systems have always become a blockage after 10-15 years, as they were developed with the mindset of that time.
If you want to learn more about my thoughts, have a look at 2014 – the year the construction industry did not discover PLM. I will write a new post at the end of the year with some positive trends. Construction companies are starting to realize the benefits of a centralized, data-driven environment instead of shifting documents and risks.
The cloud might be an option they are looking for. Which brings me to the second event.
Engineering meets Supply Chain
This was more of an interactive workshop/conference, where two startups, KE-works and TradeCloud, illustrated the individual value of their solutions and how they could work in an integrated way. I had been in touch with KE-works before because they are an example of the future trend: platform thinking. Instead of having one (or two) large enterprise system(s), the future is about connecting data-centric services, where most of them can run in the cloud for scalability and performance.
KE-works provides a real-time workflow for engineering teams based on knowledge rules. Their solution runs in the cloud but connects to the systems used by their customers. One of their clients, Fokker Elmo, explained how they want to speed up their delivery process by investing in a knowledge library using KE-works knowledge rules (an approach the construction industry could apply too).
In general, if you look at what KE-works does, it is complementary to what PLM systems or platforms do. They add the rules for the flow of data, where PLM systems are more static and depend on predefined processes.
TradeCloud provides a real-time platform for the supply chain, connecting purchasers and vendors through a data-driven approach instead of exchanging files and emails. TradeCloud is another example of a collection of dedicated services, targeting, in this case, the bottom of the market. TradeCloud connects to the purchaser's ERP and can also connect to the vendor's system through web services.
The Cadac Group, a large Dutch Autodesk solution provider, also showed their web-services-based solution connecting Autodesk Vault with TradeCloud to make sure the right drawings are available. The name of their solution, the "Cadac Organice Vault TradeCloud Adapter", is more complicated than the solution itself.
What I saw that afternoon was three solution providers connected through the cloud and web services, supporting a part of a company's business flow. I could imagine that adding services from other companies like Onshape (CAD in the cloud), Kimonex (BOM management for product design in the cloud), and probably 20 more candidates could already deliver a simplified business flow in an organization without having a single, large enterprise system in place that connects it all.
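To make this connected-services idea tangible, here is a minimal sketch of such a flow in Python. All endpoints, payloads, and identifiers are hypothetical – this is not the actual TradeCloud or Vault API, just the pattern of chaining cloud services through web services:

```python
import requests  # third-party HTTP library (pip install requests)

# Hypothetical service endpoints.
ERP_API = "https://erp.example.com/api"
VAULT_API = "https://vault.example.com/api"
SUPPLY_API = "https://supplyplatform.example.com/api"

def publish_purchase_order(order_id: str) -> None:
    # 1. Read the purchase order from the purchaser's ERP.
    order = requests.get(f"{ERP_API}/orders/{order_id}", timeout=10).json()

    # 2. Fetch the matching released drawing from the engineering vault,
    #    so the supplier always sees the right revision.
    drawing = requests.get(
        f"{VAULT_API}/drawings/{order['part_number']}/released", timeout=10
    ).json()

    # 3. Push the order plus drawing reference to the supply-chain platform,
    #    replacing the traditional email with attachments.
    requests.post(
        f"{SUPPLY_API}/orders",
        json={"order": order, "drawing_url": drawing["url"]},
        timeout=10,
    ).raise_for_status()

publish_purchase_order("PO-2015-0042")
```

Each service stays dedicated to its own job; the business flow emerges from the connections, not from one enterprise system.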
The Future
I believe this is the future and potentially a breakthrough for the construction industry. As the connections between the stakeholders can vary per project, having a configurable combination of business services supported by a cloud infrastructure enables an efficient flow of data.
As a PLM expert, you might think all these startups with their solutions are not good enough for the real world of PLM. And currently, they are not – I agree. However, disruption always comes unnoticed. I wrote about it in 2012 (The Innovator's Dilemma and PLM).
Conclusion
Innovation happens when you meet people, observe and associate in areas outside your day-to-day business. For me, these two events connected some of the dots for the future. What do you think? Will a business process based on connected services become the future?
Sometimes we have to study carefully to see patterns. Have a look here at what is possible according to some scientists (click on the picture for the article).
The past three weeks, I had time to observe some PLM vendors' marketing messages (Autodesk being the major newbie). Some of these messages led to discussions in blogs or (LinkedIn) forums – always a good moment to smile and think about reality.
In addition, the sessions from PLM Innovation 2012 became available to the attendees (thanks, MarketKey – good quality). I had the chance to see the sessions I missed. On my wish list was "The Future of PLM Business Models", moderated by Oleg, where, according to Oleg, some interesting viewpoints came up. This related to my post where I mentioned the various definitions of PLM.
All the above inspired me to write this post, which made me realize we keep on pushing misconceptions around PLM into our customers' minds, with the main goal of differentiating.
I will address the following four misconceptions. The last one is probably not a surprise, hence its last position, yet it is still sometimes taken for granted.
- PLM = PLM
- On the cloud = Open and Upgradeable
- Data = Process Support
- Marketing = Reality
1. PLM = PLM
It is interesting to observe that the definition of PLM is becoming more and more a marketing term instead of a common definition that applies to all.
Let me try to formulate again a very generic definition that captures most of what PLM vendors aim to do.
PLM is about connecting and sharing the company’s intellectual property through the whole product lifecycle. This includes knowledge created at the concept phase going through the whole lifecycle till a product is serviced in the field or decommissioned.
Experiences from the field (service/customer/market input) serve again as input for the other lifecycle phases to deliver a better or more innovative product.
Innovation is an iterative process. PLM is not only about storing data; it also covers the processes of managing the data, especially the change processes. Sharing data is not easy. It requires a different mindset: data is not only created for personal or departmental usage but should also be findable and extendable by other roles in the organization. This all makes PLM a serious implementation, as aligning people is a business change, not an IT-driven approach.
This (too long) high-level PLM definition does not imply that you cannot do PLM without a PLM system. You might also have a collection of tools that together provide complete coverage of the PLM needs.
Oleg talks about DIY (Do It Yourself) PLM, and I have seen examples of Excel spreadsheets managing Excel spreadsheets and email archives. The challenge I see with this type of PLM implementation is that after several years, it is extremely difficult for a company to change. Possible reasons: the initial gurus no longer work for the company, and new employees need years of experience to find and interpret the right data.
A quick and simple solution can become a burden in the long term if you analyze the possible risks.
Where the early years of PLM were mainly a Dassault Systèmes, Siemens, and PTC-driven approach with deep CAD integrations, in later years other companies, like Aras and now Autodesk, started to shift the focus from classical PLM towards managing enterprise metadata. SAP PLM offers a similar approach. Deep CAD integrations are the most complex parts of PLM, and by avoiding them, you can claim your system is easier to implement, etc.
A single version of the truth is a fancy PLM expression. It would be nice if this were also valid for the definition of PLM. The PLM Innovation 2012 session on the future of PLM business models demonstrated that the vendors in the panel discussion had completely different opinions about PLM. So how can people inside their company explain to management and others why they need PLM, and which PLM they have in mind?
2. On the cloud = Open and Upgradeable
During the panel discussion, Grant Rochelle from Autodesk mentioned the simplicity of their software and how easily it will be upgradeable in the future. He also referred to Salesforce.com as a proof point: they provide online updates of the software, without the customer having to do anything.
The above statement is true as long as you keep your business coverage simple and do not anticipate changes in the future. Let me share an analogy with SmarTeam and how it started in 1995.
At that time, SmarTeam was insanely configurable. The Data Model Wizard contained several PDM templates, and within hours you could create a company-specific data model. A person without IT skills could add attributes, data types, anything they wanted, and build the application – almost the same as Autodesk PLM 360. The only difference: SmarTeam was not in the cloud but running on Windows, a revolution at that time, as all serious PDM systems were Unix-based.
The complexity came, however, when SmarTeam started to integrate deeply with CAD systems. These integrations created the need for a more standardized data model per CAD system, and as SmarTeam R&D was not aware of each and every customer's implementation, it became hard to define a common business logic in the data (and to remain easily upgradeable).
I foresee similar issues with the new cloud-based PLM systems. They seem very easy to implement (add what you want – it is easy). As long as you do not integrate with other systems, it remains safe. Integrating with other and future systems requires either a common data definition (which most vendors do not like) or specific integrations, with the associated cost of upgrading. A sketch of this tension follows below.
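Here is a minimal sketch of that tension (hypothetical field names, Python just for illustration): a freely configurable data model lets every company shape its own schema, while a deep integration expects fixed, predictable fields.

```python
# "Data model wizard" style: every company configures its own item schema.
company_a_item = {"Name": "Bracket", "Material": "Steel"}
company_b_item = {"Title": "Bracket", "Matl": "Steel", "Coating": "Zinc"}

def cad_integration_checkin(item: dict) -> str:
    """A deep CAD integration relies on fixed, predictable fields."""
    return f"Checked in {item['Name']} ({item['Material']})"

print(cad_integration_checkin(company_a_item))   # works as designed
try:
    print(cad_integration_checkin(company_b_item))
except KeyError as missing:
    # Breaks for company B, which renamed the fields during configuration.
    print(f"Integration broken: expected field {missing}")
```

The flexibility that makes the first demo so fast is exactly what makes later integrations and upgrades expensive.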
In the beginning, everything is always possible with a well-defined system. But be aware: looking back in history, every 10 years a disruptive wave comes in, changing the scope and upgradability.
And to challenge the cloud-based PLM vendors: in the generic definition of PLM that I shared above, PLM also integrates design data.
3. Data = Process Support
Another misconception, which originates from the beginning of PLM, is the idea that once you have support for specific data in your system, you support the process.
The first example: items defined in ERP. When engineers started to use a PDM system and started to define new items, there were challenges. I had many discussions with IT departments that did not need or want items in PDM. ERP was the source for an item, and when a designer needed a new item, (s)he had to create it in ERP, so there would be a single definition of the item.
Or the designer had to request a new item number from the ERP system. And "please do not request numbers too often, as we do not want to waste them" was the message.
Ten years later, this looks like a joke, as most companies have an integrated PDM/ERP process and understand that the initial definition of a new item comes from PDM, and at a certain stage, the matured item is shared with (and completed by) the ERP system. It is clear that the most efficient way to create a new item is through PLM, as the virtual definition (specs/CAD data) also resides there, and the information is handled in that context.
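As a minimal sketch of that integrated flow (states and fields are hypothetical, just to illustrate the handover): the item starts its life in PDM and is only completed by ERP once it has matured.

```python
from enum import Enum

class State(Enum):
    PRELIMINARY = "preliminary"   # being defined in PDM, in design context
    RELEASED = "released"         # matured, ready to share with ERP

class Item:
    def __init__(self, number: str, description: str):
        self.number = number          # created in PDM, not requested from ERP
        self.description = description
        self.state = State.PRELIMINARY
        self.erp_data: dict = {}      # filled in by ERP after the handover

    def release(self) -> None:
        self.state = State.RELEASED

def share_with_erp(item: Item) -> None:
    """Hand the matured item over to ERP, which completes it."""
    if item.state is not State.RELEASED:
        raise ValueError(f"{item.number} has not matured yet")
    item.erp_data = {"lead_time_days": 10, "standard_cost": 4.25}  # ERP-owned

bracket = Item("100-001", "mounting bracket")
bracket.release()
share_with_erp(bracket)
print(bracket.number, bracket.state.value, bracket.erp_data)
```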
A second, more current example is the fact that compliance is often handled in ERP. It is correct that when you manufacture a product for a specific target market, you need to have the compliance information available.
However, would you do this in your ERP system, where you are late (almost at the end of the design lifecycle), or is it more logical to verify and check compliance at all times during your design stages? The process will work much more efficiently, and with a lower cost of change, when done in PLM, but most companies still see ERP as their primary IT system and PLM as an engineering tool.
Finally, on this topic, a remark for the simplified PLM vendors. Having the ability to store, for example, requirements in your system does not mean you support a complete requirements management process. It is also about the change and validation of requirements, which should be integrated for the relevant roles during product definition (often CAD) and validation. As long as the data is disconnected, there is not such a big advantage compared to Excel – see the sketch below.
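A minimal sketch of the difference between storing requirements and supporting a requirements process (all names are hypothetical): each requirement carries its change state and validation links, so a change immediately shows what needs re-validation – something a disconnected Excel table cannot do.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    revision: int = 1
    validated_by: list = field(default_factory=list)  # links to tests/designs

    def change(self, new_text: str) -> list:
        """A change invalidates earlier validations - the process part."""
        self.text = new_text
        self.revision += 1
        outdated, self.validated_by = self.validated_by, []
        return outdated  # the validations that must be redone

req = Requirement("REQ-012", "Max operating temperature 70 C")
req.validated_by += ["TEST-034", "CAD-ASSY-007"]
print(req.change("Max operating temperature 85 C"))  # ['TEST-034', 'CAD-ASSY-007']
```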
4. Marketing = Reality
In the session on the future of PLM business models, Oleg showed a slide with the functional architectures of the major PLM vendors. In the diagram, everything seems to be connected as a single system, but in reality, this is usually not the case.
As certain components/technologies are acquired, they provide the process coverage, and only in the future can you imagine them working integrated. You cannot blame marketing for this, as their role is to position their products in the most appealing way so customers will buy them. Without marketing, perhaps no one would buy a PLM system after understanding all the details.
Autodesk, as a newcomer in PLM, has a strong background in marketing. This is understandable, as, similar to Microsoft, their main revenue comes from selling a large volume of products, where the classical PLM vendors often combine software with services and business change – and therefore have a different price point.
When Autodesk introduced AutoCAD in the eighties, it was a simple, open 2D CAD environment able to run on a PC. Autodesk's statement at that time: "We provide 80 percent of the functionality for 20 percent of the price."
Does this sound familiar nowadays?
As AutoCAD was a basic platform allowing customers and resellers to build their solutions on top of it, it became Autodesk's mid-market success.
With Autodesk PLM 360, although the same logic seems to make sense, I believe the challenge is not the flexible platform. The challenge lies in the future, when people want to do more complex things with the system, like integrations with design or enterprise collaboration.
At that time you need people who can specify the change, guide the change and implement the change. And this is usually not a DIY job.
Autodesk is still learning to find the right PLM messages, I noticed recently. When attending the Autodesk PLM session during PLM Innovation 2012 (end of February), one of their launching customers, ElectronVault, presented their implementation – it took only two weeks!!! Incredible.
However, in Rob Cohee's blog post at the end of March, he mentions ElectronVault again. Quote:
ElectronVault was searching for something like this for over two years and after 6 weeks they have implemented Project Management, EBOM, MBOM, and starting on their APQP project. Six Weeks!!!
As you see, four weeks later, the incredible two weeks have become six weeks, and again everything is implemented. Still incredible, and I am looking forward to meeting ElectronVault in the future, as I believe they are a typical young company that will go through all the maturity phases a company goes through: people, processes, and tools (in this order). A tool-driven implementation is more likely to slow down in the long term.
Conclusion: Misconceptions are not new. History can teach us a lot about what we experience now. New technologies and new concepts can be a breakthrough. However, implementing them at companies requires organizational change, and this has been the biggest challenge for the past 100 years.