As a genuine Dutchman, I was able to spend time in the Netherlands last month, where I attended two interesting events: BIMOpen2015, where I was invited to speak about what BIM could learn from PLM (see the Dutch review here), and a second event, Where Engineering Meets Supply Chain, organized by two startup companies located in Yes!Delft, an incubator working closely with the Technical University of Delft (Dutch announcement here).
Two different worlds, and I realized later that they potentially share the same future. So let's see what happened.
BIMopen 2015 had the theme From Design to Operations, and the idea of the conference was to bring together construction companies (the builders) and facility managers (the operators) to discuss the business value they see in BIM.
First, I have to mention that BIM, like PLM, is a confusing TLA, with many interpretations of what it means. For me, BIM stands for Building Information Management. In a narrower sense, BIM is often considered a Building Information Model – a model that contains all multidisciplinary information. That last definition does not deal with typical lifecycle operations, like change management, planning and execution.
The BIMopen conference started with Ellen Joyce Dijkema from BDO consultants, who addressed the cost of failure and the concepts of lean thinking. The high cost of failure is known and accepted in the construction industry, where at the end of the year profitability can be 1 % of turnover (with a margin of +/- 3 % – so being profitable is hard).
Lean thinking requires a cultural change, which according to Ellen Joyce is an enormous challenge: according to a study done by Prof. Dr. A. Cozijnsen, there is only a 19 % chance that such a change will be successful, compared to a 40 % chance of success for new technology and a 30 % chance for new work processes.
It is clear that changing culture is difficult, and in the construction industry it might be even harder. I had the feeling that a large part of the audience did not grasp the opportunity or could not find a way to apply it to their own world.
My presentation about what BIM could learn from PLM made a similar point: construction companies have to spend more time on upfront thinking instead of fixing problems later (which is costly). In addition, thinking about the whole lifecycle of a construction, including operations, can bring substantial revenue for the owner or operator of a construction. Where traditional manufacturing companies take the entire lifecycle into account, this is still not understood in the construction industry.
This point was illustrated by the fact that there was only one person in the audience whose primary interest was to learn what BIM could contribute to his job as a facility manager – and halfway through the conference he was still not convinced BIM had any value for him.
A significant challenge for the construction industry is that there is no end-to-end ownership of data: no single company is responsible for all the relevant and needed data. Ownership of data can ultimately result in legal responsibility (if you know what to ask for), and in a risk-shifting business like the construction industry, companies try to avoid responsibility for anything that is not directly related to their primary activities.
During the conference, some larger companies, like Ballast Nedam and HFB, talked about the need for a centralized database to collect all the data related to a construction (project). They were building these systems themselves, probably because they were not aware of PLM systems, or could not see past the initial complexity of a PLM system and therefore decided a standard system would not be enough.
I believe this is short-term thinking. With a custom system you get quick results and user acceptance (it works the way the user asks for it), but custom systems have always become a blockage for the future after 10-15 years, as they were developed with the mindset of that time.
If you want to learn more about my thoughts, have a look at 2014 the year the construction industry did not discover PLM. I will write a new post at the end of the year with some positive trends, as construction companies are starting to realize the benefits of a centralized, data-driven environment instead of shifting documents and risks.
The cloud might be the option they are looking for – which brings me to the second event.
Engineering meets Supply Chain
This was more of an interactive workshop/conference, where two startups, KE-works and TradeCloud, illustrated the individual value of their solutions and how they could work in an integrated way. I had been in touch with KE-works before, because they are an example of the future trend: platform thinking. Instead of having one (or two) large enterprise system(s), the future is about connecting data-centric services, where most of them can run in the cloud for scalability and performance.
KE-works provides a real-time workflow for engineering teams based on knowledge rules. Their solution runs in the cloud but connects to the systems used by their customers. One of their clients, Fokker Elmo, explained how they want to speed up their delivery process by investing in a knowledge library based on KE-works knowledge rules (an approach the construction industry could apply too).
In general, if you look at what KE-works does, it is complementary to what PLM systems or platforms do. They add the rules for the flow of data, where PLM systems are more static and depend on predefined processes.
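To make the contrast concrete, here is a minimal, hypothetical sketch of what rule-driven flow could look like – this is my own illustration, not KE-works' actual implementation or API. Instead of a fixed task sequence, each task declares which data it needs, and the engine releases whatever becomes possible as soon as that data arrives:

```python
# Hypothetical sketch of rule-driven task release (not a real vendor API).
# A task becomes available as soon as the data it depends on exists,
# instead of waiting for its slot in a predefined workflow sequence.

class RuleEngine:
    def __init__(self, rules):
        # rules: task name -> set of data keys the task requires
        self.rules = rules
        self.data = {}
        self.done = set()

    def publish(self, key, value):
        """Register a new piece of engineering data."""
        self.data[key] = value

    def ready_tasks(self):
        """All not-yet-done tasks whose required data is available."""
        return [task for task, needs in self.rules.items()
                if task not in self.done and needs <= self.data.keys()]

engine = RuleEngine({
    "route_wiring": {"harness_layout"},
    "check_weight": {"harness_layout", "connector_list"},
})

engine.publish("harness_layout", "layout-v1")
print(engine.ready_tasks())          # only route_wiring can start yet
engine.publish("connector_list", ["X1", "X2"])
print(sorted(engine.ready_tasks()))  # now both tasks are released
```

The point of the sketch: the "process" emerges from the rules and the available data, rather than being modeled upfront as a static workflow – which is exactly where it complements a classical PLM system.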
TradeCloud provides a real-time platform for the supply chain, connecting purchasers and vendors through a data-driven approach instead of exchanging files and emails. TradeCloud is another example of a collection of dedicated services, targeting, in this case, the bottom of the market. TradeCloud connects to the purchaser's ERP and can also connect to the vendor's system through web services.
The CADAC group, a large Dutch Autodesk solution provider, also showed their web-services-based solution, connecting Autodesk Vault with TradeCloud to make sure the right drawings are available. The name of their solution, the "Cadac Organice Vault TradeCloud Adapter", is more complicated than the solution itself.
What I saw that afternoon were three solution providers connected through the cloud and web services, supporting a part of a company's business flow. I can imagine that adding services from other companies, like Onshape (CAD in the cloud), Kimonex (BOM management for product design in the cloud) and probably 20 more candidates, could already build and deliver a simplified business flow in an organization without a single, large enterprise system in place connecting it all.
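The adapter idea can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration – the two "services" are stubbed in memory, where in reality they would be REST endpoints of products like Vault and TradeCloud called over HTTPS:

```python
# Hypothetical sketch of the adapter pattern shown that afternoon:
# a thin piece of glue between a CAD vault and a supply-chain platform,
# so a purchase order automatically carries the right drawing revision.
# Both services are stubbed here; neither dict reflects a real product API.

vault = {  # stub for a Vault-like drawing store: part -> latest released drawing
    "PUMP-100": {"drawing": "PUMP-100-D03.pdf", "revision": "C"},
}

supply_chain = []  # stub for a TradeCloud-like order feed

def publish_order(part_number, quantity):
    """Adapter: enrich the purchase order with the latest released drawing."""
    drawing = vault.get(part_number)
    order = {
        "part": part_number,
        "qty": quantity,
        "drawing": drawing["drawing"] if drawing else None,
        "revision": drawing["revision"] if drawing else None,
    }
    supply_chain.append(order)
    return order

order = publish_order("PUMP-100", 4)
```

The design choice worth noting: the adapter owns no data of its own. It only moves and combines data between two services, which is why such connectors can stay small – and why the name of Cadac's adapter can be longer than its logic.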
I believe this is the future and potentially a breakthrough for the construction industry. As the connections between the stakeholders can vary per project, having a configurable combination of business services, supported by a cloud infrastructure, enables an efficient flow of data.
As a PLM expert, you might think all these startups with their solutions are not good enough for the real world of PLM. And currently they are not – I agree. However, disruption always comes unnoticed. I wrote about it in 2012 (The Innovators Dilemma and PLM).
Innovation happens when you meet people, observe and associate in areas outside your day-to-day business. For me, these two events connected some of the dots for the future. What do you think? Will a business process based on connected services become the future?
Sometimes we have to study carefully to see patterns. Have a look here at what is possible according to some scientists (click on the picture for the article).
Three weeks ago there was the Product Innovation conference in Düsseldorf. In my earlier post (here) I described what I experienced during this event. Now, after all the information has somehow been digested, here is a more high-level post describing the visible change in business and how it relates to PLM – trying to describe this change not in academic wording but in images. That is why I captured the upcoming change in the title: from linear to circular and fast.
Let me explain this image step by step
In the middle of the previous century, we were thinking linearly, in education and in business. Everything had a predictable path, and manufacturing companies pushed their products to the market – first locally, later more globally. Still, the delivery process was pretty linear:
This linear approach is reflected in how organizations are structured and how they are aligned to the different steps of the product development and manufacturing process. Below is a slide I used at the end of the nineties to describe the situation and the pain: a lack of visibility into what happens overall.
It is discouraging to see that this situation still exists in many companies.
At the end of the nineties / early 2000s, PLM was introduced, conceptually managing the whole lifecycle. In reality, it was mainly a tighter connection between design and manufacturing preparation, pushing data into ERP. The main purpose was managing the collaboration between different design disciplines and dispersed teams.
Jim Brown (Tech-Clarity) wrote a white paper at that time, still valid for many businesses, describing the complementary roles of PLM and ERP. See the picture below:
Jim introduced the circle and the arrow. PLM: a circle with iterations, interacting with ERP: the arrow for execution. Visually, it already became clear that an arrow does not behave the same way as a circle. The 100 % linearity in business was gone.
Let´s have a closer look at the PLM circle
This is how PLM is deployed in most organizations:
Information is pushed into the ERP system as disconnected information, no longer managed and connected to its design intent.
Next, the ERP system is most of the time not well equipped for managing after-sales and service content. Another disconnect comes up.
Yes, spare parts can be ordered through ERP, but issues appearing in the customer base are not stored in ERP; they are often stored in yet another separate system (if stored beyond email).
The result is that when working in the concept phase, there is no information available for R&D to gain a good understanding of how the market or customers work with their product. So how good will that concept be? Check in your company how well your R&D is connected with the field.
And then the change started …
This could have remained reality for a long time if there were not a huge business change upcoming: the world is becoming digital and connected. As a result, locally inefficient or regionally underperforming companies will be replaced by better-performing ones – the Darwin principle. And most likely the better-performing companies will come from the emerging markets, as they do not suffer from historical processes and "knowledge of the past". They can step into the digital world much faster.
In parallel with these fast-growing emerging markets, we discovered that we have to reconsider the way we use our natural resources to guarantee a future for the next generations. Instead of wasting resources to deliver our products, there is a need to reuse materials and resources, introducing a new circle: the circular economy.
The circular economy can have an impact on how companies bring products to the market. Instead of buying products (CAPEX), more and more organizations (and modern people) start using products or services in a rental model (OPEX): no capital investment anymore, pay as you go for usage or capacity.
The digital and connected world can have a huge impact on the products or services available in the near future. You are probably familiar with the buzz around “The Internet of Things” or “Smart and Connected”.
No longer do products depend on mechanical behavior only; more and more products rely on electrical components with adaptive behavior through software. Devices that connect with their environment report information back to the manufacturer. This allows companies to understand what happens with their products in the field and how to react to that.
Remember the first PLM circle?
Now we can create continuity of data!
Combine the circular economy with the digital and connected world, and you will discover everything can go much faster. A crucial inhibitor is how companies can reorganize themselves around this faster-changing, circular approach. Companies need to understand and react to market trends in the fastest and most adequate way. The future will probably be about lower volumes of the same products, higher variability towards the market, and most likely more and more combining of products with services (the experience model). This requires a flexible organization and most likely a new business model, different from the sequential, hierarchical organizations that we know at this moment.
The future business model?
The flexibility in products and services will increasingly come from embedded software or be supported by software services. Software services will increasingly be cloud-based, to avoid IT complexity and to provide scalability.
Software development and its integration with products and services are already a challenge for classical mechanical companies. They are struggling to transform their mechanically oriented design process into one that supports software. In the long term, the software design process could even become the primary process, which would mean a change from (sequential, streamlined) lean towards (iterative, SCRUM-based) agile.
Once again, we see the linear process becoming challenged by the circular iterations.
This might be the end of lean organizations, which will potentially have to mix with agile concepts.
Whether it was a coincidence or not, I cannot judge; however, during the PI Conference I learned about W.L. Gore & Associates, with their unique business model supporting this more dynamic future. There is no need for a massive organizational re-org to align the business, as the business is aligning itself all the time through its employees.
Last weekend, I discovered Semco Partners in the newspaper, and I am sure there are more companies organizing themselves to become reactive instead of linear – certainly in the high-tech world.
Linearity is disappearing in business; it is all about reactive, multidisciplinary teams within organizations supporting customers and their fast-changing demands.
Fast reactions need new business organization models (flexible, non-hierarchical) and new IT support models (business information platforms – no longer PLM/ERP system thinking).
What do you think? The end of linear?
I have talked enough about platforms recently. Still if you want to read more about it:
Engineering.com: Product Innovation Platform: Plug'n'play next generation PLM
Gartner: Product Innovation Platforms
VirtualDutchman: Platform, Backbone, Service Bus or BI
Currently, I am preparing my sessions for the upcoming Product Innovation conference in Düsseldorf. See: www.picongress.com. My first session will be about PLM upgrades and how to deal with them in the future. It is a challenging topic, as some PLM vendors claim that with their product there will be no upgrade problems, and that cloud-based solutions will also provide seamless upgrades in the future.
Don't cheer too early when you see these kinds of messages. I had the chance to look back at what happened with PLM over the past twenty years, and I tried to look forward to what might happen in the upcoming ten years.
This led to some interesting thoughts that I will share in detail during the conference, and I will come back to this topic in this blog afterwards. Here are some unstructured thoughts that passed my mind recently while preparing this session.
Not every upgrade is the same!
First, there was an interesting blog post from Ed Lopategui from E(E) with the title There is No Upgrade, where he addresses the difference between consumer software and enterprise software. Where consumer software is used by millions and tested through long alpha and beta cycles, PLM software often comes to the market in what you could consider a beta stage, with limited testing.
Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customers' demands. They do not have the time and budget to invest in the details of the solution; for this reason, PLM solutions will remain a kind of framework.
In addition, when a solution is not 100 % complete, the customer will adapt it, which makes later upgrades not 100 percent guaranteed or compatible. More details on PLM upgrades after the conference; now let's look into the near future.
The Future of PLM resides in Brussels!
Some weeks ago, I was positively amused by some messages from Roger Tempest (PLM Interest Group) related to the future of PLM. Roger claims the PLM industry is effectively rudderless. For that reason, Roger announced the launch meeting for the PLM International Research Foundation,
“simple because such a platform does not yet exist.”
I checked whether an ERP International Research Foundation existed, but I only found references to SAP. So what makes the PLM International Research Foundation unique?
According to Roger, the reason behind this initiative is the lack of clear targets for PLM. I quote:
The lack of detailed thought means that many future possibilities for PLM are just not being considered; and the lack of collective thought means that even the current initiatives to improve PLM remain fragmented and ineffective
As I mentioned in the previous paragraph, PLM vendors are in a kind of rat race to keep up with market demands and rapidly changing business, meanwhile building on their core technology. Not an easy game, as they cannot start from scratch, but for sure – and here I agree – they do not optimize their portfolio.
Who can and will take part in such a research forum?
The same holds for companies implementing PLM systems. They are looking for solutions in the market that improve their businesses. This might be a PLM system, but perhaps other components bring even higher value. Is ALM or SLM part of PLM, for example? This is a challenge: who defines what PLM is, and where are its boundaries?
This leaves the activity to the academics; for sure, they will have the most advanced and futuristic vision of what is possible conceptually. From my observations, the main challenge with PLM currently is that even the vendors are ten years ahead in their capabilities compared to what most companies are asking for. For the academic approach, I still have to think of Monty Python's sketch about soccer. See below.
Sorry for the generalization. I believe we should not focus on what PLM is and how it should be defined. What we now call PLM is entirely different from what we called PLM 10 years ago; see my last year's post PLM is changing. I think the future focus should be on how we are going to deal with business platforms, which contain PLM facets.
The PLM future
Interestingly enough, we are on the brink of a new business paradigm due to globalization and digitization, as you might have read in my recent posts. There are analysts, consultancy firms and research foundations all describing this challenging future.
Have a look at Verdi Ogewell's article at Engineering.com: Product Innovation Platform: Plug'n'play next generation PLM. The article is a summary of the platform discussion during the PDT 2014 conference, which I consider one of the best conferences if you want to go into the details. See also my post: The weekend after PDT 2014.
The future is about innovation and/or business platforms where data is available based on a federated approach, not necessarily based on a single, monolithic PLM platform.
Focusing on standardization and openness of such a platform is for me the central mission we have.
Remember: Openness is a right, not a privilege.
Let PLM vendors and other application providers develop their optimized services for individual business scenarios; that will remove the borders of system thinking. Academic support will be needed to solve the interoperability and openness required for initiatives like Industry 4.0 and IDC's third platform.
I am looking forward to interesting discussions at the upcoming PI conference, but also with peers in my network.
The future is challenging – but will it still be named PLM?
A year ago I wrote a blog post questioning if the construction industry would learn from PLM practices in relation to BIM.
In that post, I described several lessons learned from other industries. Topics like:
- Working on a single, shared repository of on-line data (the Digital Mock Up). Continuity of data based on a common data model – not only 3D
- It is a mindset. People need to learn to share instead of own data
- Early validation and verification based on a virtual model. Working in the full context
- Planning and anticipation for service and maintenance during the design phase. Design with the whole lifecycle in mind (and being able to verify the design)
The comments on that blog post already demonstrated that the worlds of PLM and BIM are not 100 percent comparable and that there are some serious inhibitors preventing them from coming closer. One year later, let's see where we are:
BIM moving into VDC (or BLM?)
The first visible trend is that people in the construction industry start to use the term Virtual Design and Construction (VDC) more and more instead of BIM (Building Information Model or Building Information Management?).
The good news here is that there is less ambiguity with the term VDC than with BIM. Does this mean many BIM managers will change their job title? Probably not, as most construction companies are still in the learning phase of what a digital enterprise means for them.
Still, Virtual Design and Construction focuses a lot on the middle part of the full lifecycle of a construction. VDC does not necessarily connect to the early concept phase, and it almost neglects the operational phase. The last phase is often ignored, as construction companies are not thinking (yet) about repair & maintenance contracts (the service economy).
And surprisingly, last week I saw a blog post from Dassault Systemes, where Dassault introduced the word BLM (Building Lifecycle Management). This blog post also started some LinkedIn discussions. BLM, according to Dassault Systemes, is the combination of BIM and PLM – read the post here.
The challenge for construction companies, however, is to determine which related data sets they require and how to create this continuity of data. This brings us to one of the most important inhibitors.
Where in other industries a clear product data owner exists, the ownership of data in EPC (Engineering, Procurement, Construction) companies, typical for the construction and oil & gas industries, is most of the time deliberately vague.
First of all, the owner of a construction often does not know which data would be relevant to maintain. And secondly, as soon as the owner asks for more detailed information, he will have to pay for it, raising costs that do not directly flow back as benefits – only later, during the FM (Facility Management) / operational stage.
And let's imagine the owner could get all the data required. Then the owner is at risk, as having the information might make him liable for mistakes and claims.
From discussions with construction owners, I learned that their policy is not to aim for the full dataset related to a construction; it reduces the risk of being liable. Imagine Boeing or Airbus following this approach. This brings us to another important inhibitor.
A risk shifting business
The construction industry is still a risk-shifting business, where each party tries to pass the risk of the cost of failure on to another stakeholder in the pyramid. The most powerful owners/operators in the construction industry quickly push the risk down to their contractors and suppliers, and these companies then distribute the risk further down to their subcontractors.
If you do not accept the risk, you are no longer in the game. This is different from other industries, and I have seen this approach clash in a few situations.
For example, I was dealing with an EPC company that wanted to implement PLM. The company expected the PLM implementer to take a large part of the risk of the implementation, as they themselves always take the risk for their big customers when bidding for a project. Here there was a clash of cultures, as PLM implementers have learned that the success of a PLM implementation is hard to guarantee, because many soft values define that success. It is not a machine or platform that simply has to work after some time.
Another example was related to requirements management. Here the EPC company wanted to be clear and specific towards their customer. However, the customer reacted very strangely. Instead of being happy that the EPC company invested in more upfront thinking and analysis, the customer got annoyed, as they were not used to being specific so early in the process. They told the EPC company: "If you have so many questions, you probably do not understand the business."
So everyone in the EPC business is pushed to accept higher risk and uncertainty than in other industries. However, the big "reward" is that you are allowed a cost of failure of 15 – 20 percent without feeling bad. With this percentage, you would be out of business in other industries. And this brings us to another important inhibitor.
Accepted high cost of failure
As the industry accepts this high cost of failure, companies are not triggered to work differently or to redesign their processes in order to lower the inefficiencies. The UK government mandates BIM Level 2 for their projects starting in 2016 and beyond, to reduce the costs caused by these inefficiencies.
But will the UK government invest in facilitating and aiming for data ownership? Probably not, as governments do not aim to be extremely economical. For governments, not being liable has a bigger value than being more efficient, as I learned. Being more efficient is the message to the outside world to keep the taxpayer satisfied.
It is hard to change this way of thinking. It requires a cultural change throughout the whole value chain. And cultural change is the "worst" thing that can happen to a company – the biggest inhibitor.
Cultural change is a point that touches all industries; there is no difference between the construction industry and, for example, a classical discrete manufacturing company. Because of global competition and comparable products, other industries have already been forced to work differently in order to survive (and are still challenged).
The cultural change lies in people. We (the older generation) were educated and brought up with classical engineering models that reflect the post-Second-World-War best practices. Being important in a process is your job justification and job guarantee.
New paradigms, based on a digital world instead of a document-shifting world, need to be defined and matured and will make many classical data processing jobs redundant. Read this interesting article from the Economist: The Onrushing Wave
This is a challenge for every company. The need to implement this cultural change is, ironically, highest for the regions with the most legacy: Western Europe and the United States.
As these countries also have the highest labor costs, the impact of keeping on doing the old stuff will reduce their competitiveness. The impact on construction companies is smaller, as the construction industry is still a local business; in the end, resources will not travel the globe to execute projects.
However, cheaper labor becomes more and more available in every country. If companies want to utilize it, they need to change their process: they need to shift more thinking and knowledge towards the early lifecycle, to avoid needing highly qualified people in the field to fix errors.
Sharing instead of owning
For me, the major purpose of PLM is to provide an infrastructure for people to share information in such a manner that others, not aware of the information, can still easily find and use it in the relevant context of their activities. The value: people decide based on actual information and no longer spend their time reactively fixing errors caused by a lack of understanding of the context.
The problem for the construction industry is that I have not seen any vendor focusing on sharing the big picture. Perhaps the BLM discussion will be a first step. For the major tool providers, like Autodesk and Bentley, their business focus is on the continuity of their tools, not on the continuity of data.
Last week I noticed a cloud-based Issue Management solution, delivered by Kubus. Issue management is one of the typical, easy benefits a PLM infrastructure can deliver, in particular if issues can be linked to projects, construction parts, processes and customers. If this solution becomes successful, the extension might be to add more data elements to the cloud solution. The main question remains: who owns the data? Have a look:
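To show why the linking matters, here is a tiny, hypothetical data model – my own sketch, not the Kubus product: once an issue carries references to its context, anyone can ask "which open issues touch this project or part?" without digging through email.

```python
# Hypothetical sketch of linked issue management (not the Kubus product):
# each issue holds references to its context objects, so issues become
# queryable by project, construction part, process or customer.

from dataclasses import dataclass, field

@dataclass
class Issue:
    title: str
    status: str = "open"
    links: dict = field(default_factory=dict)  # e.g. project, part, customer

issues = [
    Issue("Leaking seal", links={"project": "P-12", "part": "Facade-B"}),
    Issue("Late delivery", status="closed", links={"project": "P-12"}),
    Issue("Crack in beam", links={"project": "P-7", "part": "Beam-3"}),
]

def open_issues_for(key, value):
    """All open issues linked to a given context object."""
    return [i.title for i in issues
            if i.status == "open" and i.links.get(key) == value]

print(open_issues_for("project", "P-12"))  # -> ['Leaking seal']
```

The same links are what makes the ownership question so sharp: whoever controls this issue store effectively controls the project's institutional memory.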
For continuity of data, you need standards and openness – IFC is one of the many standards needed in the full scope of collaboration. Other industries are further along in their standards, driven by end-user organizations instead of vendors. Companies should argue with their vendors that openness is a right, not a privilege.
A year ago, I was more optimistic about the construction industry adopting PLM practices. What I have learned this year, also based on feedback from others, is that we are not at the turning point yet. Change is difficult to achieve from one day to the next. Meanwhile, the stakeholders in the construction industry's value chain all have different objectives. Nobody will take the risk or can afford the risk.
I remain interested to see where the construction industry is heading.
What do you think – will 2015 be the year of a breakthrough?
Shaping the PLM platform of the Future
It was the first time I attended this event, and I was positively surprised by the audience and the content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) looked in more detail at the future of PLM. Themes like PLM platforms, the circular economy, open standards and longevity of data were presented and discussed.
The emergence of the PLM platform
1. The product lifecycle will become more and more circular due to changing business models, and in parallel, the different usage/availability of materials will have an impact on how we design and deliver products.
Can current processes and tools support today's complexity? And what about tomorrow? According to a CIMdata survey, there is a clear difference in profit and performance between leaders and followers, and the gap is increasing. "Can you afford to be a follower?" is a question companies should ask themselves.
Rethinking the PLM platform does not bring a 2 – 3 % efficiency benefit; it can bring benefits of 20 % and more.
Peter sees a federated platform as a must for companies to survive. I particularly liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Steven Vetterman from ProSTEP talked about PLM in the automotive industry. Steven started by describing the change in the automotive industry, quoting Heraclitus: Τα πάντα ρεί ("everything flows") – the only constant is change. Steven described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change of the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050 more than 50 % of the world population (estimated at almost 10 billion people by that time) will be living in Asia and 25 % in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical designs. Neither are they open for a true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementation. There is a need for open PLM, not driven from a single PLM system, but based on a federated environment of information.
Yves Baudier spoke on behalf of the aerospace industry about the standardization effort of the Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE Systems and more. If you look at the ASD Radar, you get a feeling for the complexity of standards that exist and are relevant for the Airbus group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. Yves's conclusion, like that of all the previous speakers, was that the PLM platform of the future will be federative, and standards will enable PLM interoperability.
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of the current trends in their business and the role of PLM in Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive costs down and, above all, improve business agility, as the future is in flexibility. Shefali gave an overview of their PLM roadmap covering the aspects of PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space), the backbone providing connectivity of data between all lifecycle stages and external partners (customers/suppliers) based on the PLCS standard. Again, another session demonstrating that the future of PLM is in an open and federated environment.
The future PLM platform is a federated platform which adheres to standards, provides openness of interfaces that permits the platform to be reliable over multiple upgrade cycles, and is able to integrate third parties (Peter Bilello)
In the afternoon I followed the Systems Engineering track. Peter Bilello gave an overview of model-based systems engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. Indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is confused with the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that systems engineering should be a part of PLM, and he gave a very decent, academic overview of how it all relates. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual system view and a physical product view:
More Industry voices
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, of every 100 pieces of information exchanged and managed during the full lifecycle, only 5 are managed during the initial design phase (BIM), 20 during the construction phase (BAM) and finally 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes from the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrating with the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Amir Rashid described the need for PLM in the consumer product markets, citing the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the products, and everything is scrapped to landfill afterward. An interesting quote from Amir: Sustainability's goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of its product development since 1984. The diagram below demonstrates how the circular economy, when well orchestrated, can impact all business today.
Marc Halpern closed the tracks with his presentation on product innovation platforms, describing how product design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understanding and analyzing big data), adaptability (flexible to integrate and maintain through an open service-oriented architecture), reuse (identifying similarity based on metadata and geometry), discovery (the integration of search, analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner's vision, there is still a lot of work for the PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent the implementation of this vision:
- Inadequate openness (pushing back open collaboration)
- Incomplete standards (blocking implementation of openness)
- Uncertain cloud performance (the future is in cloud services)
- The steep learning curve (it is a big mind shift for companies)
- Cyber-terrorism (where is your data safe?)
After Marc´s session there was an interesting panel discussion with some of the speakers of the day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: what about change management?
A topic that could fill the rest of the week, but the PDT dinner was waiting – a good place to network and digest the day.
Day 2 started with two interesting topics. The first presentation was a joint presentation by Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones). The topic was the obsolescence of information systems: hardware and PLM applications. In the aerospace industry, some data needs to be available for 75 years, and you can imagine that during 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a proof of concept done with virtualization of different platforms supporting CATIA V4/V5 using Linux, Windows XP, W7 and W8 – which covers just a small part of all the data.
The conclusion from this session was:
To benefit from the PLM of the future, the PLM of the past has to be managed. Migration is not the only answer. Look for solutions that exist to mitigate risks and reduce the costs of PLM obsolescence. Usage of and compliance to standards is crucial.
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: Interoperability is a right, not a privilege.
In the systems engineering track, Kent Freeland talked about nuclear knowledge management and CM in systems engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management is not the same as storing information in a central place; it is about building and providing data in context so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for systems engineering based on ISO 15926-11. His simplified approach compared to ISO 15288 led to several discussions between supporters and opponents during lunchtime.
Master Data Management
Based on the type of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements, I also see more and more that digital transformation leads to MDM questions. Can we replace Excel files with mastered data in a database?
Almost at the end of the day I spoke about the PLM platform of the future, targeted at the people of the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to own and capture as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning (information) is not crucial for them – perhaps because the world is moving so fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in it. PLM is no longer cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately, I could not attend all sessions, as there were several parallel sessions, nor have I written about all the sessions I attended. The PDT Europe conference – a conference for people who care about the details around future PLM concepts and the usage of standards – is a must for future strategists.
This is, for the moment, the last post about the difference between a file-oriented and a data-oriented approach. This time I will focus on the need for open exchange standards and their relation to proprietary systems. In my first post, I explained that a data-centric approach can bring many business benefits, and I pointed to background information for those who want to learn more in detail. In my second post, I gave the example of dealing with specifications.
It demonstrated that the real value of a data-centric approach comes at the moment information changes over time. For a specification that is right the first time and never changes, there is less value to gain from a data-centric approach. But then, aren't we still dreaming that we do everything right the first time?
The specification example was based on dealing with text documents (sometimes called 1D information). The same benefits are valid for diagrams and schematics (2D information) and CAD models (3D information).
The challenge for a data-oriented approach is that information needs to be stored as data elements in a database, independent of an individual file format. For text, this might be easy to comprehend, as text elements are relatively simple to understand. Still, the OpenDocument standard for office documents is based, behind the scenes, on a lot of technical know-how and experience to make it widely acceptable. For 2D and 3D information this is less obvious, as this is the domain of the CAD vendors.
CAD vendors have various reasons not to store their information in a neutral format.
- First of all, and most importantly for their business, a neutral format would reduce the dependency on their products. Other vendors could work with these formats too, reducing the potential market capture. You could say that, in a certain manner, the Autodesk 2D format DXF (and even DWG) has become a neutral format for 2D data, as many other vendors have applications that read and write information in the DXF format. So far DXF is stored in a file, but you could also store DXF data inside a database and make it available as elements.
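To illustrate that last point: DXF's ASCII form is just a sequence of (group code, value) pairs, which makes it easy to decompose a file into individual data elements that could live as database rows instead of a monolithic file. A minimal sketch, assuming a tiny invented DXF fragment and an illustrative table layout (not any vendor's actual schema):

```python
import sqlite3

def parse_dxf_tags(text):
    """Split DXF ASCII content into (group_code, value) pairs.

    In DXF, each tag occupies two lines: an integer group code
    followed by its value.
    """
    lines = [ln.strip() for ln in text.splitlines()]
    return [(int(lines[i]), lines[i + 1]) for i in range(0, len(lines) - 1, 2)]

# A tiny, hypothetical DXF fragment: one LINE entity on layer "Walls".
SNIPPET = "0\nLINE\n8\nWalls\n10\n0.0\n20\n0.0\n11\n5.0\n21\n3.0"

# Store each tag as its own row, so every element is individually addressable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dxf_tag (seq INTEGER, group_code INTEGER, value TEXT)")
conn.executemany(
    "INSERT INTO dxf_tag VALUES (?, ?, ?)",
    [(i, code, value) for i, (code, value) in enumerate(parse_dxf_tags(SNIPPET))],
)

# Group code 8 holds the layer name - now queryable without re-reading a file.
layer = conn.execute("SELECT value FROM dxf_tag WHERE group_code = 8").fetchone()[0]
print(layer)  # -> Walls
```

The point is not the parser but the shift: once the tags are rows, any application can query or update a single element instead of rewriting the whole file.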
- This brings us to the second reason why using neutral data formats is not that evident for CAD vendors: it reduces their flexibility to change the format and optimize it for maximum performance. Commercially, the significant, immediate disadvantage of working in neutral formats is that they have not been designed for the particular needs of an individual application, and therefore any "intelligent" manipulations of the data are hard to achieve.
The same reasoning can be applied to 3D data, where different neutral formats exist (IGES, STEP, …). It is very difficult to identify a common 3D standard without losing many of the benefits that an individual 3D CAD format currently brings. For example, CATIA handles 3D CAD data in a completely different way than Creo, which again differs from NX, SolidWorks, Solid Edge and Inventor, even though some of them might use the same CAD kernel.
However, it is not only about the geometry anymore; the shapes represent virtual objects that have metadata describing them. In addition, other related information exists, not necessarily coming from the design world, like tasks (planning), parts (physical), suppliers, resources and more.
PLM, ERP, systems and single source of truth
This brings us into the world of data management – in my world, mainly PLM systems and ERP systems. An ERP system is already a data-centric application: the BOM is available as metadata, as well as all the scheduling and interaction with resources, suppliers and financial transactions. Still, ERP systems store a lot of related documents and drawings, containing content that does not match their data model.
PLM systems have gradually become more and more data-centric, as their origin was around engineering data, mostly stored in files. In a data-centric approach, there is the challenge of exchanging data between a PLM system and an ERP system. Usually there is a need to share information between the two systems, mainly the items. Different definitions of an item on the PLM and ERP sides make it hard to exchange information from one system to the other. It is for that reason that there are so many discussions around PLM and ERP integration and the BOM.
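As a minimal sketch of what such an exchange involves, consider translating an item record between two hypothetical schemas. All field names here are invented for illustration – real PLM and ERP systems each have their own, much richer item models:

```python
# Hypothetical field mapping from a PLM item to an ERP item.
# Neither schema comes from a real product; they only illustrate that
# the two systems describe the "same" item with different definitions.
PLM_TO_ERP = {
    "item_id": "material_number",
    "revision": "version",
    "name": "description",
}

def plm_to_erp(plm_item, mapping=PLM_TO_ERP):
    """Translate a PLM item dict into the ERP item shape."""
    erp_item = {
        erp_field: plm_item[plm_field]
        for plm_field, erp_field in mapping.items()
        if plm_field in plm_item
    }
    # The ERP side needs attributes PLM never tracks; a default has to be
    # chosen somewhere - exactly the kind of decision integrations argue about.
    erp_item.setdefault("procurement_type", "make")
    return erp_item

print(plm_to_erp({"item_id": "A-100", "revision": "B", "name": "Bracket"}))
```

Even this toy mapping shows why the discussions never end: every renamed field, dropped attribute and default value is a business decision, not a technical one.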
In the modern data-centric approach, however, we should think less and less in systems and more and more in business processes performed on actual data elements. This requires a company-wide – actually an enterprise-wide or industry-wide – data definition of all information that is relevant for the business processes. This leads into Master Data Management, the new required skill for enterprise solution architects.
The data-centric approach creates the impression that you can achieve a single source of truth, as all objects are stored uniquely in a database. SAP solves the problem by stating that everything fits in their single database. In my opinion, this is more of a black-hole approach: everything gets inside, but even light cannot escape. Usability and reuse of information that was stored without the intention of ever being found again is the big challenge here.
Other PLM and ERP vendors have different approaches. Some choose a service-bus architecture, where applications in the background link and synchronize common data elements from each application. There is some redundancy; however, everything is connected. More and more PLM vendors focus on building a platform of connected data elements on top of which applications run, like the 3DExperience platform from Dassault Systèmes.
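The service-bus idea can be sketched in a few lines: each application keeps its own copy of a shared element, and the bus fans updates out to the others. This is a deliberately naive illustration – real enterprise service buses add queuing, transformation and conflict handling:

```python
class ServiceBus:
    """Naive in-memory bus: fan out updates to every other subscriber."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, app):
        self.subscribers.append(app)

    def publish(self, element_id, data, source):
        for app in self.subscribers:
            if app is not source:
                # Redundant copies, but every application stays connected.
                app.local_copy[element_id] = data

class App:
    def __init__(self, name, bus):
        self.name = name
        self.local_copy = {}
        bus.subscribe(self)

    def update(self, bus, element_id, data):
        self.local_copy[element_id] = data
        bus.publish(element_id, data, source=self)

bus = ServiceBus()
plm, erp = App("PLM", bus), App("ERP", bus)
plm.update(bus, "item/A-100", {"revision": "B"})
print(erp.local_copy["item/A-100"])  # -> {'revision': 'B'}
```

The trade-off is visible even here: the data exists twice, but both applications see the same element without either one owning the other.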
As users, we are more and more used to platforms, as Google and Apple already provide them in the cloud for common use on our smartphones. The large number of apps run on shared data elements (contacts, locations, …) and store additional proprietary data.
Platforms, Networks and standards
And here we enter an interesting area of discussion. I think it is a given that a single-database concept is a utopia. Therefore, it will all be about how systems and platforms communicate with each other to provide, in the end, the right information to the user. The systems and platforms need to be data-centric, as we learned from the discussion around the document-centric (file-centric) versus data-centric approach.
In this domain, several companies have already been active for years. Datamation, from Dr. Kais Al-Timimi in the UK, is such a company. Kais is a veteran of the PLM and data modeling industry, and they provide a platform for data-centric collaboration. This quote from one of his presentations illustrates that we share the same vision:
“……. the root cause of all interoperability and data challenges is the need to transform data between systems using different, and often incompatible, data models.
It is fundamentally different from the current Application Centric Approach, in that data is SHARED, and therefore, ‘NOT OWNED’ by the applications that create it.
This means in a Data Centric Approach data can deliver MORE VALUE, as it is readily sharable and reusable by multiple applications. In addition, it removes the overhead of having to build and maintain non-value-added processes, e.g. to move data between applications.”
Another company in the same domain is Eurostep, who are also focusing on business collaboration in various industries. Eurostep has been working with various industry standards, like AP203/214, PLCS and AP233, and has developed their Share-A-space platform to enable data-centric collaboration.
This type of data collaboration is crucial for all industries. Where the aerospace and automotive industries are probably the most mature on this topic, the process industry and construction industry are currently also focusing on discovering data standards and collaboration models (ISO 15926 / BIM). It will probably be the innovators in these industries that clear the path for others. For sure it will not come from the software vendors, as I discussed before.
If you have reached this line, it means the topic interests you in depth. In the past three posts – starting from the future trend, then an example, and then the data modeling background – I have tried to describe what is happening in a simplified manner.
If you really want to dive into the PLM of the future, I recommend you visit the upcoming PDT 2014 conference in Paris on October 14 and 15. Here experts from different industries will present and discuss the future PLM platform and its benefits. I hope to meet you there.
Some more to read:
Last week I attended the PI Apparel conference in London. It was the second time this event was organized, and approximately 100 participants were there for two full days of presentations and arranged network meetings. Last year I was extremely excited about this event, as the audience, compared to classical PLM events, was different and much more business-focused.
Read my review from last year here: The weekend after PI Apparel 2013
This year I had the feeling that the audience was somewhat smaller, missing some of the US representatives, and perhaps there was a slightly more visible influence from the sponsoring vendors. Still an enjoyable event, and hopefully next year, when this event will be hosted in New York, it will be as lively as last year.
Here are some of my observations.
Again the event had several tracks in parallel besides the keynotes, and I look forward in the upcoming month to seeing the sessions I could not attend. Obviously, where possible, I followed the PLM-focused sessions.
The first keynote came from Micaela le Divelec Lemmi, Executive Vice President and Chief Corporate Operations Officer at Gucci. She talked us through the areas she supervises and gave some great insights. She talked about how Gucci addresses sustainability through risk and cost control: which raw materials to use, how to ensure the brand's reputation is not at risk, price volatility, and the war for talent. As Gucci is a brand in the high-end price segment, image and reputation are critical, and they have the margins to ensure these are managed. Micaela also spoke about the short-term financial goals that a company like Gucci has towards its investors. The topics she mentioned (I did not write them down, as I was tweeting when I heard them) were certainly worthwhile to consider and discuss in detail with a PLM consultant.
Micaela further described Gucci´s corporate social responsibility program, with a focus on taking care of people, environment and culture. Good to learn that human working conditions and rights are a priority, even for their supply chain – although it might be noted that 75 % of Gucci´s supply chain is in Italy. One of the few brands that still carries the "Made in Italy" label.
My conclusion was that Micaela did an excellent PR job for Gucci, which you would expect for a brand with such a reputation. Later during the conference we discussed whether other brands with less exclusivity, operating more in the mass consumer domain, would be able to come even close to such programs.
The company is successful in manufacturing and selling licensed products from Pierre Cardin, Cacharel and US Polo Association mainly outside the US and Western Europe.
Their primary focus was to provide access to the most accurate and most up-to-date information from one source. In parallel, standardization of codes and tech packs was a driver; through standardization, quality and (re)use could be improved, and people would better understand the details. Additional goals were typical PLM goals: following the product development stages along the timeline, notifying relevant users about changes in the design, working on libraries and reuse, and integrating with SAP.
Interestingly, Hakan mentioned that in their case SAP did not recommend using its own system for the PLM-related part, due to a lack of knowledge of the apparel industry. A wise decision that would deserve follow-up in other industries.
In general, the PLM implementation described by Göktug and Hakan was well phased, with a top-down push to ensure there was no escape from making the change. As with most PLM implementations in apparel, they went live with their first phase rather fast, as the complex CAD integrations of classical PLM implementations were not needed here.
Next I attended the Infor session with the title Work the Way you Live: PLM built for the User – a smooth marketing session with a function/feature demo demonstrating the flexibility and configuration capabilities of the interface. Ease of use is crucial in the apparel industry, where Excel is still the biggest competitor. While Excel might satisfy the needs of the individual, it lacks the integration and collaboration aspects a PLM system can offer.
More interesting was the next session, from Marcel Oosthuis, who as Process Re-Engineering Director (read: PLM leader) was responsible for implementing PLM at Tommy Hilfiger. It was an excellent story (perhaps too good to be true).
I believe larger companies with the right focus and investment in PLM resources can achieve these kinds of results. The target for Tommy Hilfiger´s PLM implementation was beyond 1000 users – therefore, a serious implementation.
Upfront, the team first defined what they expected from the PLM system to be selected (excellent!). As the fashion industry is fast, demanding and changing all the time, the PLM system needs to be swift, flexible and prepared for change. This was not a classical PLM requirement.
In addition, they were looking for a highly configurable system, providing best practices, and a vendor with a roadmap they could influence. Here I got a little more worried, as highly configurable and best practices do not always match the prepared-for-change approach. A company might be tempted to automate the way it used to work (best practices from the past).
It was good to hear that Marcel did not have to go through the classical ROI approach for the system. His statement, which I fully endorse, was that it is about the capability to implement new and better processes, which are often not comparable with the past (and nobody measured the past).
Marcel described how the PLM team (eight people plus three external people from the PLM vendor) made sure that the implementation was done with the involvement of the end users. End-user adoption was crucial, as was key-user involvement when building and configuring the system.
It was one of the few PLM stories where I hear how all levels of the organization were connected and involved.
Next, Sue Butler, director at Kurt Salmon, described how to maximize the ROI from your PLM investment. It is clear that many PLM consultants are aligned, and Sue brought up all the relevant points and angles you need to look at for a successful PLM implementation.
Her main point: PLM is about changing the organization and processes, not about implementing a tool. She also made the point that piloting the software is necessary as part of the learning and validation process. I agree, on the condition that it is an agile pilot that does not take months to define and perform. Otherwise, you might already be locked into the tool vision too much – focus on the new processes you want to achieve.
Moreover, because Sue was talking about maximizing the ROI of a PLM implementation, the topics of focusing on business areas that support evolving business processes and of measuring (make sure you have performance metrics) came up.
The next session, Staying Ahead of the Curve through PLM Roadmap Reinvention, conducted by Austin Mallis, VP Operations at Fashion Avenue Sweater Knits, beautifully complemented the previous PLM-related sessions.
Austin talked nicely about setting the right expectations for the future (there is no perfect solution / success does not mean stop / keep the PLM vision / no true end). In addition, he described the human side of the implementation: how to get everyone on board (if possible), and admitting you cannot get everyone on board with the new way of working.
Luckily the speakers before me that day already addressed many of the relevant topics, and I could focus on three main thoughts completing the story:
1. Who decides on PLM and Why?
I published the results of a small survey I ran a month ago via my blog (A quick PLM survey). See the main results below.
It was interesting to observe that the management and the users in the field together form the majority demanding PLM. Consultants have some influence, and PLM vendors even less. The big challenge for a company is that management and consultants often talk about PLM from a strategic point of view, whereas the PLM vendor and the users in the field are more focused on the tool(s).
From the expectations, you can see that the majority of PLM implementations are about improving collaboration, followed by time to market, increasing quality, and centralizing and managing all related information.
2. Sharing data instead of owning data
You might have read about it several times in my blog: the trend that we are moving to platforms with connected data instead of file repositories. This should have an impact on your future PLM decisions.
3. Choosing the right people
The third and final thought was about choosing the right people and understanding the blockers. I elaborated on that topic before, in my recent blog post PLM and Blockers.
My conclusions for the day were:
A successful PLM implementation requires a connection in communication and explanation between all these levels, in order to get a company aligned and to have an anchored vision before even starting to implement a system (with the best partner).
The day was closed by the final keynote from Lauren Bowker, heading T H E U N S E E N. She and her team are exploring combinations of chemistry and materials to create new fashion artifacts: clothes and materials that change color based on airflow, air pollution or brain patterns. New and inspiring directions for fashion lovers.
Have a look here: http://seetheunseen.co.uk/
The morning started with Suzanne Lee, heading BioCouture, who is working on various innovative methodologies to create materials for the apparel industry using all kinds of living micro-organisms, like bacteria, fungi and algae, and materials like cellulose, chitin and protein fibers, which can all provide new possibilities for sustainability, comfort, design, etc. Suzanne´s research is about exploring these directions, perhaps shaping some new trends 5-10 years ahead. Have a look into the future here:
Renate Eder took us into the journey of visualization within Adidas, with her session: Utilizing Virtualization to Create and Sell Products in a Sustainable Manner.
It was interesting to learn that ten years ago she started the process of having more 3D models in the sales catalogue. Where classical manufacturing companies nowadays start from a 3D design, at Adidas 3D starts at the end of the sales cycle. Logical, if you see the importance and value 3D can have for mass-market products.
Adidas was able to get 16,000 products into their 3D catalogue thanks to the work of 60 of their key suppliers, who were fully integrated in the catalogue process. The benefit of this 3D catalogue is that their customers, often the large stores, need fewer samples, and the savings there are significant (plus it is a digital process instead of transferring goods).
An interesting discussion during the Q&A was that the virtual product might even look more perfect than the real product, demonstrating how lifelike virtual products can be.
And now Adidas is working further backwards, from production patterns (using 3D) to, ultimately, 3D design. Although a virtual 3D product cannot 100 % replace the fit and feel of the material, Renate believes that introducing 3D during design can also reduce the work done during pilots.
Finally, for those who stayed till the end, there was something entirely different: Di Mainstone elaborating on her project Merging Architecture & the Body in Transforming the Brooklyn Bridge into a Playable Harp. If you want something entirely different, watch here:
The apparel industry remains an exciting industry to follow. Some concepts are crucial here: being data-centric, insanely flexible, continuously changing, and rapid time to market.
This might lead the development of PLM offerings for the future, including those based on cloud technology.
On the other side, the PLM market in apparel is still very basic and learning – see this card I picked up from one of the vendors: a focus on features and functions, not (yet) touching the value.
Everyone wants to be a game changer, yet in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.
Why? Read the following analogy.
With my Dutch roots and passion for soccer, I saw the first example of game changing happen in soccer in 1974: the game where 22 players kick a ball from side to side, and the Germans win in the last minute.
My passion and trauma started that year where the Dutch national team changed the soccer game tactics by introducing totaalvoetbal.
Defenders could play as forwards and the other way around. Combined with the offside trap, the Dutch team reached the finals of the world championship in both 1974 and 1978 – of course losing the final both times to the home-playing team (Germany in '74, Argentina in '78, with some help from the referee, we believe).
This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept that fast.
At the same time, in the mid-1970s, a game changer for business was emerging: the PC.
In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not yet common, and the first mouse device and the Intel 8008 processor were coming to the market.
This was disruptive innovation at that time, as we would realize 20 years later. The PC was a game changer for business.
Johan Cruyff remained a game changer. When he started to coach and influence the Barcelona team, it was his playing concept tiki-taka that brought the Spanish national team and the Barcelona team to the highest, unbeatable level in the world for the past 8 years.
Instead of having strong and tall players to force yourself to the goal, it was all about possession and control of the ball. As long as you have the ball the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to pass when trying to recapture the ball.
This was a game changer, hard to copy overnight – until the past two years. Now other national teams and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.
Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.
Also, PLM was supposed to be a game changer in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game ?
PLM at that time connected departments and disciplines with each other in a digital manner, no matter where they were around the globe. And since the information was stored in centralized places – databases and file-sharing vaults – it created the illusion that everyone was working with the same sets of data.
The major successes of PLM in this approach are coming from efficiency through digitization of data exchange between departments and the digitization of processes. Already a significant step forward and bringing enough benefits to justify a PLM implementation.
Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, the game change is all about different behavior to reach the goal; it is not about better tools (or shoes).
The PLM picture shows the ideal 2006 situation: how each department forwards information to the next department. But where was PLM supporting after sales/services in 2006 ? The connection between after sales/services and concept is in most companies not formalized or even existing. Yet exactly that connection should give the feedback from the market, from the field, to deliver better products.
The real game changer starts when people learn and understand how to share data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so other people can find it and use it.
People are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making data sharable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.
But our job is no longer safe as we see in the declining economies in Europe and the US. And the reason for that:
Data is changing the game
In recent years the discussion about BI (Business Intelligence) and Big Data emerged. There is more and more digital information available, and it became impossible for companies to own all the data, or even to think about storing the data themselves and sharing it among their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can (theoretically) be shared no matter where you are and no matter which device you are using, there is a huge potential to change the game.
It is a game changer as it is not about just installing the new tools and new software. There are two major mind shifts to make.
- It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100 % digital, your customer or supplier may still require a printed and wet-signed document or drawing as a legal confirmation of the transaction. Documents are comfortable containers to share, but they kill fast and accurate processing of the data inside them.
- It is about sharing and combining data. It does not make sense to dump data again into huge databases. The value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, other attributes are created by purchasing, and some attributes come directly from the supplier. Do not fall into the ERP trap of believing everything needs to be in one system and controlled by one organization.
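The part-attribute example above can be sketched in code. This is a minimal illustration of shared, discipline-owned data, not any real PLM system's data model; all names (disciplines, attributes, the helper function) are hypothetical:

```python
# Sketch: one part definition composed from attribute sets that remain
# owned by different disciplines, instead of one system owning everything
# (the ERP trap). All names here are hypothetical, for illustration only.

engineering = {"material": "AISI 304", "mass_kg": 1.2}          # owned by engineering
purchasing = {"preferred_supplier": "ACME", "unit_cost": 4.75}  # owned by purchasing
supplier = {"lead_time_days": 14, "min_order_qty": 100}         # provided by the supplier

def shared_part_view(part_number, *attribute_sets):
    """Compose a read-only shared view; each discipline keeps ownership."""
    view = {"part_number": part_number}
    for attrs in attribute_sets:
        conflicts = view.keys() & attrs.keys()
        if conflicts:  # two disciplines claiming the same attribute
            raise ValueError(f"attribute ownership conflict: {conflicts}")
        view.update(attrs)
    return view

part = shared_part_view("P-1001", engineering, purchasing, supplier)
print(part["material"], part["unit_cost"], part["lead_time_days"])
```

The point of the sketch: the shared view is assembled on demand from the owning sources, rather than copied into one master system that controls everything.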
Because of the availability of data, the world has become global and more transparent for companies. And what you see is that the traditional companies in Europe and the US struggle with that. Their current practices are not tuned towards a digital world, but towards the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.
The upcoming economies have two major benefits:
- Not so much legacy; therefore, building a digital enterprise is easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
- The average cost of labor is lower than in Europe and the US; therefore, even if they do not do it right the first time, there is enough margin to spend more resources to meet the objectives.
The diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it is no longer a 100 % classical PLM scope. It is also about social interaction, supplier execution and logistics. These areas are not classical PLM domains, and therefore I have mentioned in the past that the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different ?
The big consultancy firms are all addressing this topic – not necessarily at the PLM level:
2012 Cap Gemini – The Digital advantage: …..
2013 Accenture – Dealing with digital technology’s disruptive impact on the workforce
For CEOs it is important to understand that the new, upcoming generations (generation Y and beyond) are already thinking in data. By nature, they are used to sharing data instead of owning data in many aspects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce
This cannot be learned from an ivory tower. The easiest way is not to be worried by this trend and continue working as before, losing business and margin slowly year by year.
In many businesses people are fired for making big mistakes; doing nothing, unfortunately, is most of the time not considered a big mistake, although it is the biggest one.
During the upcoming PI Conference in Berlin I will talk about this topic in more detail and look forward to meeting and discussing this trend with those of you who can participate.
The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the amount of players and balls involved can be unlimited.
The leagues I played in were always limited in scope: by age, locality, region, etc. Therefore it was easy to win within a certain scope, and there are millions of soccer champions besides me. For business, however, there are almost no borders.
Global competition will require real champions to make it work !!!
This last month I haven't been able to publish much about my experiences, as I have been in the middle of several PLM selection processes for various industries. Now, in a quiet moment looking back, I understand how difficult it is for a company to choose a PLM solution for the future.
I hope this post will generate some clarity and may lead to further discussion with other experts in the audience. I wrote about the do's and don'ts of PLM selection in 2010, and most of it is still valid; however, there is more. Some of the topics explained:
Do you really need PLM ?
This is where it starts. PLM is not Haarlemmerolie, an old Dutch medicine that has been a cure for everything since the 17th century. The first step is that you need to know what you want to achieve and how you are aiming to achieve it. Just because a competitor has a PLM system installed does not mean they use it properly, or that your company should do it too. If you do not know why your company needs PLM, stop reading and start investigating.
If you are still reading this, you are part of the happy few, as justifying the need for PLM is not easy. Numerous companies have purchased a PLM system just because they thought they needed PLM, or because someone was convinced that this software would bring PLM.
In most of these cases there was confusion with PDM. Simply stated: PDM is more of a departmental tool (engineering – multidisciplinary), whereas PLM is a mix of software and infrastructure to connect all departments in a company and support the product through its entire lifecycle.
Implementing “real” PLM is a business change, as people have to start sharing data instead of pushing documents from department to department. And this business transformation is a journey. It is not a fun journey, as nicely characterized in Ed Lopategui's blog post, the PLM Trail.
Although I believe it is not always that dramatic, Ed set the expectations right. Be well prepared before you start.
Why do companies still want PLM, while it is so difficult to implement?
The main reason is to remain competitive. If margins are under pressure, you can try to be more efficient, get better and faster tools. But by working in the old way, you can only be a little better.
Moving from a sequential, information-pushing approach towards an online, global information-sharing way of working is a change in business processes. It requires interaction between all stakeholders. Doing things differently requires courage, understanding and trust that you made the right choice. When it goes wrong, there are enough people around you to point fingers at why it went wrong – hindsight is so easy.
Doing nothing and becoming less and less competitive is easier (the boiling frog again), as in that case the outside world will be blamed and there is nobody to point fingers at (although if you understand the issue, you should make the organization aware that the future is at stake)
Why is PLM so expensive?
Assuming you are still reading, and you and your management are aligned that there is a need for PLM, a first investigation into possible solutions will reveal that PLM is not cheap.
When you calculate the overall investment required for PLM, the management often gets discouraged by the estimated costs. Yes, the benefits are much higher, but to realize these benefits, you need a clear understanding of your own business and a realistic idea of how the future will look. The benefits are not in efficiency. The main benefits come from capabilities that allow you to respond better and faster than by just optimizing your departments. I recently read a clarifying post addressing this issue: Why PLM should be on every Executive's agenda !
From my experience with PLM projects, it is surprising to learn that companies do not object to spending 5 to 20 times more money on an ERP implementation. It is related to the topic: management by results or management by means.
PLM is not expensive compared to other enterprise systems. It can become expensive (like ERP implementations) if you lose control. Software vendors have a business in selling software modules, like car resellers have a business in selling you all the comfort beyond the basics.
The same goes for implementation partners: they have a business in selling services to your company, and they need to find the balance between making money and delivering explainable value. Squeezing your implementation partner will cause a poor delivery. But giving them an open check means that, at a certain moment, someone will stand up and shut down the money drain as the results are no longer justifiable. I often meet companies in this stage; the spirit has gone. It is all about the balance between costs and benefits.
This happens in all enterprise software projects, and the only cure is investing in your own people. Give your employees the time and priority to work on a PLM project. People with knowledge of the business are essential, and you need IT resources to implement. Do not make the mistake of leaving the business uncommitted to the PLM implementation. Management and middle management often do not take the time to understand PLM, as they are too busy or not educated / interested.
Make business owners accountable for the PLM implementation. You will see stress (it is not their daily job – they are busy), but in the longer term you will see understanding and readiness of the organization to achieve the expected results.
We are the largest – why select the largest ?
When your assignment is to select a new enterprise system, life could be easy for you. Select a product or service from the largest vendor and your career is saved. Nobody gets blamed for selecting the largest vendor, although if you work for a small or mid-sized company, you might think twice.
Many vendors and implementers start their message with:
“…. Market leader in ABC, thought leader in XYZ, recognized by 123”
The only thing you should learn from this message is that this company has probably delivered a trustworthy solution in the past. Looking at the past, you get an impression of its readiness and robustness for the future. Many promising companies have been absorbed by the larger ones and disappeared. As Clayton Christensen wrote in The Innovator's Dilemma:
“What goes up does not go down”.
Meaning these large companies focus on their largest clients and will focus less on the base of the business pyramid (where the majority is), making them vulnerable to disruptive innovation.
Related to this issue there is an interesting post (and its comments), written by Oleg Shilovitsky recently: How many PLM vendors disappear in disruption predicted by Gartner.
Still, when selecting a PLM vendor, it is essential to know if they have the scale to support you in the future and the vision to guide you into it.
The future of PLM is towards managing data in a connected manner, not necessarily coming from a single database, not necessarily using only structured data. If your PLM vendor or implementer is pushing you to realize document and file management, they are years too late and not the best choice for your future.
PLM is a big elephant
PLM is considered a big elephant, and I agree, if you address in one shot everything that PLM can do. PLM has multiple directions to start from – I wrote about it: PLM at risk – it does not have a single job
PLM has a huge advantage compared to a transactional system like ERP and probably CRM: you can implement a PLM infrastructure and its functionality step by step in the organization, starting with areas that are essential and produce clear benefits. That is the main reason PLM implementations can take 2 – 3 years. You give the organization time to learn, to adapt and to extend.
We lose our flexibility ?
Nobody in an organization likes to be pushed into a corporate way of working, which by definition is not as enjoyable and as flexible as the way you currently work. It is still an area where PLM implementations can improve: provide the user with an environment that is not too rigid and does not feel like one. You see this problem with old, traditional, large PLM implementations, for example with automotive OEMs. For them, it is almost impossible to switch to a new PLM implementation, as everything has been built and connected in such a proprietary way that moving to more standard systems and technologies is nearly impossible. Later PLM implementations should learn from these lessons.
PLM vendor A says PLM vendor B will be out of business
One of the things I personally dislike is FUD (Fear, Uncertainty and Doubt). It has become a common practice in politics and I have seen PLM vendors and implementers using the same tactics. The problem with FUD is that it works. Even if the message is not verifiable, the company looking for a PLM system might think there must be some truth in this statement.
My recommendation to a company that encounters FUD during a PLM selection process: be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain why they are the perfect solution; instead they indirectly tell you they are the least bad.
Is the future in the cloud ?
I think there are two different worlds. There is the world of smaller businesses that do not want to invest in an IT infrastructure and will try anything that looks promising – often tools-oriented. This is one of my generalizations of how US businesses work – sorry for that. They will start working with cloud-based systems and not be scared off by performance, scalability or security concerns, as long as everything is easy and does not disturb the business too much.
Larger organizations, especially those domiciled in Europe, are not embracing cloud solutions at this moment. They think more in terms of private or on-premise environments, less in cloud solutions, as security of information is still an issue. The NSA revelations prove that there is no moral limit for information in the name of security. Combined with the fear of IP theft from Asia, I think European companies have a natural resistance to storing data outside of their control.
For sure you will see cloud advocates, primarily coming from the US, claiming this is the future (and they are right), but there is still work to do and confidence to be built.
PLM selection often focuses on checking hundreds of requirements coming from different departments. They want a dream system. I hope this post has convinced you that there are many other considerations relevant to a PLM selection that you should take into account. And yes, you still need requirements (and a vision).
Your thoughts ?
Who does not remember this tagline from the first official Soap series starting in 1977 and released in the Netherlands in 1979?
Every week the Campbells and the Tates entertained us with all the ingredients of a real soap: murder, infidelity, alien abduction, criminality, homosexuality and more.
The episode always ended with a set of questions, leaving you in suspense for a week, hoping the next episode would give you the answers.
For those who do not remember the series or those who never saw it because they were too young, this was the mother of all Soaps.
What has it to do with PLM?
Soap has to do with strange people that do weird things (I do not want to be more specific). Recently I noticed that this is happening even in the PLM blogger’s world. Two of my favorite blogs demonstrated something of this weird behavior.
First Steve Ammann in his Zero Wait-State blog post: A PLM junkie at sea point-solutions versus comprehensive mentioned sailing from Ventura CA to Cabo San Lucas, Mexico on a 35 foot sailboat and started thinking about PLM during his night shift. My favorite quote:
Besides dealing with a couple of visits from Mexican coast guard patrol boats hunting for suspected drug runners, I had time alone to think about my work in the PLM industry and specifically how people make decisions about what type of software system or systems they choose for managing product development information. Yes only a PLM “junkie” would think about PLM on a sailing trip and maybe this is why the Mexican coast guard was suspicious.
Second Oleg in his doomsday blog post: The End of PLM Communism, was thinking about PLM all the weekend. My favorite quote:
I’ve been thinking about PLM implementations over the weekend and some perspective on PLM concepts. In addition to that, I had some healthy debates over the weekend with my friends online about ideas of centralization and decentralization. All together made me think about potential roots and future paths in PLM projects.
It demonstrates that the best thinking is done during out-of-office time and in casual locations. From my long weekend cycling tours, I know it is true.
I must confess that I have PLM thoughts during cycling.
Perhaps the best thinking happens outside an office?
I leave the follow up on this observation to my favorite Dutch psychologist Diederik Stapel, who apparently is out of office too.
Both posts touch the topic of a single comprehensive solution versus best-of-breed solutions. Steve is very clear in his post. He believes that in the long term a single comprehensive solution serves companies better, although user performance (usability) is still an issue to consider. He provides guidance in making the decision for either a point solution or an integrated solution.
And I am aligned with what Steve is proposing.
Oleg comes from a different background, and in his current position he believes more in a distributed or network approach. He looks at PLM vendors/implementations and their centralized approach through the eyes of someone who knows the former Soviet Union way of thinking: “Centralize and control”.
The association with communism was probably not the best choice, as you can read in the comments. This association makes you think: as the former Soviet Union does not exist anymore, what about former PLM implementations and the future? According to Oleg, PLM implementations should be more focused on distributed systems (on the cloud ?), working and interacting together, connecting data and processes.
And I am aligned with what Oleg is proposing.
Confused? You won't be after reading my recent experience.
I have been involved in the discussion around the best possible solution for an EPC (Engineering Procurement Construction) contractor in the Oil & Gas industry. The characteristics of their business are different from standard manufacturing companies. EPC contractors provide services for an owner/operator of a plant, and they are selected because of their knowledge, their price, their price, their price, quality and time to deliver.
This means an EPC contractor focuses on execution, making sure they have the best tools for each discipline, and this is the way they are organized and used to working. The downside of this approach is that everyone is working on their own island and there is no knowledge capitalization or sharing of information. The result: each solution is unique, which brings a higher risk of errors and fixes required during construction. And the knowledge is in the heads of experienced people ….. and they retire at a certain moment.
So this EPC contractor wanted to build an integrated system, where all disciplines are connected and share information where relevant. In the Oil & Gas industry, ISO 15926 is the standard. This standard is relatively mature and can serve as the neutral exchange standard for information between disciplines. The ideal world of best-in-class tools communicating with each other – or not ?
Imagine there are 6 discipline tools, an engineering environment optimized for plant engineering, a project management environment, an execution environment connecting suppliers and materials, a delivery environment assuring the content of a project is delivered in the right stages and finally a knowledge environment, capitalizing lessons learned, standards and best practices.
This results in 6 tools and 12 interfaces to a common service bus connecting these tools: 12 interfaces because information needs to be sent to and received from the service bus per application. Each tool will have redundant data for its own execution.
What happens if a PLM provider could offer three of these tools on a common platform? This would result in 4 tools to install and only 8 interfaces. The functionality in the common PLM system does not require data redundancy but shares common information, and therefore will provide better performance in a cross-discipline scenario.
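The interface arithmetic of the two scenarios can be made explicit. A minimal sketch, assuming (as described) that each tool needs exactly one send and one receive interface to the service bus, and that a shared platform counts as a single tool on the bus:

```python
def bus_interfaces(tool_count):
    # One send interface plus one receive interface per tool on the bus.
    return 2 * tool_count

# Baseline: 6 best-of-breed tools, each connected to the service bus.
baseline = bus_interfaces(6)
assert baseline == 12

# Three tools consolidated onto one common PLM platform: the platform
# appears on the bus as a single tool, so 6 - 3 + 1 = 4 tools remain.
consolidated_tools = 6 - 3 + 1
consolidated = bus_interfaces(consolidated_tools)
assert consolidated == 8
```

The design point: consolidation reduces cost linearly in the number of tools taken off the bus, and every removed interface also removes a source of redundant data.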
In the ultimate world, all tools would be on one platform, providing the best performance and support for this EPC contractor. However, this is utopia. It is almost impossible to have a 100 % optimized system for a group of independent companies working together. Suppliers will not give up their environment and own IP to embed it in a customer's ideal environment. So there is always a compromise to find between the best integrated platform (optimal performance – reduced cost of interfaces and cost of ownership) and the best connected environment (tools connected through open standards).
And this is why both Steve and Oleg have a viewpoint that makes sense. Depending on the performance of the tools and the interaction with the supplier network, the PLM platform can provide the majority of the functionality. If you are a market-dominating OEM, you might even reach 100 % coverage for your own purposes, although modern society is more about connecting information where possible.
MY CONCLUSION after reading both posts:
- Oleg tries to provoke, and like a soap, you might end up confused after each episode.
- Steve in his post gives common-sense guidance, useful if you spend time digesting it; not a soap.
Now I hope you are no longer confused, and I wish you all a successful and meaningful 2013. The PLM soap will continue, in alphabetical order:
- Will Aras survive 21-12-2012 and support the Next generation ?
- Will Autodesk get off the cloud or have a coming out ?
- Will Dassault get more Experienced ?
- Will Oracle PLM customers understand it is not a database ?
- Will PTC get out of the CAD jail and receive $ 200 ?
- Will SAP PLM be really 3D and user friendly ?
- Will Siemens PLM become a DIN or ISO standard ?