In my previous post describing the various facets of the EBOM, I mentioned classification several times as an important topic related to the PLM data model. Classification is crucial to help people reuse information, and in addition there are business processes that are only relevant for a particular class of information, so classification is not only about search/reuse support.
In 2008, I wrote a post about classification, which you can read here. Meanwhile, the world has moved on, and I believe more modern classification methods exist.
Why classification?
First of all, classification is used to structure information and to support retrieval of that information at a later moment, either for reuse or for reference later in the product lifecycle. Related to reuse, companies can save significant money when parts are reused. It is not only design time or sourcing time that is reduced. Additional benefits are a lower risk of errors (fewer late discoveries), reduced process and approval time (human overhead), reduced stock (if applicable), more volume discount (if applicable) and reduced End-Of-Life handling.
An interesting discussion about reuse, started by Joe Barkai, can also be found on LinkedIn here, including the comments.
Classification can also be used to control access to certain information (mainly document classification), or to make sure certain processes are followed, e.g. export control, hazardous materials, budget approvals, etc. Although I will mainly talk about part classification in this post, classification can be used for any type of information in the PLM data model.
Classification standards
Depending on the industry you are working in, there are various classification standards for parts. When I worked in the German-speaking countries (the DACH countries), the most discussed classification at that time was DIN 4000 (Sachmerkmal-Liste), a must-have standard for many of the small and medium-sized manufacturing companies. The DIN 4000 standard had a predefined part hierarchy and described the necessary properties (Sachmerkmale) per class. I did not come across a similar standard in other countries at that time.
Another very generic classification I have seen is the UNSPSC standard, again a hierarchical classification supporting everything in the universe, but without a definition of attributes.
Other classification standards, like ISO 13399, RosettaNet, ISO 15926 and IFC, exist to support collaboration and/or the supply chain, when you want to exchange data with other disciplines or partners. The advantage of a standard definition (with attributes) is that you can exchange data with less human processing (saving labor costs and time – the benefit of a digital enterprise).
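To make this concrete, below is a minimal sketch of why a class reference plus defined attributes enables exchange with less human processing. The class and attribute names are purely illustrative and not taken from any of the standards mentioned above.

```python
import json

# A hypothetical part record: the class reference tells the receiver
# which attribute set to expect, so the payload can be processed
# by software instead of by a human interpreting a drawing or PDF.
part = {
    "part_number": "PN-100234",
    "classification": {
        "standard": "ISO 13399",        # illustrative reference only
        "class": "milling_cutter",      # hypothetical class name
        "attributes": {
            "cutting_diameter_mm": 12.0,
            "number_of_teeth": 4,
        },
    },
}

payload = json.dumps(part, indent=2)

# The receiving system can validate the attributes it needs directly.
received = json.loads(payload)
assert "cutting_diameter_mm" in received["classification"]["attributes"]
print(payload)
```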
I will not go deeper into the various standards here, as I am not the expert on all of them. Every industry has its own classification standards: a hierarchical standard and, if more advanced, a hierarchy also supported by attributes related to each class. But let's go into the data model part.
Classification and data model
The first lesson I learned when implementing PLM was that you should not build your classification hard-coded into the PLM data model. When working with SmarTeam, it was very easy to define part classes and attributes to inherit. Some customers had more than 300 classes represented in their data model just for parts. You can imagine that it looks nice in a demo. However, when it comes to reality, a hard-coded classification becomes a pain in the data model.
1 – First of all, classification should be dynamic and easy to extend.
2 – The second problem with a hard-coded classification is that once a part is defined for the first time, the information object has a fixed class. Later changes need a lot of work (relinking of information / approval processes for the new information).
3 – Finally, the third point against a hard-coded classification is that parts will likely be classified according to different classifications at the same time. The image below shows such a multiple classification.
So the best approach is to have a generic part definition in your data model and perhaps a few subtypes. Companies still tend to differentiate between hardware (mechanical/electrical) parts and software parts.
Next, a part should be assigned to at least one class, and the assignment to this class brings more attributes to the part. Most of the PLM systems that support classification have the ability to navigate through a class hierarchy and find similar parts.
When parts are relevant for ERP, they might belong to a manufacturing parts class, which adds the particular attributes required for a smooth PLM – ERP link. Manufacturing part types can be used as templates, to be completed for ERP.
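To illustrate the difference with a hard-coded classification, here is a minimal sketch, assuming a simple Python data model: a generic part that can be assigned to one or more classes, where each assignment brings the attributes of that class. All class and attribute names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ClassNode:
    """A node in a classification hierarchy, defining the attributes
    that apply to members of this class."""
    name: str
    attribute_names: list[str]

@dataclass
class Part:
    """A generic part: classification is assigned, not hard-coded."""
    part_number: str
    description: str
    # one part can carry several classifications at the same time
    classifications: dict[str, dict[str, object]] = field(default_factory=dict)

    def classify(self, node: ClassNode, **attribute_values):
        """Assign the part to a class; the class brings its attributes."""
        unknown = set(attribute_values) - set(node.attribute_names)
        if unknown:
            raise ValueError(f"not defined for class {node.name}: {unknown}")
        self.classifications[node.name] = attribute_values

# Hypothetical classes: a functional class and a manufacturing class
bearing = ClassNode("bearing", ["bore_mm", "outer_diameter_mm"])
mfg_part = ClassNode("manufacturing_part", ["make_or_buy", "preferred_supplier"])

part = Part("PN-100234", "Deep groove ball bearing")
part.classify(bearing, bore_mm=20, outer_diameter_mm=47)
# later, when the part becomes relevant for ERP:
part.classify(mfg_part, make_or_buy="buy", preferred_supplier="ACME")
print(part.classifications)
```

The design choice is that the part itself stays generic: reclassifying it, or adding a second classification later, does not require a new information object or relinking of data.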
This concept is also shared by Ed Lopategui, as he commented on my earlier post about EBOM part types. Ed states:
Think part of the challenge moving forward is we’ve always handled these as parts under different methodologies, which requires specific data structures for each, etc. The next gen take on all this needs to be more malleable perhaps. So there are just parts. Be they service or make/buy or some combination – say a long lead functional standard part and they would acquire the properties, synchronizations, and behaviors accordingly. People have trouble picking the right bucket, and sometimes the buckets change. Let the infrastructure do the work. That would help the burden of multiple transitions, where CAD BOM to EBOM to MBOM to SBOM eventually ends up in a chain of confusion.
I fully agree with his statement and consider this the future trend of modern PLM: shared data that will be enriched by different usage through the lifecycle.
Why don’t we classify all data in PLM?
There are two challenges for classification in general.
- The first one is that the value of classification only becomes visible in the long term. I have seen several young companies that were focusing only on engineering: no metadata in the file properties, no part-centric data management structure, and several years later they face a lack of visibility of what has been done in the past. Only if one of the engineers remembers a similar situation is there a chance of reuse.
- The second challenge is that, through a merger or acquisition, the company suddenly has to manage two classifications. If the data model was clean (no hard-coded subclasses), there is hope to merge the information. Otherwise, it might become a painful activity to discover similarities.
SO THINK AHEAD, EVEN IF YOU DO NOT SEE THE NEED NOW!
Modern search based applications
There are ways to improve classification and reuse by using search-based applications, which can index archives and try to find similarity in properties/attributes. Again, if the engineers never filled in the properties in the CAD model, there is little to nothing to recover, as I experienced in a customer situation. My PLM peer in the US, Dick Bourke, wrote several articles about search-based applications and classification for engineering.com, which are interesting to read if you want to learn more: Useful Search Applications for Finding Engineering Data.
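As a simple illustration of what such an application does under the hood, the sketch below ranks indexed parts by how many attributes they share with a query part. All part data is made up; real search-based applications add text mining, shape search and more – and, as said, they only work if the metadata was filled in the first place.

```python
def similarity(candidate: dict, reference: dict) -> float:
    """Fraction of the reference attributes that the candidate matches."""
    if not reference:
        return 0.0
    matches = sum(1 for k, v in reference.items() if candidate.get(k) == v)
    return matches / len(reference)

# A hypothetical index built from CAD file properties / part metadata
index = [
    {"part": "PN-1", "material": "steel", "thread": "M8", "length_mm": 40},
    {"part": "PN-2", "material": "steel", "thread": "M8", "length_mm": 45},
    {"part": "PN-3", "material": "brass", "thread": "M6", "length_mm": 40},
]

# Before designing a new fastener, look for reuse candidates first
query = {"material": "steel", "thread": "M8", "length_mm": 45}
ranked = sorted(index, key=lambda rec: similarity(rec, query), reverse=True)
for rec in ranked:
    print(rec["part"], round(similarity(rec, query), 2))
```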
So much to discuss on this topic; however, I have reached my 1000 words again.
Conclusion
Classification brings benefits for the reuse and discovery of information, although the benefits are long-term. Think long-term too when you define classifications. Keep the data model simple and add attribute groups to parts based on functional classifications. This enables a data-driven PLM implementation where the power is in the attributes, no longer in the part number. In the future, search-based applications will offer a quick start to classify and structure data.
Two weeks ago I got this message from WordPress, reminding me that I started blogging about PLM on May 22nd, 2008. In some of my spare time during weekends, I began to read my old posts again and started to fix links that had disappeared.
Initially, when I started blogging, I wanted to educate mid-market companies about PLM. A sentence with a lot of ambiguities: how do you define the mid-market and how do you define PLM are already a good start for a boring discussion. And as I do not want to go into that discussion, here are my “definitions”:
Warning: This is a long post, full of generalizations and a conclusion.
PLM and Mid-market
Mid-market companies can be characterized as having a low level of staffing for IT and strategic thinking. Mid-market companies are doers, and most of the time they are good in their domain based on their IP and their flexibility to deliver it to their customer base. I did not meet mid-market companies with a business vision of 5 years and beyond. Mid-market companies buy systems. They bought an ERP system 25-30 years ago (the biggest trauma at that time). They renewed their ERP system for the Y2K problem/fear, and they switched from the drawing board to a 2D CAD system. Later they bought a 3D CAD system, introducing the need for a PDM system to manage all the data.
PLM is for me a vision, a business approach supported by an IT infrastructure that allows companies to share, discover and connect product-related information through the whole lifecycle. PLM enables companies to react earlier and better in the go-to-market process. Better, by involving customer input and experience from the start in the concept and design phases. Earlier, thanks to sharing and involving other disciplines/suppliers before crucial decisions are made, reducing the number of iterations and the high costs of late changes.
Seven years ago I believed that a packaged solution, combined with a pre-configured environment and standard processes, would be the answer for mid-market companies. PLM vendors currently have the same thought with cloud-based solutions: take it, use it as it is and enjoy.
Here I have changed my opinion over the past seven years. Mid-market companies consider PLM a more complex extension of PDM and still consider ERP (and what comes with that system) as the primary system in the enterprise. PLM in mid-market companies is often seen as an engineering tool.
LESSON 1 for me:
The benefits of PLM are not well-understood by the mid-market
To read more:
PLM for the mid-market – mission impossible?
PLM for the SMB – a process or culture change ?
Culture change in a mid-sized company – a management responsibility
Mid-market PLM – what did I learn in 2009 ?
Implementing PLM is a change not a tool
Who decides for PLM in a mid-market company ?
More on: Who decides for PLM in a mid-market company ?
Globalization and Education
In the past seven years, globalization became an important factor for all types of companies. Companies started offshoring labor-intensive work to low-labor-cost countries, introducing the need to share product data outside their local and controlled premises. In addition, acquisitions by larger enterprises and by some of the dominant mid-market companies introduced a new round of rethinking. Acquisitions raised discussions about: what are the real best practices for our organization? How can we remain flexible, while adapting and converging our business processes to be future-ready?
Here I saw two major trends in the mid-market:
Lack of (PLM) Education
To understand and implement the value of PLM, you need skills and an understanding of more than just a vendor-specific PLM system. You need to understand the basics of change processes (Engineering Change Request, Engineering Change Order, Manufacturing Change Order and more). And you need to understand the characteristics of a CAD document structure, a (multidisciplinary) EBOM, the MBOM (generic and/or plant-specific) and the related Bill of Processes. This education does not exist in many countries, and people are (mis-)guided by their PLM/ERP vendor, explaining why their system is the only system that can do the job.
Interestingly enough, the most-read posts on my blog are about the MBOM and the ETO, BTO and CTO processes. This illustrates that there is a need for a proper, vendor-independent and globally accepted terminology for PLM.
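For readers new to this terminology, here is a deliberately simplified sketch of a common ECR/ECO flow. Real change processes differ per company and per vendor (change boards, MCOs, effectivity and more), so treat the states and transitions below as illustrative only.

```python
from enum import Enum, auto

class ChangeState(Enum):
    ECR_RAISED = auto()       # someone reports a problem or improvement
    ECR_ANALYZED = auto()     # impact and cost of the change are analyzed
    ECO_APPROVED = auto()     # the change order is approved for execution
    ECO_IMPLEMENTED = auto()  # affected items are revised and released
    REJECTED = auto()

# Allowed transitions in this simplified flow
TRANSITIONS = {
    ChangeState.ECR_RAISED: {ChangeState.ECR_ANALYZED, ChangeState.REJECTED},
    ChangeState.ECR_ANALYZED: {ChangeState.ECO_APPROVED, ChangeState.REJECTED},
    ChangeState.ECO_APPROVED: {ChangeState.ECO_IMPLEMENTED},
}

def advance(current: ChangeState, new: ChangeState) -> ChangeState:
    """Move a change to its next state, refusing illegal shortcuts."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"cannot go from {current.name} to {new.name}")
    return new

state = ChangeState.ECR_RAISED
state = advance(state, ChangeState.ECR_ANALYZED)
state = advance(state, ChangeState.ECO_APPROVED)
print(state.name)
```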
Some educational posts:
Bill of Materials for Dummies – ETO ranked #1
ECR/ECO for Dummies ranked #2
BOM for Dummies – CTO ranked #4
BOM for Dummies: BOM and CAD ranked #7
BOM for Dummies – BTO
Where does PLM start beyond document management ?
The dominance of ERP
As ERP systems were introduced long before PLM (and PDM), these systems are often considered by the management of a mid-market company as the core. All the other tools should preferably be seen as an extension of ERP and, if possible, let's implement the ERP vendor's functionality to support PLM – the Swiss army knife approach – one tool for everything. This approach is understandable, as at the board level there are no PLM discussions. Companies want to keep their “Let's do it” spirit and not reshuffle or reorganize their company according to modern insights on sharing. Strangely enough, in many businesses you see the initiative to standardize on a single ERP system first, instead of standardizing on a single PLM approach. PLM can bring the global benefits of product portfolio management and IP sharing, where ERP is much more about local execution.
LESSON 2:
PLM is not understood at the board level and is still considered a tool
Some posts related to PLM and ERP
Where is the MBOM ? ranked #3
Connecting PLM and ERP (post 1) – (post 2) – (post 3) ranked #8
PLM and ERP – the culture change
5 reasons not to implement PLM – Reason #3 We already have an ERP system
The human factor
A lot of the reasons why PLM struggles to become successful have to do with its broad scope. PLM has an unclear definition and, most importantly, PLM forces people to share data and work outside their comfort zones. Nobody likes to share by default. Sharing makes day-to-day life more complicated; sharing might create visibility on what you actually contribute or fix. In many of my posts, I described these issues from various viewpoints: the human brain, the Innovator's Dilemma, the way the older generation (my generation) was raised and used to work. Combined with the fact that many initial PLM/PDM implementations have created so many legacies, the need to change has become a risk. In the discussion and selection of PLM, I have seen many times that in the end a company decides to keep the old status quo (with new tools) instead of really having the guts to move toward the future. Often this was a result of investors not understanding (and not willing to see) the long-term benefits of PLM.
LESSON 3:
PLM requires a long-term vision and understanding, which most of the time does not fit current executive understanding (lack of education/time to educate) and priorities (shareholders)
Many recent posts are about the human factor:
The Innovator´s dilemma and PLM
Our brain blocks PLM acceptance
How to get users excited or more committed to a new PLM system?
The digital transformation
The final and most significant upcoming change is the fact that we are entering a completely new era: from linear and predictable towards fast and iterative, meaning that the classical ways we push products to the market will become obsolete. The traditional approach was based on lessons learned from mechanical products after the Second World War. Now, through globalization and the importance of embedded software in our products, companies need to deliver and adapt products faster than the classical delivery process allows, as their customers have higher expectations and a much larger range to choose from. The result of this global competitiveness is that companies will change from delivering products towards a more and more customer-related business model (continuous upgrades/services). This requires companies to revisit their business and organization, which will be extremely difficult. Business and human change require new IT concepts – platforms? cloud services? big data?
Older enterprises, both mid-market and large, will be extremely challenged to make this change in the upcoming 10 years. It will be a matter of survival, and I believe the Innovator's Dilemma applies here the most.
LESSON 4:
The digital transformation is apparent as a trend for young companies and strategic consultants. This message is not yet understood at the board level of many businesses.
Some recent posts related to this fast-upcoming trend:
From a linear world to fast and circular ?
Did you notice PLM is changing?
Documents or Intelligent Data ?
The difference between files and data-oriented – a tutorial (part 1) – (part 2) – (part 3)
PLM and/or SLM? – (part 1) – (part 2)
Breaking down the silos with data
ROI (Return On Investment)
I also wrote about ROI – a difficult topic to address, as in most discussions related to ROI, companies are talking about the costs of the implementation, not about the tremendously larger impact a new business approach or model can have once enabled through PLM. Most PLM ROI discussions are related to efficiency and quality gains, which are significant and relevant. However, these benefits are relatively small and not comparable with the ability to change your business (model) to become more customer-centric and stay in business.
Some of the ROI posts:
To PLM or Not to PLM – measuring the planning phase ranked #5
Free PLM Software does not help companies ranked #6
PLM selection–additional thoughts
PLM Selection: Proof Of Concept observations
Where is my PLM Return On Investment (ROI) ?
Conclusion
A (too) long post this time; however, perhaps a good post to mark 7 years of blogging and to use as a reference for the topics I briefly touched on here. PLM has many aspects. You can do further reading through the links.
From the statistics it is clear that the education part scores best – see the rankings. For future posts, let me know in a comment what you are looking for in this blog: PLM Mid-Market, Education, PLM and ERP, Business Change, ROI, Digitalization, or …??
I also have to remain customer-centric – thanks for reading and providing your feedback.
Above Image courtesy of the marketoonist.com – Tom Fishburne
Image related to digital transformation: The Economist – the onrushing wave
Three weeks ago there was the Product Innovation conference in Düsseldorf. In my earlier post (here) I described what I experienced during this event. Now, after all the information has somehow been digested, here is a more high-level post, describing the visible change in business and how it relates to PLM – trying to describe this change not in academic wording but in images. Therefore, I captured the upcoming change in the title: from linear to circular and fast.
Let me explain this image step by step
In the middle of the previous century, we were thinking linearly, in education and in business. Everything had a predictable path, and manufacturing companies were pushing their products to the market. First locally, later in time more globally. Still, the delivery process was pretty linear:
This linear approach is reflected in how organizations are structured and how they are aligned to the different steps of the product development and manufacturing process. Below is a slide I used at the end of the nineties to describe the situation and the pain: a lack of visibility of what happens overall.
It is discouraging to see that this situation still exists in many companies.
At the end of the nineties, early 2000s, PLM was introduced, conceptually managing the whole lifecycle. In reality, it was mainly a tighter connection between design and manufacturing preparation, pushing data into ERP. The main purpose was managing the collaboration between different design disciplines and dispersed teams.
Jim Brown (Tech-Clarity) wrote a white paper at that time, which is still valid for many businesses, describing the complementary roles of PLM and ERP. See the picture below:
Jim introduced the circle and the arrow. PLM: a circle with iterations, interacting with ERP: the arrow for execution. Here it already became visually clear that an arrow does not have the same behavior as a circle. The 100% linearity in business was gone.
Let's have a closer look at the PLM circle.
This is how PLM is deployed in most organizations:
Due to the implementation of siloed systems for PDM, ERP, SCM and more, the flow of information is disconnected when moving from the design domain to the execution domain.
Information is pushed into the ERP system as disconnected information, no longer managed and connected to its design intent.
Next, the ERP system is most of the time not well-equipped for managing after-sales and services content. Another disconnect comes up.
Yes, spare parts can be ordered through ERP, but issues appearing at the customer base are not stored in ERP; they are often stored in yet another separate system (if stored beyond email).
The result is that when working in the concept phase, there is no information available for R&D to get a good understanding of how the market or customers work with their product. So how good will the next product be? Check in your company how well your R&D is connected with the field.
And then the change started …
This could have stayed reality for a long time if there were not a huge business change upcoming. The world is becoming digital and connected. As a result, local inefficiencies or regional underperformance will be replaced by better-performing companies. The Darwin principle. And most likely the better-performing companies are coming from the emerging markets, as they do not suffer from the historical processes and “knowledge of the past”. They can step into the digital world much faster.
In parallel with these fast-growing emerging markets, we discovered that we have to reconsider the way we use our natural resources to guarantee a future for the next generations. Instead of wasting resources to deliver our products, there is a need to reuse materials and resources, introducing a new circle: the circular economy.
The circular economy can have an impact on how companies bring products to the market. Instead of buying products (CAPEX), more and more organizations (and modern people) start using products or services in a rental model (OPEX). No capital investment anymore; pay as you go for usage or capacity.
This, however, has an impact on how traditional companies are organized – you need to be connected to your customers or you are out of business – a commodity.
The digital and connected world can have a huge impact on the products or services available in the near future. You are probably familiar with the buzz around “The Internet of Things” or “Smart and Connected”.
No longer do products depend on mechanical behavior only; more and more products rely on electrical components with adaptive behavior through software. Devices that connect with their environment report information back to the manufacturer. This allows companies to understand what happens with their products in the field and how to react to that.
Remember the first PLM circle?
Now we can create continuity of data!
Combine the circular economy with the digital and connected world, and you will discover everything can go much faster. A crucial inhibitor is how companies can reorganize themselves around this faster-changing, circular approach. Companies need to understand and react to market trends in the fastest and most adequate way. The future will probably be about lower volumes of the same products, higher variability towards the market and, most likely, more and more combining of products with services (the experience model). This requires a flexible organization and most likely a new business model, which will differ from the sequential, hierarchical organizations that we know at this moment.
The future business model?
The flexibility in products and services will more and more come from embedded software or be supported by software services. Software services will be more and more cloud-based, to avoid IT complexity and provide scalability.
Software development and its integration with products and services is already a challenge for classical mechanical companies. They are struggling to transform their mechanically oriented design process towards support for software. In the long term, the software design process could become the primary process, which would mean a change from (sequential – streamlined) lean towards (iterative – SCRUM) agile.
Once again, we see the linear process becoming challenged by the circular iterations.
This might be the end of lean organizations, potentially having to mix with agile concepts.
Whether it was a coincidence or not, I cannot judge; however, during the PI Conference I learned about W.L. Gore & Associates, with their unique business model supporting this more dynamic future. No need for a massive organizational re-org to align the business, as the business is aligning itself all the time through its employees.
Last weekend, I discovered Semco Partners in the newspaper, and I am sure there are more companies organizing themselves to become reactive instead of linear – for sure in the high-tech world.
Conclusion:
Linearity is disappearing in business; it is all about reactive, multidisciplinary teams within organizations supporting customers and their fast-changing demands.
Fast reactions need new business organization models (flexible, non-hierarchical) and new IT support models (business information platforms – no longer PLM/ERP system thinking).
What do you think? The end of linear?
I have talked enough about platforms recently. Still if you want to read more about it:
Cimdata: Business strategy and platformization position paper
Engineering.com: Prod. Innovation Platform PlugnPlay in next generation PLM
Gartner: Product Innovation Platforms
VirtualDutchman: Platform, Backbone, Service Bus or BI
This is the fifth year that Marketkey organized their vendor-independent conference in Europe around Product Innovation, where PLM is the major cornerstone. Approximately 100 companies attended this conference, coming from various industries. As there were most of the time two to four parallel tracks (program here), it will still take time for me to digest all the content. However, here is a first impression and a comparison with what has changed since the PI Conference in 2014 – you can read my review of that conference here.
First of all, the keynote speeches for this conference were excellent and a good foundation for attendees to discuss and open their minds. Secondly, I felt that this conference was actually dealing with the imminent shift from classic, centralized businesses towards a data-centric approach, connecting information coming from anyone/anything connected. Naturally, the Internet of Everything (IoE) and the Internet of Things (IoT) were part of the discussion, combined with changing business models: moving from delivering products toward offering services (CAPEX versus OPEX).
Some of the highlights here:
The first keynote speaker was Carlo Ratti, Director of the MIT Senseable City Lab. He illustrated through various experiments and examples how, being connected through devices, we can change and improve our world: tagging waste, mobile phone activity in a city and the Copenhagen Wheel. His main conclusion (not a surprise): for innovation there is a need to change collaboration. Instead of staying within company/discipline boundaries, solving problems through collaboration between different disciplines will lead to different thinking. How is your company dealing with innovation?
The second session I attended was by John Housego from W.L. Gore & Associates, who explained the company's model for continuous growth and innovation. The company's future is not based on management but on the leadership of people working in teams in a flat organization. Every employee is an associate, directly involved and challenged to define the company's future. Have a read about the company's background here on Wikipedia.
Although the company is 50 years old, I realized that their cultural model is a perfect match for the future of many businesses. More and more companies need to be lean and flexible and support direct contact between the field, customers, the market and experts inside the company. Implementing a modern PLM platform should be “a piece of cake” if the technology exists, as W.L. Gore's associates will not block the change if they understand the value. No silos to break down.
My presentation, “The Challenge of PLM Upgrades as We See the Rules of Business Change”, was based around two themes (perpetual software? / seamless upgrades?) and from there looked towards the future and what to expect in business. When we look back, we see that every 10 years there is a major technology change, which makes the past incompatible to upgrade. Now we are dreaming that cloud-based solutions are the future guarantee for seamless upgrades (let's wait 10 years). In my opinion, companies should not consider a PLM upgrade at this moment.
The changes in business models, people's behavior and skills, plus technology change, will enable companies to move towards a data-centric approach. Companies need to break with the past (a linear, mechanical-design-based product development approach) and redesign a platform for the future (a business innovation platform based on the data). In my upcoming blog post(s) I will give more background on this statement.
Trond Zimmerman from Volvo Group Trucks explained the challenges and the solution concept they are currently implementing to answer the challenge of working in a joint venture with Dongfeng Commercial Vehicles. As in a joint venture you want to optimize the sharing of common parts, yet you cannot expect a single PLM solution for the total joint venture. For that reason, Volvo Group Trucks is implementing Share-A-Space from Eurostep as a controlled collaboration layer between the two joint venture partners.
This is, in my opinion, one of the examples of future PLM practices, where data will not be stored in a single monolithic system, but will be connected through information layers and services. The case is similar to what was presented last year at Product Innovation 2014, where Eurostep and Siemens Industrial Turbomachinery implemented a similar layer on top of their PDM environment to enable controlled sharing with their suppliers.
David Rowan from wired.co.uk closed the day with his keynote: Understanding the New Rules of Product Innovation. He somehow touched the same topic as John Housego from W.L. Gore: it is all about democratization. Instead of hierarchy, we are moving to network-based activities, and this approach has a huge impact on businesses. David's message: prepare for constant change. Where in the past we lived in a “linear” century, changing according to Moore's law, we are now entering an exponential century where change goes faster and faster. Besides examples of the Internet of Things, David also gave some examples of the Internet of Stupid Things. He showed a quote from Steve Ballmer stating that nobody would pay $500 for a phone (the Apple iPhone). The risk David took is that, by labeling some of these inventions stupid, he might deliver such a quote for the future himself. I think the challenge is always to stay open-minded without judging, as in the end the market will decide.
PLM and ERP
I spent the evening networking with a lot of people, most of them excited about the future capabilities that had been presented. In parallel, the discussion was also about the conservative behavior of many companies. Topics that have already been under discussion for ten years – how to deal with and connect PLM and ERP, where is the MBOM, what are the roles of PLM and ERP in an organization – are still rewarding topics for a discussion, showing where most companies currently are with their business understanding.
In parallel to a product innovation conference, apparently there is still a need to agree on basic PLM concepts from the previous century.
The second day opened with an excellent keynote speech from Dirk Schlesinger of Cisco. He talked about the Internet of Everything and provided examples of the main components of IoE: connectivity, sensors, platform, analytics and mobility. In particular, the example of connectivity demonstrated the future benefits modern PLM platforms can bring. Dirk talked about a project with Dundee Precious Metals, where everything in the mine was tagged with RFID devices (people, equipment, vehicles and resources) and the whole mine was equipped with Wi-Fi.
Based on this approach, the execution and planning of what happened was done at their HQ through a virtual environment, giving planners immediate visibility of what happens and allowing them to decide on real data. This is exactly the message I have been posting in my recent blog posts.
The most fascinating part was the reported results. This project has been ongoing for 3 years now; in the first year they achieved a production increase of 30%. This year they are aiming for a 400% production increase and a 250% efficiency increase. These are the numbers to imagine when you implement a digital strategy. It is no longer about making our classical processes more efficient; it is about everyone being connected and everyone collaborating.
Marc Halpern from Gartner gave a good presentation connecting the hype of the Internet of Things with the world of PLM again, talking about Product Innovation Platforms. Marc also touched on the (needed) upcoming change in engineering processes. More and more we will develop complex products, which need systems thinking – Systems of Systems – to handle this complexity. As Marc stated: “Product, process, culture is based on electro-mechanical products, where the future trend is all about software.” We should reconsider our Bill of Materials (mechanical) and probably think more about a Bill of Features (software). Much of Marc's presentation contained the same elements as I discussed in my PDT2014 blog post from October last year.
I was happy to see Jenni Ala-Mantila presenting the usage of a PLM system at Skanska Oy. Skanska is one of the largest construction companies operating globally. See one of their beautiful corporate videos here. I have always been an advocate of using PLM practices and a PLM infrastructure to enhance, in particular, data continuity in a business where people work in silos with separate tools. There are so many benefits to gain from having end-to-end visibility of the project and its related data. Jenni's presentation confirmed this.
By implementing a PLM backbone with a focus on project management, supplier collaboration and risk management, she confirmed that PLM has contributed significantly to their Five Zero vision: zero loss-making projects, zero environmental incidents, zero accidents, zero ethical breaches and zero defects. Skanska is really a visionary company, although it was frustrating to learn that there was still a need to build a SharePoint connection to their PLM environment. The data-centric future has not reached everyone in the organization yet.
The last two sessions of the conference – a panel discussion, “Why is Process Innovation Challenging & What Can Be Done About It”, plus the final keynote, “Sourcing Growth Where Growth Takes Place” – had some commonality, which I expressed in some Twitter quotes.
Conclusion
Where last year I had the impression that the PLM world was somehow in a static mode – not so much news in 2014 – it became clear at this 2015 conference that the change towards new business paradigms is really happening, and at a faster pace than expected. From mechanical development processes to software processes, from linear towards continuous change. More to come this year.
A year ago I wrote a blog post questioning whether the construction industry would learn from PLM practices in relation to BIM.
In that post, I described several lessons learned from other industries. Topics like:
- Working on a single, shared repository of online data (the Digital Mock-Up). Continuity of data based on a common data model – not only 3D
- It is a mindset. People need to learn to share data instead of owning it
- Early validation and verification based on a virtual model. Working in the full context
- Planning and anticipation for service and maintenance during the design phase. Design with the whole lifecycle in mind (and being able to verify the design)
The comments on that blog post already demonstrated that the worlds of PLM and BIM are not 100 percent comparable and that there are some serious inhibitors preventing them from coming closer. One year later, let's see where we are:
BIM moving into VDC (or BLM?)
The first trend that is becoming visible is that people in the construction industry are starting to use the term Virtual Design and Construction (VDC) more and more, instead of BIM (Building Information Model or Building Information Management?).
The good news here is that there is less ambiguity with the term VDC than with BIM. Does this mean many BIM managers will change their job title? Probably not, as most construction companies are still in the learning phase of what a digital enterprise means for them.
Still, Virtual Design and Construction focuses a lot on the middle part of the full lifecycle of a construction. VDC does not necessarily connect to the early concept phase and almost completely neglects the operational phase. The last phase is often ignored, as construction companies are not (yet) thinking about Repair & Maintenance contracts (the service economy).
And surprisingly, last week I saw a blog post from Dassault Systemes, where Dassault introduced the term BLM (Building Lifecycle Management). Related to this blog post, some LinkedIn discussions also started. BLM, according to Dassault Systemes, is the combination of BIM and PLM – read the post here.
The challenge for construction companies, however, is to determine which related data sets they require and how to create this continuity of data. This brings us to one of the most important inhibitors.
Data Ownership
Where in other industries a clear product data owner exists, the ownership of data in EPC (Engineering, Procurement, Construction) companies, typical for the construction industry or the oil & gas industry, is most of the time vague on purpose.
First of all, the owner of a construction often does not know which data could be relevant to maintain. And secondly, as soon as the owner asks for more detailed information, he will have to pay for it, raising the costs, which do not directly flow back as benefits – only later, during the FM (Facility Management)/operational stage.
And let's imagine the owner could get all the data required. Then the owner is at risk, as having the information potentially makes you liable for mistakes and claims.
From discussions with construction owners, I learned that their policy is not to aim for the full dataset related to a construction: it reduces the risk of being liable. Imagine if Boeing and Airbus followed this approach. This brings us to another important inhibitor.
A risk-shifting business
The construction industry is in itself still a risk-shifting business, where each party tries to pass the risk and cost of failure to another stakeholder in the pyramid. The most powerful owners/operators in the construction industry quickly push the risk down to their contractors and suppliers, and these companies then distribute the risk further down to their subcontractors.
If you do not accept the risk, you are no longer in the game. This is different from other industries and I have seen this approach in a few situations.
For example, I was dealing with an EPC company that wanted to implement PLM. The company expected that the PLM implementer would take a large part of the risk for the implementation, as they themselves always took the risk for their big customers when applying for a project. Here there was a clash of cultures, as PLM implementers have learned that the risk of a successful PLM implementation is vague, since many soft values define the success. It is not a machine or platform that simply has to work after some time.
Another example was related to requirements management. Here the EPC company wanted to be clear and specific towards their customer. However, their customer reacted very strangely. Instead of being happy that the EPC company invested in more upfront thinking and analysis, the customer got annoyed, as they were not used to being specific so early in the process. They told the EPC company: “If you have so many questions, you probably do not understand the business.”
So everyone in the EPC business is pushed to accept a higher risk and uncertainty than in other industries. However, the big reward is that you are allowed to have a cost of failure above 15-20 percent without feeling bad. With this percentage you would be out of business in other industries. And this brings us to another important inhibitor.
Accepted high cost of failure
As the industry accepts this high cost of failure, companies are not triggered to work differently or to redesign their processes in order to lower the inefficiencies. The UK government mandates BIM Level 2 for their projects starting in 2016 and beyond, to reduce the costs caused by inefficiencies.
But will the UK government invest to facilitate and aim for data ownership? Probably not, as the aim of governments is not to be extremely economical. Not being liable has a bigger value for governments than being more efficient, as I learned. Being more efficient is the message to the outside world to keep the taxpayer satisfied.
It is hard to change this way of thinking. It requires a cultural change through the whole value chain. And cultural change is the “worst” thing that can happen to a company – the biggest inhibitor.
Cultural change
Cultural change is a point that touches all industries; there is no difference between the construction industry and, for example, a classical discrete manufacturing company. Because of global competition and comparable products, other industries have already been forced to work differently in order to survive (and are still challenged).
The cultural change lies in people. We (the older generation) were educated and brought up in classical engineering models that reflect post-Second World War best practices. Being important in a process is your job justification and job guarantee.
New paradigms, based on a digital world instead of a document-shifting world, need to be defined and matured and will make many classical data processing jobs redundant. Read this interesting article from the Economist: The Onrushing Wave
This is a challenge for every company. The highest need to implement this cultural change is, ironically, in those countries with the highest legacy: Western Europe / the United States.
As these countries also have the highest labor costs, the impact of keeping on doing the old stuff will reduce their competitiveness. The impact for construction companies is smaller, as the construction industry is still a local business; in the end, resources will not travel the globe to execute projects.
However, cheaper labor becomes more and more available in every country. If companies want to utilize it, they need to change their processes. They need to shift towards more thinking and knowledge early in the lifecycle, to avoid the need for highly qualified people in the field to fix errors.
Sharing instead of owning
For me, the major purpose of PLM is to provide an infrastructure for people to share information in such a manner that others, not aware of the information, can still easily find and use it in the relevant context of their activities. The value: people will decide based on actual information and no longer be reactive, fixing errors caused by a lack of understanding of the context.
The problem in the construction industry is that I have not seen any vendor focusing on sharing the big picture. Perhaps the BLM discussion will be a first step. For the major tool providers, like Autodesk and Bentley, the business focus is on the continuity of their tools, not on the continuity of data.
Last week I noticed a cloud-based Issue Management solution, delivered by Kubus. Issue management is one of the typical and easy benefits a PLM infrastructure can deliver, in particular if issues can be linked to projects, construction parts, processes and customers. If this solution becomes successful, the extension might be to add more data elements to the cloud solution. The main question will remain: who owns the data?
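To show why those links matter, here is a minimal sketch of an issue record that only becomes valuable through its links to other objects in the PLM infrastructure. The data model and all identifiers are hypothetical, not Kubus's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Issue:
    """An issue only becomes valuable when linked to its context."""
    issue_id: str
    description: str
    # links to objects elsewhere in the PLM infrastructure,
    # e.g. ("project", "P-2015-007") or ("construction_part", "FACADE-12")
    links: list[tuple[str, str]] = field(default_factory=list)

issue = Issue("ISS-041", "Leakage at facade connection")
issue.links.append(("project", "P-2015-007"))
issue.links.append(("construction_part", "FACADE-EAST-12"))
issue.links.append(("customer", "City of Utrecht"))

def issues_for(context: tuple[str, str], issues: list[Issue]) -> list[Issue]:
    """Find all issues linked to a given object, instead of digging
    through email to reconstruct what happened."""
    return [i for i in issues if context in i.links]

print([i.issue_id for i in issues_for(("project", "P-2015-007"), [issue])])
```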
For continuity of data, you need standards and openness – IFC is one of the many standards needed in the full scope of collaboration. Other industries are further developed in their standards, driven by end-user organizations instead of vendors. Companies should argue with their vendors that openness is a right, not a privilege.
Conclusion
A year ago, I was more optimistic about the construction industry adopting PLM practices. What I have learned this year, also based on feedback from others, is that we are not at the turning point yet. Change is difficult to achieve from one day to the next. Meanwhile, the parties in the construction industry's value chain all have different objectives. Nobody will take the risk or can afford the risk.
I remain interested to see where the construction industry is heading.
What do you think: will 2015 be the year of a breakthrough?
Shaping the PLM platform of the Future
In this post, my observations from the PDT Europe 2014 conference, which was hosted in the Microsoft Conference Center in Paris and organized by Eurostep and CIMdata.
It was the first time I attended this event, and I was positively surprised by the audience and the content. Where other PLM conferences often focus more on current business issues, here a smaller audience (130 persons) was looking in more detail at the future of PLM. Themes like PLM platforms, the circular economy, open standards and the longevity of data were presented and discussed.
The emergence of the PLM platform
Peter Bilello from CIMdata kicked off with his presentation: The emergence of the PLM platform. Peter explained that we have to rethink our PLM strategy for two main reasons:
1. The product lifecycle will become more and more circular due to changing business models, and in parallel the changing usage/availability of materials will have an impact on how we design and deliver products
2. The change towards digital platforms at the heart of our economy (the Digital Revolution, which I also wrote about in previous posts) will impact organizations dramatically
Can current processes and tools support today's complexity? And what about tomorrow? According to a CIMdata survey, there is a clear difference in profit and performance between leaders and followers, and the gap is increasing fast. “Can you afford to be a follower?” is a question companies should ask themselves.
Rethinking the PLM platform does not bring the 2-3% efficiency benefit, but can bring benefits of 20% and more.
Peter sees a federated platform as a must for companies to survive. I particularly liked his statement:
The new business platform paradigm is one in which solutions from multiple providers must be seamlessly deployed using a resilient architecture that can withstand rapid changes in business functions and delivery modalities
Industry voices on the Future PLM platform
Auto
Steven Vettermann from ProSTEP talked about PLM in the automotive industry. Steven started by describing the change in the automotive industry, quoting Heraclitus: Τα πάντα ρεί – the only constant is change. Steven described two major changes in the automotive industry:
1. The effect of globalization, technology and laws & ecology
2. The change in the role of IT and the impact of culture & collaboration
An interesting observation is that the preferred automotive market will shift to the BRIC countries. In 2050 more than 50% of the world population (estimated at almost 10 billion people by then) will be living in Asia, and 25% in Africa. Europe and Japan are aging; they will not invest in new cars.
For Steven, it was clear that current automotive companies are not yet organized to support and integrate modern technologies (systems engineering / electrical / software) beyond mechanical designs. Neither are they open to a true global collaboration between all players in the industry. Some of the big automotive companies are still struggling with their rigid PLM implementations. There is a need for open PLM, not driven from a single PLM system but based on a federated environment of information.
Aero
Yves Baudier spoke on behalf of the aerospace industry about the standardization effort of the Strategic Standardization Group around Airbus and some of its strategic suppliers, like Thales, Safran, BAE Systems and more. If you look at the ASD Radar, you get a feeling for the complexity of the standards that exist and are relevant for the Airbus Group.
It is a complex network of evolving standards, all providing (future) benefits in certain domains. Yves talked about through-lifecycle support, which strives for creating data once and reusing it many times during the lifecycle. The conclusion from Yves, like that of all the previous speakers, is that the PLM platform of the future will be federative, and standards will enable PLM interoperability.
Energy and Marine
Shefali Arora from Wärtsilä spoke on behalf of the energy and marine sector and gave an overview of the current trends in their business and the role of PLM at Wärtsilä. With PLM, Wärtsilä wants to capitalize on its knowledge, drive costs down and, above all, improve business agility – as the future is in flexibility. Shefali gave an overview of their PLM roadmap covering the aspects of PDM (with Teamcenter), ERP (SAP) and a PLM backbone (Share-A-space). The PLM backbone provides connectivity of data between all lifecycle stages and external partners (customers/suppliers), based on the PLCS standard. Again, another session demonstrating that the future of PLM is in an open and federated environment.
Intermediate conclusion:
The future PLM platform is a federated platform which adheres to standards and provides openness of interfaces, permitting the platform to be reliable over multiple upgrade cycles and able to integrate third parties (Peter Bilello)
Systems Engineering
In the afternoon, I followed the Systems Engineering track. Peter Bilello gave an overview of model-based systems engineering and illustrated, based on a CIMdata survey, that even though many companies have a systems engineering strategy in place, it is not applied consistently. And indeed, several companies I have been dealing with recently expressed their desire to integrate systems engineering into their overall product development strategy. Often this approach is confused with the belief that requirements management plus product development equals systems engineering. Still a way to go.
Dieter Scheithauer presented his vision that systems engineering should be a part of PLM, and he gave a very decent, academic overview of how it all relates. Important for companies that want to go in that direction: you need to understand what you are aiming at. I liked his comparison of a system product structure and a physical product structure, helping companies to grasp the difference between a virtual, system view and a physical product view:
More Industry voices
Construction industry
The afternoon session started with Christophe Castaing, explaining BIM (Building Information Modeling) and the typical characteristics of the construction industry. Although many construction companies focus on the construction phase, of 100 pieces of information/exchange to be managed during the full lifecycle, only 5 will be managed during the initial design phase (BIM), 20 during the construction phase (BAM) and finally 75 during the operation phase (BOOM). I wrote about PLM and BIM last year: Will 2014 become the year the construction industry will discover PLM?
Christophe presented the themes from the French MINnD project, where the aim is to start from an information model and come to a platform, supporting and integrating the particular civil and construction standards, like IFC and CityGML, but also the PLCS standard (ISO 10303-239).
Consumer Products
Amir Rashid described the need for PLM in the consumer product markets, stating the circular economy as one of the main drivers. Especially in consumer markets, product waste can be extremely high due to the short lifetime of the products, and everything is scrapped to landfill afterward. Interesting quote from Amir: Sustainability's goal is to create possibilities, not to limit options. He illustrated how Xerox has had sustainability as part of its product development since 1984. The diagram below demonstrates how the circular economy can impact all businesses today, when well-orchestrated.
Marc Halpern closed the tracks with his presentation around Product Innovation Platforms, describing how product design and PLM might evolve in the upcoming digital era. Gartner believes that future PLM platforms will provide insight (understand and analyze big data), adaptability (flexible to integrate and maintain through an open, service-oriented architecture), promote reuse (identifying similarity based on metadata and geometry), discovery (the integration of search, analysis and simulation) and finally community (using the social paradigm).
If you look at current PLM systems, most of them are far from this definition, and if you support Gartner's vision, there is still a lot of work for PLM vendors to do.
Interestingly, Marc also identified five significant risks that could delay or prevent companies from implementing this vision:
- inadequate openness (pushing back open collaboration)
- incomplete standards (blocking implementation of openness)
- uncertain cloud performance (the future is in cloud services)
- the steep learning curve (it is a big mind shift for companies)
- Cyber-terrorism (where is your data safe?)
After Marc's session there was an interesting panel discussion with some of the speakers from that day, briefly discussing questions from the audience. As the presentations had been fairly technical, it was logical that the first question that came up was: what about change management?
A topic that could fill the rest of the week, but the PDT dinner was waiting – a good place to network and digest the day.
DAY 2
Day 2 started with two interesting topics. The first presentation was a joint presentation by Max Fouache (IBM) and Jean-Bernard Hentz (Airbus – CAD/CAM/PDM R&T and IT Backbones). The topic was the obsolescence of information systems: hardware and PLM applications. In the aerospace industry, some data needs to be available for 75 years. You can imagine that during 75 years a lot can change in hardware and software systems. At Airbus, there are currently 2500 applications, provided by approximately 600 suppliers, that need to be maintained. IBM and Airbus presented a proof of concept done with virtualization of different platforms supporting CATIA V4/V5 using Linux, Windows XP, W7 and W8 – which covers just a small part of all the data.
The conclusion from this session was:
To benefit from the PLM of the future, the PLM of the past has to be managed. Migration is not the only answer. Look for solutions that exist to mitigate the risks and reduce the costs of PLM obsolescence. Usage of and compliance with standards is crucial.
Standards
Next, Howard Mason, Corporate Information Standards Manager, took us on a nice journey through the history of standards developed in his business. I loved his statement: interoperability is a right, not a privilege.
In the systems engineering track, Kent Freeland talked about nuclear knowledge management and CM in systems engineering. As this is one of my favorite domains, we had a good discussion on the need for proactive knowledge management, which somehow implies a CM approach through the whole lifecycle of a plant. Knowledge management is not equal to storing information in a central place. It is about building and providing data in context, so that it can be used.
Ontology for systems engineering
Leo van Ruijven provided a session for insiders: an ontology for systems engineering based on ISO 15926-11. His simplified approach, compared to ISO 15288, led to several discussions between supporters and opponents during lunchtime.
Master Data Management
After lunch, Marc Halpern gave his perspective on Master Data Management (MDM), a new buzzword or a discipline needed to orchestrate enterprise collaboration.
Based on the types of information companies want to manage in relation to each other, supported by various applications (PLM, ERP, MES, MRO, …), this can be a complex exercise, and Marc ended with recommendations and an action plan for the MDM lead. In my customer engagements, I also see more and more that the digital transformation leads to MDM questions: can we replace Excel files with mastered data in a database?
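As a minimal sketch of the mastering idea – my illustration, not Marc's action plan – the example below keeps one master record per part and lets each application map its local identifier to that master key, so PLM and ERP stop duplicating the data. All table names and identifiers are made up.

```python
import sqlite3

# One mastered record per real-world part; each system keeps its own
# identifier but points at the same master key.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE master_part (
        master_id   TEXT PRIMARY KEY,
        description TEXT NOT NULL
    );
    CREATE TABLE system_xref (
        system    TEXT NOT NULL,   -- 'PLM', 'ERP', 'MES', ...
        local_id  TEXT NOT NULL,   -- the id used inside that system
        master_id TEXT NOT NULL REFERENCES master_part(master_id),
        PRIMARY KEY (system, local_id)
    );
""")
con.execute("INSERT INTO master_part VALUES ('MP-001', 'Pump housing')")
con.executemany("INSERT INTO system_xref VALUES (?, ?, 'MP-001')",
                [("PLM", "PRT-4711"), ("ERP", "400-220-17")])

# Given an ERP number, find the same part's id in PLM via the master record
row = con.execute("""
    SELECT plm.local_id FROM system_xref erp
    JOIN system_xref plm ON plm.master_id = erp.master_id
    WHERE erp.system = 'ERP' AND erp.local_id = '400-220-17'
      AND plm.system = 'PLM'
""").fetchone()
print(row[0])  # PRT-4711
```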
Almost at the end of the day, I was speaking about the PDM platform targeted at the people from the future. Here I highlighted the fundamental change in skills that is upcoming. Where my generation was trained to own and capture as much information as possible in our brains (or cabinets), future generations are trained and skilled in finding data and building information out of it. Owning information is not crucial for them, perhaps because the world is moving so fast. See the nice YouTube movie at the end.
Ella Jamsin ended the conference on behalf of the Ellen MacArthur Foundation, explaining the need to move to a circular economy and the role PLM should play in it. PLM is no longer cradle-to-grave; it should support the lifecycle from cradle-to-cradle.
Unfortunately, I could not attend all sessions as there were several parallel sessions, nor have I written about all the sessions I attended. The PDT Europe conference, a conference for people who care about the details behind future PLM concepts and the usage of standards, is a must for future strategists.
Business is changing and becoming digital, as you might have noticed. If you haven´t noticed it, you might be disconnected from the world or work in a stable silo. A little simplified and provocative, otherwise you would not read further.
The change towards digital also has its effect on how PLM is evolving. Initially considered an extension of PDM, managing engineering data, PLM is slowly evolving into an infrastructure supporting the whole product lifecycle.
The benefits of a real PLM infrastructure are extremely high, as it allows people to work smarter, identify issues earlier and change from being reactive to proactive. In some industries, this change in working is the only way to stay in business. Others, with still enough margin, will not act.
Note: I am talking about a PLM infrastructure as I no longer believe in a single PLM system. For me, PLM is supported by a collection of services across the whole product lifecycle, many of them potentially in one system or platform.
Changing from an engineering-centric system into an infrastructure across the departmental silos is the biggest challenge for PLM. PLM vendors and ERP vendors with a PLM offering are trying to provide this infrastructure and mainly fight against Excel, as an Excel file can easily cross the border from one department to another. No vision is needed for Excel.
A PLM infrastructure, however, requires a vision. A company has to look at its core business processes and decide which information flows through the organization, or even better, through its whole value chain.
Building this vision, understanding it and then being able to explain it is a challenge for all companies. Sometimes even management says:
“Why do we need to have a vision, just fix the problem”
Also, people working in departments are not looking forward to changing their daily routines because they need to share information. Here you hear statements like:
“Why do people feel the need to look at the big picture? I want to get my work done.”
So if current businesses do not change, will there be a change?
Here I see the digital world combined with search-based applications coming up. Search-based applications allow companies to index their silos and external sources, get an understanding of the amount of data that exists, and from these results learn that there is a lot of duplicated or invalid information in different places.
This awareness might create the understanding that instead of having hundreds of thousands of Excel files in the organization, it would be better to have the data inside a database, uniquely stored and connected to other relevant information.
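To make this concrete, here is a minimal sketch in Python of the indexing step of such a search-based application. The share path is hypothetical, and real products add text extraction and fuzzy matching on top of this; this version only reveals byte-identical copies, which is often already an eye-opener.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def index_silo(root: str) -> dict[str, list[Path]]:
    """Walk a departmental file share and group byte-identical
    files by a hash of their content."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return groups

# Every hash that occurs in more than one place is a candidate
# duplicate, scattered across the silos the application indexed.
for digest, paths in index_silo("/shares/engineering").items():  # hypothetical share
    if len(paths) > 1:
        print(f"{len(paths)} copies: {[str(p) for p in paths]}")
```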
I have described this process in my past three posts, and I agree the context remains complex if you are not involved in or interested in it.
To grasp this concept, you first of all need the will to understand it. There is a lot of strategic information out there from companies like Accenture, Capgemini and more.
Next, if you want to understand it in a more down-to-earth manner, it is important to listen and talk with your peers from other companies and other industries. This is currently happening all around the world, and I invite you to participate.
Here is a list of events that I am attending, or would have attended if they were not too far away:
October 7th Stockholm – ENOVIA user conference
Here I will participate as a panel member in the discussion around the concept of zero files. We want to explain to and discuss with the audience what a data-centric approach means for an organization. Also, customers will share their experiences. This conference focuses on the ENOVIA community – you can still register here.
October 14th-15th Paris – Product Data Technology 2014 conference
Here I will speak about the PLM future (based on data) and what PLM should deliver for future generations. This conference is much broader and addresses all PLM-related topics from a wider perspective.
October 28 Product Innovation conference in San Diego
I have always enjoyed participating in this conference as, like PDT2014, it brings people together for networking and discussions, mostly on business topics rather than IT issues.
November 26 Infuseit seminar in Copenhagen
Relatively new in the Nordics, Infuseit, a PLM consultancy company, is able to attract an audience that wants to work on understanding the PLM future. Instead of listening to presenters, here you are challenged to discuss and contribute to building a common opinion. I will be there too.
Conclusion: It is time to prepare yourself for the change – it is happening, and getting educated is an investment that will be rewarding for your company.
What do you think – is data-centric a dream?
In my previous post, I talked about the unstoppable trend towards digital information, with knowledge based on data becoming the new business paradigm:
building knowledge from information extracted from data, instead of working with documents and the people who need to manipulate these documents.
Moreover, the reason to move towards a digital, data-oriented approach is the immense business benefit it can bring to an organization. Having online visibility of information in the context of other information from different stakeholders allows companies to be more proactive.
A proactive company reacts faster to the market or customer. This reduces waste of resources (materials/people) and therefore, in the end, makes the company more competitive. This is all described in my first post, with relevant links to various global references.
In this post, I want to describe with an example the differences between a document-oriented and a data-oriented approach and how each affects people and business. This might give you an impression of the expected business benefits.
The ultimate goal behind a data-oriented approach is to have a single version of the truth for a product, project or plant. This can be realized by treating information as data elements in various connected databases, where on-demand reports or dashboards can be created from the actual information, instead of documents generated by duplicating data into new systems and locations. Digital data will provide paperless processes accessible almost anywhere around the world.
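As a minimal illustration of this principle (the table and part names are invented for the example): the data lives once in a connected database, and a “report” is simply a query executed on demand against the actual values.

```python
import sqlite3

# One shared database instead of copies of the data inside documents.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE part     (id INTEGER PRIMARY KEY, name TEXT, status TEXT);
CREATE TABLE supplier (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE sourcing (part_id INTEGER REFERENCES part(id),
                       supplier_id INTEGER REFERENCES supplier(id),
                       unit_price REAL);
""")
db.execute("INSERT INTO part VALUES (1, 'Valve V-100', 'Released')")
db.execute("INSERT INTO supplier VALUES (1, 'ACME Fittings')")
db.execute("INSERT INTO sourcing VALUES (1, 1, 42.50)")

# The 'report' is generated on demand from the live data; nothing is
# duplicated into a new file that can silently go stale.
query = """SELECT part.name, part.status, supplier.name, sourcing.unit_price
           FROM sourcing
           JOIN part     ON part.id = sourcing.part_id
           JOIN supplier ON supplier.id = sourcing.supplier_id"""
for row in db.execute(query):
    print(row)
```

When a value changes in the database, the next run of the query reflects it immediately; no document has to be regenerated and redistributed.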
As an example, I will explain the difference between document-centric and data-centric when dealing with specifications.
The specification
Everyone knows the challenge with specifications. Most of the time they arrive as printed documents describing how a product or service should work from the client’s point of view. There are two principles behind specifications:
- Complexity. The more complex the product or service is, the bigger the chance that the specifications are incomplete or not a hundred percent understood, leading to an iterative change process. The challenge here is to manage the change and the consistency of the full specifications.
- Industry and margins. In a repetitive business, for example automotive or other mass consumer products, products can be quite complex and, once sold, hard to maintain and repair. In a competitive business, an error in the field can consume a lot of the expected profit. In the construction industry, where most of the time single projects are executed by a chain of disciplines, the industry (still) accepts cost overruns and the high costs of fixing issues in the field, instead of being clearer upfront during design and planning.
Let’s stay with an example in the middle of complexity and industry volume, and walk through the various stages of the process.
The document-based specification – 1
When the document-based specification arrives, the company has to get an understanding of the content. The project manager does a first read-through of the document (100+ pages) and decides to send it (a PDF) to sales, engineering, legal and planning. Engineering decides to distribute the document internally to mechanical, electrical and quality (for compliance).
The project manager stresses everyone on a weekly basis to deliver their responses and tries to understand if the answers will come in time. Some meetings with the stakeholders are needed, as the overall understanding needs to be consistent. Based on several iterations, a response is compiled.
The data-oriented approach – 1
When the document-based specification arrives, the project leader first stores the document as a reference in the PLM system and extracts all the customer requirements as data elements in the system. While extracting the requirements, the project manager groups them into digital folders (functional/non-functional, contractual, regulations, etc.) and assigns them to the relevant stakeholders, who get notified by the system. Each of the assigned persons works in the same manner; the engineering manager, again, distributes the discipline-specific requirements internally.
The project manager watches the progress of the requirements analysis, which is organized around a virtual model. There is still a need for meetings with the stakeholders to agree on the solution approach, but everything is stored and visible online in the system. This visibility has helped some of the stakeholders to be better aligned upfront. In the end, the response is generated and converted to the customer’s format.
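A rough sketch of what such extracted requirements could look like as data elements. The identifiers, categories and statuses are invented for the example, not taken from a particular PLM system:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Requirement:
    """One customer requirement, extracted from the specification
    and stored as a data element instead of a page in a PDF."""
    ident: str
    text: str
    category: str          # functional, contractual, regulation, ...
    owner: str             # stakeholder the requirement is assigned to
    status: str = "open"   # open -> analyzed -> agreed

reqs = [
    Requirement("R-001", "Max operating pressure 16 bar", "functional", "mechanical"),
    Requirement("R-002", "CE marking required", "regulation", "quality", "analyzed"),
    Requirement("R-003", "Delivery within 40 weeks", "contractual", "planning"),
]

def progress_overview(requirements):
    """What the project manager sees online, instead of chasing
    every stakeholder on a weekly basis."""
    owners = {r.owner for r in requirements}
    return {owner: Counter(r.status for r in requirements if r.owner == owner)
            for owner in owners}

print(progress_overview(reqs))
```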
Not much benefit for step 1
If you compare the two approaches, there is mainly one person happy: the project manager. Instead of spending time collecting the status of all information, direct visibility of the response helps him/her to prioritize and focus where attention is needed, instead of discovering it on a weekly basis.
There is some small benefit from the virtual model, as other stakeholders can have a better understanding of the actual progress.
For the rest, however, all stakeholders are complaining. It is difficult to work this way. (Fl)Excel was much easier. Moreover, thinking about a virtual model takes time, as we are not used to working in this way. Typically something for aerospace, you might think.
And now the benefits come – step 2
The customer has placed the order, and the project has started. The design has begun, and people start to discover discrepancies or ambiguous demands that need to be negotiated with the customer. Is it part of the project, and if not, should it become part of the project, and at what cost (for whom)?
The document-oriented approach – 2
Several engineers are now discussing the detailed interpretation of the requirements with their counterparts at the customer, either through face-to-face meetings or emails. Changes are collected and sent to the project manager, who tries to understand what has changed and how to merge it into an ongoing specification document. To avoid too many revisions, he/she tries to update the document on a bi-weekly basis, sends it to the internal stakeholders for review and, with their feedback, generates a specification document for the customer that is supposed to cover the latest agreements.
Unfortunately, not all changes have reached the document: some stakeholders were busy and forgot to include changes agreed with the customer because they were in a lost email. Also, a previous change of a requirement was overwritten because an update from quality used the old data. Finally, some design solutions were changed, which raised the costs. And nobody is sure whether the product, with all its changes, will be compliant after delivery. Luckily, nobody has noticed so far, not even the customer.
The data-oriented approach – 2
Thanks to the virtual model and the relations between all the requirements, any change in a requirement triggers a notification in the system. When a requirement is further clarified, it is updated in the system. When a requirement needs to be changed, it is clear what the impact of this change is. A change workflow assures that decisions are made visible and approved. Changes that lead to more work are quoted to the customer for acceptance. Luckily, the compliance engineer noted that the change of materials would lead to a compliance issue. On a bi-weekly basis, the project manager generates an agreed specification for the customer based on the data in the system.
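A minimal sketch of the mechanics behind such a notification, with invented identifiers. A real PLM system manages these relations and the approval workflow itself; the point is only that impact becomes computable once requirements and design elements are linked data:

```python
from collections import defaultdict

# Relations between requirements and the design elements that
# implement them, kept as data so impact can be computed.
implements = defaultdict(set)            # requirement id -> design elements
implements["R-001"] = {"D-pump", "D-housing"}
implements["R-007"] = {"D-housing"}

owner = {"D-pump": "mechanical", "D-housing": "mechanical"}

def on_requirement_change(req_id: str) -> None:
    """A changed requirement immediately flags every linked design
    element, so its owner can assess the impact."""
    for element in sorted(implements[req_id]):
        print(f"notify {owner[element]}: review {element} against {req_id}")

on_requirement_change("R-001")
```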
Benefits are growing.
The project manager remains the happiest person and is even happier as less discussion is needed about who changed what and why, or about changes that should exist but cannot be found. The time saved by the project manager can be used to collaborate even better with the teams (without annoying them), or perhaps to manage a second project in parallel.
Other stakeholders start to enjoy the data-oriented approach too. There is less ambiguity on their side as well, and fewer iterations caused by changes that went unnoticed. As all information is related to the virtual model online, the actual status is clear when making a decision. There is less fixing afterwards, and luckily there are still project meetings between the stakeholders to synchronize. The PLM system does not eliminate communication; it provides a reliable baseline of the truth. No need (and no option) to look in your archives.
At this stage, the benefits start to become clear. Fewer iterations and better decisions will have an impact on costs and project stress. A remaining complaint from the engineers might be that they need to do too much upfront thinking, although some years later they might discover that this has become their main job, as fixing issues from the past has diminished.
And then the ultimate benefits
Now the project has reached the physical state. It is manufactured or being commissioned.
The document-oriented approach – 3
In the document-oriented approach, many issues might pop up because they were not considered in the early phase, or they got lost during document exchanges. Does the product work as specified? Is the building certified as specified?
The customer is king, and for manufacturing companies this might lead to product recalls or launch delays. In the construction world, people in the field will fix the issues, consuming skilled resources and wasting materials.
Data handover to the owner is a nightmare for a project-centric delivery. Several people have been searching through documents, specifications and emails to build and compile the required documents for handover.
The data-oriented approach – 3
In the data-centric approach, the physical product behaves as expected, as most of the issues have been solved in the virtual model. When testing the product, it works as specified, because the specifying requirements have always been linked to the product. Moreover, they have been agreed and approved by the relevant stakeholders, and where relevant, the customer has paid for the extra work specified.
The handover process is not as stressful as before with the document-oriented approach. As the required information was known and specified upfront, related to the requirements, the maturity process of the virtual model assured this data exists in the system. Now the as-built information matches the as-specified information. What a relief.
Conclusion
It is clear that the significant benefits are found in step 3. I wrote the comparison in an extreme manner, knowing that reality lies somewhere in the middle. Excellent people can comprehend and fix more upfront because of their experience, and building the ultimate virtual model is not yet an easy achievement either.
The savings in materials and required resources are significant in a data-oriented approach. The time savings and the quality enhancements might change your company into a market leader. The cost savings achieved through a proactive approach will make your margin grow (unless the competition does the same) and enable you to innovate.
One final remark on business change
If a business change could be achieved just by selecting the right tool or system, it would never give you a competitive advantage, as your competitors can buy these tools too.
However, if you change to a data-centric approach, it will be a tough change process, and therefore, once implemented, you will leave behind the competitors that keep hanging on to the past.
My holidays are over. After reading and cycling a lot, it is time to focus again on business and the future. Those of you who have followed my blog during the past year must have noticed that I have been talking on a regular basis about business moving to a data-oriented approach instead of a document/file-based approach. I wrote an introduction to this topic at the beginning of this year: Did you notice PLM has been changing?
It is part of a bigger picture, which some people might call the Second Machine Age, Industry 4.0, The Third Wave or, even more disturbing, The Onrushing Wave.
This year I have had many discussions around this topic with companies acting in various industries; manufacturing, construction, oil & gas, nuclear and general EPC-driven companies. There was some commonality in all these discussions:
- PLUS: Everyone believes it is a beautiful story and it makes sense
- MINUS: Almost nobody wants to act upon it, as it is an enormous business change, and to change the way a company works, you need C-level understanding
- PLUS: Everyone thinks the concept is clear to them
- MINUS: Few understand what it means to work data-oriented and what the impact on their business would be
Therefore, what I will try to do in the upcoming blog posts (two, three, four?) is address these two negative observations and make them more precise.
What is data / information / knowledge?
Data, for me, is a collection of small artifacts (numbers, characters, lines, sound bits, …) which have no meaning at all. These could be bundled together as a book, a paper drawing or a letter, but also in a digital format like an eBook, a CAD file or an email; even the transmission bytes of a network/internet provider can be considered data.
Data becomes significant once pieces of it are provided in the context of each other or of other data. At that point, we start calling it information. For that reason, a book or a drawing provides information, as the data has been structured in such a manner that it becomes meaningful. The data sent through the network cable only becomes information when it is filtered and stripped of the irrelevant parts.
Information is used to make decisions based on knowledge. Knowledge is the interpretation of information which, combined in a particular way, helps us to make decisions. And the more decisions we make, and the more information we have about the results of these decisions, made either by us or by others, the more our knowledge increases.
Data and big data
Now we have some feeling for data, information and knowledge. For academics, there is room to discuss and enhance the definitions; I will leave it at this simple one.
Big data is the term for all digital data that is too large to handle in a single data management system, but that is available and searchable through various technologies. Data can come from any source around the world, as the internet provides an infrastructure to filter and search for particular data.
By analyzing and connecting the data coming from these various sources, you can generate information (placing the data in context) and build knowledge. As it is an IT-driven activity, this can be done in the background and give almost up-to-date data to any person. This is a big difference from information handling in the old way, where people have to collect and connect the data manually.
The power of big data applies to many business areas. If you know how your customers are thinking and how they associate their needs with your products, you can make the products better and more targeted to your potential market. Or, if you know how your products are behaving in the field during operation (Internet of Things), you can provide additional services and instant feedback and be more proactive. Plus, the field data, once analyzed, provides actual knowledge helping you to make better products or offer more accurate services.
Wasn’t there big data before?
Yes, before the big data era there was also a lot of information available. This information could be stored in “analogue” formats (microfiche, paper, clay tablets, papyrus) or in digital formats, better known as files or collections of files (doc, pdf, CAD-files, ZIP, …).
Note the difference: here I am speaking about information, as the data is contained in these formats.
You have to open or be in front of the information container first, before seeing the data. In the digital world, this is often called document management or content management. The challenge with these information containers is that you need to create a new version of the whole container once you modify a single piece of data inside it. In addition, each information container holds duplicated information from a data element. Therefore, it is hard to manage a “single version of the truth” approach.
And here comes the data-oriented approach
The future is about storing all these pieces of data inside connected data environments, instead of storing a lot of data inside a (versioned) information container (a file / a document).
Managing these data elements in the context of each other allows people to build information from any viewpoint – project-oriented, product-oriented, manufacturing-oriented, service-oriented, etc.
The data remains unique, therefore supporting much more closely the single version of the truth approach. Personally, I consider the single version of the truth a utopia; however, reducing the amount of duplicated data through a data-oriented approach will bring a lot more efficiency.
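The difference can be shown in a few lines of Python; the values are invented, but the mechanics are exactly the point made above:

```python
# Document approach: each container holds its own copy of a value.
spec_doc = {"title": "Spec rev B", "max_pressure": 16}
test_doc = {"title": "Test plan",  "max_pressure": 16}
spec_doc["max_pressure"] = 25          # update one container...
print(test_doc["max_pressure"])        # ...the other copy is stale: 16

# Data approach: containers are views referencing one unique element.
elements  = {"max_pressure": 16}
spec_view = {"title": "Spec rev B", "data": elements}
test_view = {"title": "Test plan",  "data": elements}
elements["max_pressure"] = 25          # update the element once...
print(test_view["data"]["max_pressure"])   # ...every view sees 25
```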
In my next post, I will describe an example of a data-oriented approach and how it impacts business, both from the efficiency point of view and from the business transformation point of view. The data-oriented approach can have immense benefits. However, they do not come easily; you will have to work differently.
Some more details
An important point to discuss is that this data-oriented approach requires a dictionary describing the primary data elements used in a certain industry. The example below demonstrates a high-level scheme for a plant engineering environment.
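The original diagram is not reproduced here; as a rough stand-in, here is a minimal sketch of what a fragment of such a dictionary could look like. The classes, attributes and units are invented for illustration and do not come from an actual standard:

```python
# A fragment of a data dictionary: each class of data element with
# the attributes (and units) every instance is expected to provide.
PLANT_DICTIONARY = {
    "CentrifugalPump": {
        "design_pressure": "bar",
        "capacity": "m3/h",
        "material": None,            # enumerated value, no unit
    },
    "PipingLine": {
        "nominal_diameter": "mm",
        "design_temperature": "degC",
    },
}

def missing_attributes(element_class: str, attributes: dict) -> list[str]:
    """Validate an instance against the dictionary and report
    which required attributes it lacks."""
    required = PLANT_DICTIONARY[element_class]
    return [name for name in required if name not in attributes]

print(missing_attributes("CentrifugalPump",
                         {"design_pressure": 16, "capacity": 120}))
# -> ['material']
```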
Data standards exist, or are emerging, in almost any industry, and they are crucial for the longevity and usage of the data. I will touch on this briefly in one of the upcoming posts; however, for those interested in this topic in relation to PLM, I recommend attending the upcoming PDT Europe. If you look at the agenda, there is plenty of room to learn and discuss the future of PLM.
I hope to see you there.
Everyone wants to be a game changer, yet in reality almost no one is. Game changing is a popular term, and personally I believe that in old Europe, and probably also in the old US, we should have the courage and understanding to change the game in our industries.
Why? Read the following analogy.
1974
With my Dutch roots and passion for soccer, I saw the first example of game changing happen in 1974, in soccer: the game where 22 players kick a ball from side to side, and the Germans win in the last minute.
My passion and trauma started that year, when the Dutch national team changed soccer tactics by introducing totaalvoetbal (total football).
The Dutch team, at that time coached by Rinus Michels and with star player Johan Cruyff, played this to perfection.
Defenders could play as forwards and the other way around. Combined with the offside trap, the Dutch team reached the finals of the world championship both in 1974 and 1978. Of course, they lost the final both times to the home-playing teams (Germany in '74, Argentina in '78 – with some help from the referee, we believe).
This concept kept the Dutch team at the top for several years, as the changed tactics brought a competitive advantage. Other teams and players, not educated in the Dutch soccer school, could not copy the concept that fast.
At the same time, a game changer for business was emerging in 1974: the PC.
In the picture, you see Steve Jobs and Steve Wozniak testing their Apple 1 design. The abbreviation IT was not common yet, and the first mouse devices and the Intel 8008 processor were coming to the market.
This was disruptive innovation at that time, as we would realize 20 years later. The PC was a game changer for business.
2006
Johan Cruyff remained a game changer, and when he started to coach and influence the Barcelona team, it was his playing concept, tiki-taka, that brought the Spanish national team and Barcelona to the highest, almost unbeatable level in the world for the past eight years.
Instead of having strong and tall players to force your way to the goal, it was all about possession and control of the ball. As long as you have the ball, the opponent cannot score. And if you all play very close together around the ball, there is never a big distance to cover when trying to recapture it.
This was a game changer, hard to copy overnight – until the past two years. Now other national and club teams have learned to use these tactics too, and the Spanish team and Barcelona are no longer alone at the top.
Game changers have a competitive advantage as it takes time for the competition to master the new concept. And the larger the change, the bigger the impact on business.
PLM, too, was supposed to be a game changer in 2006. The term PLM became more and more accepted in business, but was PLM really changing the game?
PLM at that time was connecting departments and disciplines digitally with each other, no matter where they were around the globe. And since the information was stored in centralized places, databases and file-sharing vaults, it created the illusion that everyone was working with the same sets of data.
The major successes of PLM in this approach are coming from efficiency through digitization of data exchange between departments and the digitization of processes. Already a significant step forward and bringing enough benefits to justify a PLM implementation.
Still, I do not consider PLM in 2006 a real game changer. There was often no departmental or business change combined with it. If you look at the soccer analogy, the game change is all about different behavior to reach the goal; it is not about better tools (or shoes).
The PLM picture shows the ideal 2006 situation, in which each department forwards information to the next department. But where was PLM supporting after-sales/services in 2006? And the connection between after-sales/services and concept was, in most companies, not formalized or even existing. Yet exactly that connection should provide the feedback from the market, from the field, to deliver better products.
The real game changer starts when people learn and understand sharing data across the whole product or project lifecycle. The complexity is in the word sharing. There is a big difference between storing everything in a central place and sharing data so other people can find it and use it.
People are not used to sharing data. We like to own data, and when we create or store data, we hate the overhead of making it sharable (understandable) or useful for others. As long as we know where it is, we believe our job is safe.
But our job is no longer safe as we see in the declining economies in Europe and the US. And the reason for that:
Data is changing the game
In recent years, the discussion about BI (Business Intelligence) and Big Data emerged. There is more and more digital information available, and it has become impossible for companies to own all the data, or even to think about storing the data themselves and sharing it among their dispersed enterprises. Combined with the rise of cloud-based platforms, where data can be shared (theoretically) no matter where you are and no matter which device you are using, there is a huge potential to change the game.
It is a game changer as it is not about just installing the new tools and new software. There are two major mind shifts to make.
- It is about moving from documents towards data. This is an extremely slow process. Even if your company is 100% digital, your customer or supplier may still require a printed and wet-signed document or drawing as legal confirmation of the transaction. Documents are comfortable containers to share, but they are killing for fast and accurate processing of the data inside them.
- It is about sharing and combining data. It does not make sense to dump data again into huge databases. The value only comes when the data is shared between disciplines and partners. For example, a part definition can have hundreds of attributes, where some are created by engineering, other attributes are created by purchasing and some attributes come directly from the supplier (see the sketch after this list). Do not fall into the ERP trap that everything needs to be in one system and controlled by one organization.
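A minimal sketch of that attribute-ownership idea, with invented attribute names. Each discipline remains the master of its own attributes, and the shared part definition is combined on demand rather than dumped into one system:

```python
# Each source masters its own attributes of the same part.
engineering = {"part": "P-123", "material": "316L", "mass_kg": 2.4}
purchasing  = {"part": "P-123", "preferred_supplier": "ACME", "target_price": 40.0}
supplier    = {"part": "P-123", "lead_time_weeks": 6, "unit_price": 42.5}

def combined_view(*sources: dict) -> dict:
    """Build the full part definition on demand from the owning
    sources, instead of copying everything into one database."""
    view: dict = {}
    for source in sources:
        view.update(source)
    return view

print(combined_view(engineering, purchasing, supplier))
```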
Because of the availability of data, the world has become global and more transparent for companies. And what you see is that the traditional companies in Europe and the US struggle with that. Their current practices are not tuned to a digital world, but to the classical, departmental approach. To change this, you need to be a game changer, and I believe many CEOs know that they need to change the game.
The upcoming economies have two major benefits:
- Not so much legacy; therefore, building a digital enterprise is easier for them. They do not have to break down ivory towers and 150 years of proud ownership.
- The average cost of labor is lower than in Europe and the US; therefore, even if they do not get it right the first time, there is enough margin to spend more resources to meet the objectives.
The diagram I showed in July during the PI Apparel conference was my interpretation of the future of PLM. However, if you analyze the diagram, you see that it no longer covers a 100% classical PLM scope. It is also about social interaction, supplier execution and logistics. These areas are not classical PLM domains, and therefore, as I mentioned in the past, the typical PLM system might dissolve into something bigger. It will be all about digital processes based on data coming from various sources, structured and unstructured. Will it still be PLM, or will we call it something different?
The big consultancy firms are all addressing this topic – not necessarily at the PLM level:
2012 Cap Gemini – The Digital advantage: …..
2013 Accenture – Dealing with digital technology’s disruptive impact on the workforce
2014 McKinsey – Why every leader should care about digitization and disruptive innovation
For CEOs, it is important to understand that the new, upcoming generations are already thinking in data (generation Y and beyond). By nature, they are used to sharing data instead of owning data in many respects. Making the transition to the future is, therefore, also a process of connecting with and understanding the future generations. I wrote about it last year: Mixing past and future generations with a PLM sauce.
This cannot be learned from an ivory tower. The easiest way is to not worry about this trend and continue working as before, slowly losing business and margin year by year.
While in many businesses people are fired for making big mistakes, doing nothing is, unfortunately, most of the time not considered a big mistake, although it is the biggest one.
During the upcoming PI Conference in Berlin, I will talk about this topic in more detail and look forward to meeting and discussing this trend with those of you who can participate.
The soccer analogy stops here, as the data approach kills the old game.
In soccer, the maximum remains 11 players on each side and one ball. In business, thanks to global connectivity, the number of players and balls involved can be unlimited.
A final observation:
In my younger days, I celebrated many soccer championships, yet I am not famous as a soccer player.
Why?
Because the leagues I played in were always limited in scope: by age, locality, region, etc. Therefore, it was easy to win within a certain scope, and there are millions of soccer champions besides me. In business, however, there are almost no borders.
Global competition will require real champions to make it work!