This is my concluding post related to the various aspects of the model-driven enterprise. We went through Model-Based Systems Engineering (MBSE), where the focus was on using models (functional / logical / physical / simulations) to define complex products (systems). Next we discussed Model-Based Definition / Model-Based Enterprise (MBD/MBE), where the focus was on data continuity between engineering and manufacturing by using the 3D Model as the master for design, manufacturing and eventually service information.
And last time we looked at the Digital Twin from its operational side, where the Digital Twin was applied to collect data from and tune physical assets in operation, which is not a typical PLM domain in my opinion.
Now we will focus on two areas where the Digital Twin touches aspects of PLM – the most challenging and, I believe, the most over-hyped areas. These two areas are:
- The Digital Twin used to virtually define and optimize a new product/system or even a system of systems. For example, defining a new production line.
- The Digital Twin used as the virtual replica of an asset in operation. For example, a turbine or an engine.
Digital Twin to define a new Product/System
There might be some conceptual overlap if you compare the MBSE approach and the Digital Twin concept for defining a new product or system to deliver. For me, the differentiation is that MBSE is used to master and define a complex system from the R&D point of view: the solution concepts are still unknown (hardware or software?), and the constraints have to be refined and optimized in an iterative manner.
In the Digital Twin concept, it is more about defining a system that should work in the field: how to combine various systems into a working solution, where each of the systems already has a pre-defined set of behavioral / operational parameters, which could be 3D-related but also performance-related.
You would define and analyze the new solution virtually to discover the ideal solution for performance, costs, feasibility and maintenance. Working in the context of a virtual model might take more time than traditional ways of working; however, once the models are in place, analyzing and optimizing the solution takes hours instead of weeks, assuming the virtual model is based on a digital thread, not on a sequential process of creating and passing documents/files. Virtual solutions allow a company to optimize the solution upfront instead of fixing it at high cost during delivery, commissioning and maintenance.
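To make the difference tangible, below is a minimal sketch (all component names, parameters and the value metric are hypothetical) of what such a virtual trade-off study boils down to once each sub-system exposes its behavioral parameters as connected data: every feasible configuration of a small production line is evaluated in seconds against throughput and cost, instead of in weeks of physical trials.

```python
from itertools import product

# Hypothetical behavioral/operational parameters per sub-system, as a
# connected virtual model would provide them (instead of documents).
conveyors = [{"name": "belt_A", "units_per_hour": 120, "cost": 40_000},
             {"name": "belt_B", "units_per_hour": 200, "cost": 65_000}]
robots = [{"name": "arm_X", "units_per_hour": 150, "cost": 90_000},
          {"name": "arm_Y", "units_per_hour": 110, "cost": 55_000}]

def line_performance(conveyor, robot):
    """The slowest station dictates the throughput of the whole line."""
    throughput = min(conveyor["units_per_hour"], robot["units_per_hour"])
    return throughput, conveyor["cost"] + robot["cost"]

# Exhaustively evaluate all virtual configurations and keep the best
# throughput-per-cost ratio (a deliberately simplistic value metric).
best = max(
    ((conv, rob, *line_performance(conv, rob))
     for conv, rob in product(conveyors, robots)),
    key=lambda c: c[2] / c[3],
)
print(f"Best virtual configuration: {best[0]['name']} + {best[1]['name']} "
      f"-> {best[2]} units/h at {best[3]} EUR")
```

In a real Digital Twin the evaluation function would be a simulation model instead of a formula, but the principle is the same: optimization becomes a query on connected data.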
Why aren’t we doing this already? It requires more skilled engineers upfront instead of cheaper fixers downstream. The fact that we are used to fixing things later is also an inhibitor for change. Management needs to trust and understand the economic value instead of trying to reduce the number of engineers because they are expensive and hard to plan.
In the construction industry, companies are discovering the power of BIM (Building Information Model), introduced to enhance the efficiency and productivity of all stakeholders involved. Massive benefits can be achieved if the construction of the building and its future behavior and maintenance can be optimized virtually, compared to fixing issues in an expensive way in reality when they pop up.
The same concept applies to process plants or manufacturing plants where you could virtually run the (manufacturing) process. If the design is done with all the behavior defined (hardware-in-the-loop and software-in-the-loop simulation), a solution can be virtually tested and rapidly delivered, with no late discoveries and costly fixes.
Of course it requires new ways of working. Working with digital, connected models is not what engineers learn during their education – we have just started this journey. Therefore organizations should explore on a smaller scale how to create a full Digital Twin based on connected data – this is the ultimate base for the next purpose.
Digital Twin to match a product/system in the field
When you follow the topic of the Digital Twin through the materials provided by the various software vendors, you see all kinds of previews of what is possible: Augmented Reality, Virtual Reality and more. All these presentations show that when you click somewhere in a 3D model space, relevant information pops up. Where does this relevant information come from?
Most of the time, information is re-entered in a new environment, sometimes derived from CAD, but all the metadata comes from people collecting and validating data. This is not the type of work we promote for a modern digital enterprise. These inefficiencies are acceptable for learning and demos, but in the final stage a company cannot afford silos where data is collected and entered again, disconnected from the source.
The main problem: legacy PLM information is stored in documents (drawings / Excel files) and is not intended to be shared downstream with full quality.
Read also: Why PLM is the forgotten domain in digital transformation.
If a company has already implemented an end-to-end Digital Twin to deliver the solution, as described in the previous section, we can understand the data has been entered somewhere during the design and delivery process and, thanks to digital continuity, is available.
How many companies have done this already? For sure not the companies that have been in business a long time, as their current silos and legacy processes do not cater for digital continuity. By appointing a Chief Digital Officer the journey might start; the biggest risk is that the Chief Digital Officer will be running yet another silo in the organization.
So where does PLM support the concept of the Digital Twin operating in the field?
For me, the IoT part of the Digital Twin is not the core of PLM. Defining the right sensors, controls and software is the first area where IoT is used to define the measurable/controllable behavior of a Digital Twin. This topic has been discussed in the previous section.
The second part where PLM gets involved is twofold:
- Processing data from an individual twin
- Processing data from a collection of similar twins
Processing data from an individual twin
Data collected from an individual twin or a collection of twins can be analyzed to discover failure modes and improvement opportunities. An R&D organization is interested in learning what is happening in the field with their products. These analyses lead to better and more competitive solutions.
Predictive maintenance is not necessarily a part of that. When you know that certain parts will fail between 10,000 and 20,000 operating hours, you want to optimize the moment of providing service to reduce downtime of the process, and you do not want to replace parts way too early.
The R&D part related to predictive maintenance could be that R&D develops sensors inside this serviceable part that signal the need for maintenance in a much smaller time frame – maintenance needed within 100 hours instead of within a bandwidth of 10,000 hours. Or R&D could develop new parts that need less service and guarantee a longer up-time.
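A minimal sketch of the scheduling logic above (the numbers and the safety factor are hypothetical): with only the statistical 10,000–20,000-hour window you must service long before the lower bound, while a sensor that narrows the window to 100 hours lets you run the part almost to the end of its useful life.

```python
SAFETY = 0.95  # service at 95% of the earliest plausible failure moment

# Statistical knowledge only: parts fail between 10,000 and 20,000 hours,
# so to avoid unplanned downtime you must service well before the lower bound.
stat_service_at = 10_000 * SAFETY

# Sensor-based: at 16,500 hours a signal says "maintenance needed within
# 100 hours", so the earliest plausible failure is now only 100 hours away.
signal_at, lead_time = 16_500, 100
sensor_service_at = signal_at + lead_time * SAFETY

print(f"Without sensor: service at {stat_service_at:,.0f} h")
print(f"With sensor:    service at {sensor_service_at:,.0f} h "
      f"({sensor_service_at - stat_service_at:,.0f} extra productive hours)")
```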
For an R&D department, the information from an individual Digital Twin might only be relevant if the Physical Twin is complex to repair and the downtime of each individual asset is too costly. Imagine a jet engine, a turbine in a power plant or similar. Here a Digital Twin allows service and R&D to prepare maintenance and to simulate and optimize the actions virtually before executing them in the physical world.

The five potential platforms of a digital enterprise
Processing data from a collection of similar twins
The second part R&D will be interested in is the behavior of similar products/systems in the field, combined with their environmental conditions. In this way, R&D can discover improvement points for the whole range and deliver incremental innovation. The challenge for the R&D organization is to find a logical placeholder in their PLM environment to collect commonalities related to the individual modules or components. This is not an ERP or MES domain.
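As a thought experiment, a minimal sketch of such a logical placeholder (module names and measurements are hypothetical): observations from many individual twins are aggregated per module rather than per asset, so R&D can see which module underperforms across the whole range.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical field data: (asset serial, module id, observed temperature rise in K)
observations = [
    ("SN-001", "pump-module", 12.1), ("SN-002", "pump-module", 14.8),
    ("SN-003", "pump-module", 13.9), ("SN-001", "drive-module", 4.2),
    ("SN-002", "drive-module", 4.5), ("SN-003", "drive-module", 4.1),
]

# The logical placeholder: commonalities are collected per module/component,
# independent of the individual physical asset they were measured on.
per_module = defaultdict(list)
for _serial, module, value in observations:
    per_module[module].append(value)

for module, values in per_module.items():
    print(f"{module}: n={len(values)}, avg={mean(values):.1f} K, max={max(values):.1f} K")
```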
Concepts of a logical product structure are already known in the oil & gas, process and nuclear industries. In 2017 I wrote about PLM for Owners/Operators, mentioning that Bjorn Fidjeland has always been active in this domain; you can find his concepts at plmPartner here or as an eLearning course at SharePLM.
To conclude:
- This post is way too long (sorry)
- PLM is not dead – it evolves into one of the crucial platforms for the future – The Product Innovation Platform
- The current BOM-centric approach within PLM is blocking progress towards a full digital thread
More to come after the holidays (a European habit) with additional topics related to the digital enterprise.
Perhaps an ambiguous title this time, as it can be interpreted in various ways. I think this ambiguity of interpretation is one of the most significant problems with PLM. Ambiguity everywhere: in its definition, in its value and, as you might have noticed from the past two blog posts, in the required skill-set for PLM consultants.
As I am fine-tuning my presentation for the upcoming PLMx 2018 Event in Hamburg, some things become clearer to me. This is one of the advantages of blogging, speaking at PLM conferences and discussing PLM with companies that are eager to choose the right track for PLM. You are forced to look in more depth to be consistent, and you need arguments to support your opinion about what is happening in the scope of PLM. And from these learnings I often realize that the WHY of PLM remains a big challenge, for various reasons.
Current PLM
In the past twenty years, companies have implemented PLM systems where the primary focus was on the P (Product) from Product Lifecycle Management only. PLM systems have been implemented as engineering tools, as an evolution of PDM (Product Data Management).
PLM systems have never been designed from the start as enterprise systems. Their core capabilities are related to engineering processes, and for that reason most implementations start with engineering. Later, more data-driven PLM systems like Aras and Autodesk's have started from another angle, with data connectivity between different disciplines as a foundation, avoiding getting involved with the difficulty of engineering first.
This week I saw the publication of the PLMPulse survey results by i42R / MarketKey where they claim:
The results from the first industry-led survey on our status of Product Lifecycle Management and future priorities
The PLMPulse report is based on five different surveys, as shown in the image above, covering the various aspects of PLM: usage, business value, organizational constraints, information value and future potential. More than 350 people from all around the world answered the various questions related to these surveys. Unfortunately, inputs from some Asian companies are missing. We are all curious about what happens in China, as companies there do not struggle with the same PLM legacy as in other countries. Are they embracing PLM in a different way?
The results, as the editors also confirm, are not shocking and confirm that PLM has the challenge of getting out of the engineering domain. Still, I recommend downloading the survey report, as it has interesting details. After registration you can download the report from here.
What’s next
During the upcoming PLMx 2018 Hamburg conference there will be a panel discussion where the survey results will be discussed. I am afraid this debate will again result in a discussion where we talk about the beauty and necessity of PLM and wonder why PLM is not considered crucial for the enterprise.
There are a few challenges I see for PLM, and hopefully they will be addressed. Most discussions are about WHAT PLM should/could do and not about WHY. If you want to get to the WHY of PLM, you need to be able to connect the value of PLM to business outcomes that resonate at C-level. Often PLM implementations are considered costly, and their ROI and business value remain vague.
As the PLMPulse report also states, the ROI for PLM is most of the time based on efficiency and cost benefits related to the current way of working. These benefits usually do not offer significant ROI numbers. The major benefits come from working in a different way and from focusing on working closer to your customer. That business value is hard to measure.
How do you measure the value of multidisciplinary collaboration or of being more customer-centric? What is the value of being better connected to your customer and being able to react faster? This value is hard to prove at board level, where people like to see numbers, not business transformations.
Focus on the WHY and HOW
A lot of the PLM messages that you can read through various marketing or social channels are related to futuristic concepts and high-level dreams that will come true in the next 10-20 years. Most companies, however, have a planning horizon of two, at most five, years. Peter Bilello from CIMdata presented one of their survey results at the PDT conference in 2014, shown below:
Technology and vision are way ahead of reality, and even in the areas where the leaders are focusing, the distance between technology/vision and reality gets bigger. The PLM focus should be more down-to-earth: not on what we are able to do, but on what would be the next logical step for our company to progress to the future.
System of Record and System of Engagement
At the PLMx conference I will share my experiences related to PLM transformations with the audience. A year and a half ago we started talking about the bimodal approach, and now I see more and more companies adopting the concepts of bimodal related to PLM. Still, most organizations struggle with the assumption that their PLM should be tied to one PLM system or one PLM vendor, where I believe we should conclude that there are two PLM modes at this moment. And this does not imply there need to be only one or two systems – it will become a federated infrastructure.
The current modes could be an existing PLM backbone focusing on capturing engineering data – the classical PLM system serving as a system of record – and a second, new, growing PLM-related infrastructure: a digital, most likely federated, platform where modern customer-centric PLM processes will run. As the digital platform provides real-time interaction, it might be considered a system of engagement, complementary to the system of record.
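To illustrate the complementarity, here is a minimal sketch (all class and field names are hypothetical, not any vendor's API): the system of engagement does not copy the released definition, it references it in the system of record and enriches it with live context at request time – the essence of a federated infrastructure.

```python
from dataclasses import dataclass

@dataclass
class ReleasedItem:
    """System of record: released, versioned, slow-changing definition."""
    item_id: str
    revision: str
    description: str

class SystemOfRecord:
    def __init__(self):
        self._vault = {"P-100": ReleasedItem("P-100", "B", "Gearbox housing")}

    def released(self, item_id):
        return self._vault[item_id]

class SystemOfEngagement:
    """Federated view: references the record system and adds live data."""
    def __init__(self, record):
        self.record = record
        self.live = {"P-100": {"open_issues": 2, "field_temp_c": 78.4}}

    def view(self, item_id):
        item = self.record.released(item_id)  # a reference, not a copy
        return {"definition": item, **self.live.get(item_id, {})}

print(SystemOfEngagement(SystemOfRecord()).view("P-100"))
```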
It will be the system of engagement that should excite the board members, as here new ways of working can be introduced and mastered. As there are no precise blueprints for this approach, this is the domain where innovative thinking needs to take place.
That’s why I hope that neutral PLM conferences will focus less on WHAT can be done. Discussions about MBSE, the Digital Thread, the Digital Twin, and Virtual / Augmented Reality are all beautiful to watch; however, let’s focus first on WHY and HOW. For me, besides the PLMx Hamburg conference, other upcoming events like PDT 2018 (this time in both the US and Europe) are interesting, and currently the PDT call for papers is open – hopefully we find speakers that can teach and inspire.
CIMdata together with Eurostep are organizing these events in May (US) and October (Europe). The theme for the CIMdata roadmap conference will be “Charting the Course to PLM Value together – Expanding the value footprint of PLM and Tackling PLM’s Persistent Pain Points”, where PDT will focus on “Collaboration in the Engineering Supply Chain – the extended digital thread”. These themes need to be addressed first before jumping into the future. Looking forward to meeting you there.
Conclusions
In the world of PLM, we are most of the time busy explaining WHAT we (can/will) do. Like a cult group, we sometimes do not understand why others do not see the value or beauty of our PLM concepts. PLM dialogues and conferences should therefore focus more on WHY and HOW. Don’t worry, the PLM vendors/implementers will always help you with WHAT they can do and WHY it is different.
It is already the 6th consecutive year that MarketKey has organized the Product Innovation conference, with its primary roots in PLM. For me, the PI conferences have always been a checkpoint for changes and progress in the field.
This year about 100 companies participated in the event with the theme: Digital Transformation. From Hype to Value? Sessions were split into three major streams – digital transformation, extended PLM, and Business Enabled Innovation – interspersed with general keynote speeches. I wanted to attend all sessions (and I will do so virtually later through PI.TV), but in this post my observations come from the highlights of the extended PLM sessions.
From iCub to R1
Giorgio Metta gave an overview of the RobotCub project, where teams are working on developing a robot that can support human beings in our day-to-day lives. Some of us are used to industrial robots and understand their constraints. A robot that interacts with human beings is far more complex, and its development is still in the early stages. This type of robot needs to learn and interpret its environment while remaining accurate and safe for the persons interacting with it.
One of the interesting intermediate outcomes of the project is that a human-like robot with legs and arms is far too expensive and complicated to handle. Excellent for science fiction movies, but in reality its balance and movements are too difficult to control.
This was an issue with the iCub robot. Now Giorgio and the teams are working on the new R1 robot, maybe not “as human” as the iCub robot, but more affordable. It is not only the mechanics that challenge the researchers. The software supporting the artificial intelligence required for a self-learning and safely performing robot is also still in its early days.
An inspiring keynote speech to start the conference.
Standardizing PLM Components
The first extended PLM session was given by Guido Klette (Rheinmetall), describing the challenges the Rheinmetall group faces in developing and supporting PLM needs. The group has several PDM/PLM-like systems in place. Guido does not believe one size fits all to support every business in the group; they already have several PLM “monsters” in their organization. For more adequate support, Rheinmetall has defined a framework with PLM components and their dependencies, allowing a more granular choice of functionality to meet individual business needs.
A challenge for this approach, identified by a question from the audience, is that it is a very scientific approach that does not address the difference in culture between countries. Guido agreed and mentioned that, despite cultural differences, companies joining the Rheinmetall group were most of the time happy to adhere to such a structured approach.
My takeaway: the component approach fits very well with the modern thinking that PLM should not be supported by a single “monster” system but can be addressed by components that, in the end, provide the right business process support.
PLM as a business asset
Björn Axling gave an excellent presentation describing the PLM perspective of the Husqvarna group. He addressed the external and internal challenges and opportunities for the group in a structured and logical approach that probably applies to most manufacturing companies in a global market. Björn explained that in the Husqvarna group PLM is considered a business approach; more than ever, Product Lifecycle Management needs to be viewed as the DNA of a company, which was the title of one of his slides.
I like his eleven key imperatives (see the above picture), in particular key imperative #9, which is often forgotten:
Take definitions, nomenclature and data management very seriously – the devil is in the details.
This point will always backfire on you if you do not give it the needed attention from the start. Of course, the other ten points are also relevant. The challenge in every PLM project is to get these points addressed and understood in your company.
How to use PLM to enable Industry 4.0?
Martin Eigner’s presentation built upon his consistent message that PDM and PLM should evolve into SysLM (System Lifecycle Management), with a growing need for Model-Based Systems Engineering (MBSE) support.
The title of the presentation related to Industry 4.0, which focuses more on innovation in Germany’s manufacturing industry. Germany has always been strong in manufacturing, less strong in product innovation. Martin mentioned that later this year the German government will start another initiative, Engineering 4.0, which should be exciting for our PLM community.
Martin elaborated on the fact that end-to-end support for SysLM can be achieved through a backbone based on linked data. The lesson learned and preached: do not try to solve all product information views in a single system.
For me, it was interesting to see that Martin also picked up on the bimodal approach for PLM, required to support a transition to a modern digital enterprise (see picture). We cannot continue to build upon our old PLM environments to support future digital businesses.
PLM and Digital Transformation
In my afternoon session (Jos Voskuil), I shared the observation that companies invest a lot in digital transformation downstream by introducing digital platforms for ERP, CRM, MES and Operations. PLM is often the forgotten platform that needs to change to support a digital enterprise with all its benefits. You can see my presentation here on SlideShare. I addressed the bimodal approach as discussed in a previous blog post, introduced in Best Practices or Next Practices.
In case your company is not yet ready for a digital transformation or a bimodal approach, I addressed the need to become model-driven instead of document-driven. And of course, for a digital enterprise the quality of the data counts. I wrote about these topics recently: Digital PLM requires a Model-Based Enterprise and The importance of accurate data: ACT NOW!
Closed-Loop PLM
The last extended PLM presentation of day 1 was given by Felix Nyffenegger, professor for PLM/CAx at HSR (University of Applied Sciences in Rapperswil, CH). Felix shared his discovery journey into Industry 4.0 and IoT, combined with experiences from the digitalLab@HSR, leading to the concept of closed-loop PLM.
I particularly liked how Felix brought the various views on the product together into one diagram, telling the full story of closed-loop PLM – necessary for a modern implementation framework.
A new age for airships
The last presentation of the day was from Chris Daniels, describing the journey of Hybrid Air Vehicles with their Airlander 10 project. Where the classical airships – the most infamous perhaps the Hindenburg – have disappeared due to their flaws, the team of Hybrid Air Vehicles built upon the concept of airships in a defense project with the target to deliver a long-endurance multi-intelligence vehicle. The advantage of airships is that they can stay in the air for several days, serving as a communication hotspot or as a rescue ship for places hard to reach with traditional aircraft or helicopters. The Airlander can operate for 5 days without going back to a base, which is extremely long when you compare this to other aircraft.
The Airlander project is a typical example of incremental innovation used to optimize and extend the purpose of an airship. Combined with the fact that Chris was an excellent speaker, it made a great closure of the day.
Conclusion
This post is just an extract of one day and one stream of the conference – already too long for a traditional blog post. Next week I will follow up with day two and respond, beyond 140 characters, to the tweet below:
I am just back from the annual PDT conference (12th edition), this year hosted in Paris on 9 and 10 November, co-located with CIMdata’s PLM Road Map 2016 for Aerospace & Defense. The PDT conference, organized by Eurostep and CIMdata, is a relatively small conference with a little over a hundred attendees. The attractiveness of this conference is that the group of dedicated participants is very interactive, sharing honest opinions and situations, sometimes going very deep into the details needed to get the full picture. The theme of the conference was: “Investing for the future while managing product data legacy and obsolescence.” Here are some of my impressions from these days, giving you food for thought to join next year.
Setting the scene
Almost traditionally, Peter Bilello (CIMdata) started the conference, followed by Marc Halpern (Gartner). Their two presentations formed an excellent storyline together.
Peter Bilello started and discussed Issues and Remedies for PLM Obsolescence. It was not the first time Peter addressed PLM obsolescence. It is a topic many early PLM adopters are facing, and in a way the imminent obsolescence of their current environments blocks them from taking advantage of the new technologies and capabilities current PLM vendors offer. Having learned from the past, CIMdata provides a PLM Obsolescence Management model, which should be on every company’s agenda, in the same way as data quality (which I will address later). Being proactive about obsolescence can prevent critical situations and high costs. From the obsolescence theme, Peter looked forward to the future and the value product innovation platforms can offer, given the requirement that data should be able to flow through the organization, connecting to other platforms and applications, increasing the demand to adhere to and push for standards.
Marc Halpern followed with his presentation titled More custom products demand new IT strategies and new PLM applications, where he focused on the new processes and methodology needed for future businesses, with a high focus on customer-specific deliveries, speed, and automation. Automation is always crucial to reducing production costs. In this delivery process, 3D printing could bring benefits, and Marc shared the plusses and minuses of 3D printing. Finally, when automation of a customer-specific order becomes possible, it requires a different IT architecture, depicted by Marc. After proposing a roadmap for customizable products, Marc shared some examples of ROI benefits reported by successful transformation projects. Impressive!
My summary of these two sessions is that both CIMdata and Gartner confirm the challenges companies face in changing their old legacy processes and PLM environments, which support the past, while moving to more customer-driven processes and modern, data-driven PLM functionality. This is not just an IT or business change; it will also be a cultural change.
JT / STEP AP242 / PLCS
Next, we had three sessions related to standards. Alfred Katzenbach told the success story of JT: the investment done to get this standard approved and performing, based on an active community getting the most out of JT, beyond its initial purpose of viewing and exchanging data in a neutral format. Jean-Yves Delanaunay explained how, in Airbus Operations, the STEP AP242 definition is used as the core standard for 3D Model-Based Definition (MBD) exchange, part of the STEP standards suite, and as the cornerstone for Long Term Archiving and Retrieval of Aerospace & Defense 3D MBD.
There seems to be some rivalry between JT and STEP AP242 viewing capabilities, which goes beyond my understanding, as I am not an expert in this field. Nigel Shaw ended the morning session by positioning PLCS as a standard for interoperability of information along the whole lifecycle of a product. Having a standardized data model, as Nigel showed, would be a good common approach for PLM vendors to converge towards a more interoperable standard.
My summary on standards is that a lot of thinking, evaluation, and testing has been done by an extensive community of committed people. It will be hard for a company to define a better foundation for a standard in their business domain. Vendors are focusing on performance inside their own technology offerings and therefore will never push for standards (unless you use their products as a standard). The force for adhering to standards should come from the user community.
Using standards
After lunch we had three end-user stories from:
- Eric Delaporte (Renault Group) talked about their NewPDM project and the usage of standards, mainly for exchanges. Two interesting observations: first, Eric talks about New PDM – note the usage of the words New (when does New become regular?) and PDM (not talking about PLM?); secondly, as a user of standards he does not care about the JT/AP242 debate and uses both standards where applicable and performing.
- Sebastien Olivier (France Ministry of Defense) gave a bi-annual update on their PLCS journey, used in two projects, Pencil (a standardized exchange platform and centralized source of logistical information) and MAPS (managing procurement contracts for buying in-service support services), and on the status of their S3000L implementation (the international procedure for Logistic Support Analysis). A presentation for the real in-crowd of this domain.
- Juha Rautjarvi discussed how the efficient use of knowledge for safety and security could be maintained and enhanced through collaboration. Here Juha talked about the Body of Knowledge, which should be available to all stakeholders in the context of security and safety. And like a physical product, this Body of Knowledge goes through a lifecycle, continuously adapting to what arises in the outside world.
My conclusion on this part was that if you are not really into these standards on a day-to-day basis (and I am not), it is hard to pick up the details. Still, the higher-level thought processes behind these standards allow you to see the benefits and impact of using them, which is not the same as selecting a tool. It is a strategic choice.
Modular / Bimodal / not sexy?
Jakob Asell from Modular Management gave an overview of how modularity can connect the worlds of sales, engineering, and manufacturing by adding a modular structure as a governing structure on top of the various structures used by each discipline. This product architecture can be used for product planning and provides end-to-end connectivity of information. Modular Management is assisting companies in moving towards this approach.
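A minimal sketch of what such a governing structure could look like (module and node names are hypothetical): one module identity points to its representation in each discipline-specific structure, giving the end-to-end connectivity Jakob described.

```python
# Governing modular structure: each module maps to its representation
# in the discipline-specific structures it governs.
modules = {
    "M-DRIVE": {
        "sales": "Option: 5 kW drive package",
        "engineering": "EBOM node EB-2200 (drive assembly)",
        "manufacturing": "MBOM node MB-4410 (assembly station 3)",
    },
    "M-FRAME": {
        "sales": "Option: heavy-duty frame",
        "engineering": "EBOM node EB-1100 (welded frame)",
        "manufacturing": "MBOM node MB-3300 (frame weld cell)",
    },
}

def trace(module_id):
    """Follow one module across sales, engineering, and manufacturing."""
    for discipline, node in modules[module_id].items():
        print(f"{module_id} -> {discipline:13s}: {node}")

trace("M-DRIVE")
```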
Next, my presentation, titled The importance of accurate data. Act now!, addressed the switch from classical, linear, document-driven PLM towards a modern, more incremental and data-driven PLM approach. Here I explained the disadvantages of the old evolutionary approach (impossible – too slow / too costly) and an alternative method, inspired by Gartner’s bimodal IT approach (read my blog post: Best Practices or Next Practices). No matter which option you choose, correct, quality data is the oil for the future, so companies should consider the flow of data a health issue for the future.
The day was closed with a large panel, where the panelists first jumped on the topic of bimodal (bipolar?? / multimodal??), talking about Mode 1 (the strategic approach) and Mode 2 (the tactical and fast approach, based on Gartner’s definition). It was clear that the majority of the panel was in Mode 1 mode. This fluently led to a discussion on the usage of standards (and PLM) as not being attractive to the young generations (not sexy), besides the conclusion that it takes time to understand the whole picture, see the valuable benefits a standard can bring, and join this enthusiasm.
Conclusion
I realize that this post is already too long according to blogging guidelines. Therefore I will tell more about day 2 of the conference next week, with Airbus going bimodal and more.
Stay tuned for next week!
In the past weeks, I have discussed two topics at various events that appeared to be different:
- The change from an analogue, document-driven enterprise towards a digital, data-driven enterprise with all its effects. E.g. see From a linear world to fast and circular?
- The upcoming change in generations: the behavior and attitude of the analogue generation(s) versus the behavior of the digital generation(s).
During PDT2015 (a review of the conference here), we discussed the visible trends showing that business is changing exponentially in some industries due to digitalization and ever-cheaper technology. The question not answered during that conference was: how are we going to make this happen in your company?
HOW?
Last week I spoke at a PLM forum in Athens and shared with the audience the opportunities for Greece to catch up and become a digital service economy like Singapore. Here I pictured an idealistic path for how this could happen (based on an ideal world where people think long-term).
A mission impossible, perhaps.
The primary challenge in moving from analogue towards digital is, in my opinion, the difference in behavior between the analogue and digital generations (and I am generalizing, of course).
The analogue generation has been educated that knowledge is power: store all you know in your head or keep it in books close to you. Your job depended on people needing you. Those who migrated to the digital world most of the time continued the same behavior: keep information on your hard disk or in your mailbox. A job was designed for life, and do not plan to share, as your job might come at risk. Continuous education was not part of their work pattern. And it is this generation that is in power in most of the traditional businesses.
The digital generation has been educated (I hope so – not sure for every country) to gather information, digest and process it, and come up with a result. There is no need to store information in your head, as there is already an information overflow; store in your head the methodologies and practices to find and interpret data. The digital generation for sure wants a stable work environment, but they grew up with the mindset that there is no job for life, having seen several crises. It is all about being flexible and keeping your skills up to date.
So we have the dilemma that business is moving from analogue towards digital, where the analogue business represents the linear processes the old generation was used to. Digital business is much more an iterative approach, acting on and adapting to what happens around you. A perfect match for the digital generations.
A dilemma?
Currently the old generation is leading, and they will not easily step aside due to their classical education and behavior. We cannot expect behavior to change just because it is logically explained. In that case, everyone would stop smoking or adopt other healthy habits.
The dilemma reminded me of the Innovator’s Dilemma, the famous theory from Clayton Christensen, which could also apply to analogue and digital businesses. Read more about the Innovator’s Dilemma in one of my older blog posts: The Innovator´s dilemma and PLM. You can replace the incumbent with the old analogue generation, while the disruptive innovation comes from using digital platforms and information understood by the digital generation. If you follow this theory, it would mean old businesses disappear and new businesses pop up and overtake the old companies. An interesting conclusion; however, will there be disruption everywhere?
Recently I saw Peter Sondergaard from Gartner presenting at Gartner Symposium/ITxpo 2015 in Orlando. In his keynote speech, he talked about the value of algorithms, first introducing how companies should move from their traditional analogue business towards digital business in a bimodal approach. Have a read of the press release here.
If you have the chance to view his slick and impressive keynote video (approx. 30 minutes), you will understand it better. A great presentation. In the beginning, Peter talks about the bimodal approach: sustaining old, slowly dying analogue businesses while building teams that develop a digital business approach. The graph below says it all.
Interesting about this approach is that a company can evolve without being disrupted. Still, my main question remains: who will lead this change from the old analogue business towards a modern digital business approach? Will it be the old generation coaching the new generation, or will a natural evolution at the board level be required before this process starts?
HOW?
I have no conclusion this time, as I am curious about your opinion. A shift in business is imminent, but HOW will companies / countries pick up this shift?
Your thoughts or experiences?