
I am writing this post during the Easter weekend in the Netherlands. Easter / Passover / Pascha are religious festivities that happen around this time, depending on full moons, etc. I am not the expert here; however, what I like about Easter is that it is an optimistic religious celebration, connecting history, the “dark days,” and the celebration of new life.

Of course, my PLM-twisted brain never stops associating and looking for analogies. Last week I saw a LinkedIn post from Mark Reisig about Aras ACE 2019, opening with the following statement:

“Digital Transformation – it used to be called PLM,” said Aras CEO Peter Schroer, as he opened the conference with some thoughts around attaining sustainable Digital Transformation and owning the lifecycle.

Was this my Easter Egg surprise? I thought we were in the middle of the PLM Renaissance, as some other vendors and consultants call this era. Have a look at a recent Engineering.com TV-report: Turning PLM on its head

All jokes aside, the speech from Peter Schroer contained some interesting statements, and I want to elaborate on them in this post, as the comment space on LinkedIn is not designed for a long answer.

PLM is Digital Transformation?

In the past few years, there has been a discussion about whether the acronym PLM (Product Lifecycle Management) is perhaps outdated. PTC claimed that, thanks to IoT (Internet of Things), PLM now equals IoT, as you can read in Mark Taber’s 2018 guest article in Digital Engineering: IoT Equals PLM.
Note: Mark is PTC’s vice president of marketing and go-to-market marketing, according to the bio at the bottom of the article. So a lot of marketing words, which strengthens the belief of the old world that everything new is probably marketing.

Also during the PDT conferences, we discussed if PLM should be replaced by a new acronym, and I participated in that discussion too – my Nov 2018 post Will MBSE be the new PLM instead of IoT? is a reflection of my thoughts at that time.

For me, Digital Transformation is a metamorphosis from document-driven, sequential processes towards data-driven, iterative processes. The metamorphosis example used a lot at the moment is the one from Caterpillar to Butterfly. This process is not easy when it comes to PLM-related information, as I described in my PI PLMx 2019 London presentation and blog post: The Challenges of a Connected Ecosystem for PLM. The question even is: will there be a full metamorphosis at the end, or will we keep on working in two different modes of operation?

However, Digital Transformation does not change the PLM domain. Even after a successful digital transformation, there will be PLM. The only significant difference in the future: PLM borders will not be so evident anymore when implementing capabilities in a system or a platform. The rise of digital platforms will dissolve or fade the traditional PLM-mapped capabilities.

You can see these differences already by taking an in-depth look at how Oracle, SAP or Propel address PLM. Each of them starts from a core platform with different PLM-flavored extensions, sometimes very different from the traditional PLM vendors. So Digital Transformation is not the replacement of PLM.

Back to Peter Schroer’s rebuttal of some myths. Note: DX stands for Digital Transformation

Myth #1: DX leverages disruptive tech

Peter Schroer:

 It’s easy to get excited about AI, AR, and the 3D visual experience. However, let’s be real. The first step is to get rid of your spreadsheets and paper documentation – to get an accurate product data baseline. We’re not just talking a digital CAD model, but data that includes access to performance data, as-built parts, and previous maintenance work history for everyone from technicians to product managers

Here I am fully aligned with Peter. There are a lot of fancy features discussed by marketing teams; however, when working in the field with companies, the main challenge is to get an organization digitally aligned, sharing accessible data along the whole lifecycle with the right quality.

This means you need a management team that understands the need for data governance and data quality, and understands the shift from data ownership to data accountability. This will only happen with the right mix of vision, strategy and execution of that strategy – marketing does not make it happen.

 

Myth #2: DX results in increased market share, revenue, and profit

Peter Schroer:

Though there’s a lot of talk about it – there isn’t yet any compelling data which proves this to be true. Our goal at Aras is to make our products safer and faster. To support a whole suite of industrial applications to extend your DX strategy quite a bit further.

Here I agree and disagree, depending on the context of this statement. Some companies have gone through a digital transformation and therefore increased their market share, revenue, and profit. If you read books like Leading Transformation or Leading Digital, you will find examples of companies that have gone through successful digital transformations. However, you might also discover that most of these companies haven’t transformed their PLM-domain, but other parts of their businesses.

Also, it is interesting to read a 2017 McKinsey post: The case for digital reinvention, where you will get the confirmation that a lot of digital initiatives did not bring more top-line revenue and most of the time led to extra costs. It is interesting to see where companies focus their digital strategies – picture below:

While only 2 percent of the respondents were focusing on supply chains, this is, according to the authors of the article, one of the areas with the highest potential ROI. And digital supply chains are closely related to modern PLM – so this is an area with enough work to do for all PLM practitioners: connecting ecosystems (in real-time).

Myth #3: Market leaders are the most successful at DX

Peter Schroer:

If your company is hugely profitable at the moment, it’s highly likely that your organization is NOT focused on Digital Transformation. The lifespan of S&P 500 companies continues to shrink below 20 years.

How to Attain Sustainable Digital Transformation

– Stop buying disposable systems. It’s about an adaptable platform – it needs to change as your company changes.

– Think incremental. Do not lose momentum. Continuous change is a multi-phase journey. If you are in or completed phase I, then that means there is a phase II, a phase III, and so on.

– Align people & processes.  Mistakes will happen, “the tech side is only 50% of DX” – Aras CEO.

Here I agree with Peter on the business side, be it that some of the current market leaders are already digital – look at Apple, Google, and Amazon. However, the majority of large enterprises have severe problems with various aspects of a digital transformation, as they started in the past, before digital technologies became affordable.

Digitization allows information to flow without barriers within an organization, leading to rapid insights and almost direct communication with your customers, your supply chain or other divisions within your company. This drives the need to learn and build new, lean processes and get people aligned to them – learning to work in a different mode.

And this is extremely difficult for a market leader – as a market leader, the fear of the changing outside world is often not felt. Between the C-level vision and the people working in the company, there are several layers of middle management. These layers were created to structure and stabilize the old ways of working.

I wrote about the middle management challenge in my last blog post: The Middle Management dilemma. Almost in the same week, there was an article from McKinsey: How companies can help midlevel managers navigate agile transformations.
Conclusion: It is not (only) about technology, as some of the tech geeks may think.

Conclusion

Behind the myths addressed by Peter Schroer, there is a complex transformation ongoing. Probably not a metamorphosis. With the Easter spirit in mind, connected to PLM, I believe digital transformations are possible – not as a miracle, but driven by insights into all aspects. I hope this post gave you some more ideas, and please read the connected articles – they are quite relevant if you want to discover what’s below the surface.

Image: waitbutwhy.com

Two weeks ago I wrote about the simplification discussion around PLM – Why PLM will never be simple. There I focused on the fact that even sharing information in a consistent, future-proof way of working is already challenging, despite easy-to-use communication tools like email or social communities.

I mentioned that sharing PLM data is even more challenging due to its potential revision, version, status, and context. This brings us to the topic of configuration management, needed to manage the consistency of information – a challenge with increasingly sophisticated products and systems. Simple tools will never fix this complexity.

To manage the consistency of a product, configuration management (CM) is required. Two weeks ago I read the following interesting post from CMstat: A Brief History of Configuration Management Software.

An excellent introduction if you want to know more about the roots of CM, be it that towards the end the post starts to spell out all the disadvantages and reasons why you should not think about CM using PLM systems.

The following part amused me:

 The Reality of Enterprise PLM

It is no secret that PLM solutions were often sold based in good part on their promise to provide full-lifecycle change control and systems-level configuration management across all functions of the enterprise for the OEM as well as their supply and service chain partners. The appeal of this sales stick was financial; the cost and liability to the corporation from product failures or disasters due to a lack of effective change control was already a chief concern of the executive suite. The sales carrot was the imaginary ROI projected once full-lifecycle, system-level configuration control was in effect for the OEM and supply chain.

Less widely known is that for many PLM deployments, millions of budget dollars and months of calendar time were exhausted before reaching the point in the deployment road map where CM could be implemented. It was not uncommon that before the CM stage gate was reached in the schedule, customer requirements, budget allocations, management priorities, or executive sponsors would change. Or if not these disruptions within the customer’s organization, then the PLM solution provider, their software products or system integrators had been changed, acquired, merged, replaced, or obsoleted. Worse yet for users who just had a job to do was when solutions were “reimagined” halfway through a deployment with the promise (or threat) of “transforming” their workflow processes.

Many project managers were silently thankful for all this as it avoided anyone being blamed for enterprise PLM deployment failures that were over budget, over schedule, overweight, and woefully underwhelming. Regrettably, users once again had to settle for basic change control instead of comprehensive configuration management.

I believe the CMstat writer is generalizing too much and preaching for their own parish. Although my focus lies on PLM, I have also learned the importance of CM, and for that reason I will share a view on CM from the PLM side:

Configuration Management is not a target for every company

The origins of Configuration Management come from the Aerospace and Defense (A&D) industries. These industries have high quality, reliability and traceability constraints. In simple words, you need to prove your product works correctly, as specified, in all described circumstances, and keep this consistent along the lifecycle of the product.

Moreover, imagine you delivered the perfect product; next, implementing changes requires a full understanding of the impact of the change. What is the impact of the change on behavior or performance? In A&D, the question is: is it still safe and reliable?

Somehow PLM and CM are enemies. The main reason why PLM-systems are used is Time to Market – bringing a product as fast as possible to the market with acceptable quality. Being first is sometimes more important than high quality. CM is considered a process that slows down Time to Market, as managing consistency and continuously validating take time and effort.

Configuration Management in Aviation is crucial, as everyone understands that you cannot afford to discover a severe problem during a flight. All the required verification and validation efforts make CM a costly process along the product lifecycle. Airplane parts are 2 – 3 times more expensive than potentially the same parts used in other industries. The main reason: airplane parts are tested and validated for all expected conditions along their lifecycle. Other industries do not spend so much time on validation. They validate only where issues can hurt the company, either for liability or for costs.

Time to Market impacts even the aviation industry, as we can see from the commercial aircraft battle(s) between Boeing and Airbus. Who delivers the best airplane (size/performance) at the right moment in the global economy? The Airbus A380 seems to have missed its future targets – too big, not flexible enough. The Boeing 737 MAX appears to target a market sweet spot (fuel economy); however, the recent tragic accidents with this plane seem to have been caused by Time to Market pressure to certify the aircraft too early. Or is the complexity of a modern airplane unmanageable?

CM based on PLM-systems

Most companies had their configuration management practices long before they started to implement PLM. These practices were most of the time documented in procedures, leading to all kinds of coding systems for these documents. Drawing numbers (the specification of a part/product), specifications, and parts lists all had a meaningful identifier combined with a version/revision and status. For example, the Philips 12NC coding system is famous in the Netherlands and is still used among spin-offs of Philips and their suppliers, as it offers a consistent framework to manage configurations.
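As a side note for readers who never worked with such coding systems, here is a minimal sketch (in Python, with an invented number format – the real Philips 12NC rules are more elaborate) of how such a meaningful identifier travels together with its revision and status:

from dataclasses import dataclass

@dataclass(frozen=True)
class DocumentId:
    number: str    # meaningful code, e.g. a 12-digit 12NC-style number
    revision: str  # e.g. "A", "B", "C"
    status: str    # e.g. "In Work", "Released", "Obsolete"

    def __str__(self) -> str:
        return f"{self.number} rev. {self.revision} [{self.status}]"

# The same drawing number in two lifecycle states - the identifier stays,
# while revision and status capture the configuration:
draft = DocumentId("532212345678", "A", "In Work")
released = DocumentId("532212345678", "B", "Released")
print(draft)
print(released)

The point is not the syntax, but that identifier, revision and status belong together – exactly what the paper-based procedures enforced, and what companies expect their PLM-system to enforce too.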

Storing these documents in a PDM/PLM-system to provide centralized access was not a big problem; however, companies also expected the PLM-system to provide automation and functionality supporting their configuration management procedures.

This was a challenge for many implementers, for several reasons:

  • PLM-systems do not offer a standard way of working – if they did, they could only serve a small niche market – so they need to be “configured/customized.”
  • Company configuration management rules sometimes cannot be mapped to the provided PLM data-model and its internal business logic. This has led to costly customizations where, in the best case, implementer and company agreed somewhere in the middle. Worst case, as the writer of the CM blog mentions, it becomes an expensive, painful project.
  • Companies do not have a consistent configuration management framework, as Time to Market is leading – “we will fix CM later” is the idea, and they let their PLM-implementer configure the PLM-system as well as possible. Still, at the management level, the value of CM is not recognized.
    (see also: PLM-CM-ALM – not sexy?)

In companies that I worked with, those who were interested in a standardized configuration management approach were trained in CMII. CMII (or CM2) is a framework supported by most PLM-systems, sometimes even as a pre-configured template to speed up the implementation. Still, as PLM-systems serve multiple industries, I would not expect any generic PLM-vendor to offer Commercial Off-The-Shelf (COTS) CM-capabilities – there are too many legacy approaches. You can find a good and more in-depth article related to CMII here: Towards Integrated Configuration Change Management (CMII) from Lionel Grealou.

 

What’s next?

Current configuration management practices are very much based on the concepts of managing documents. However, products are more and more described in a data-driven, model-based approach. You can find all the reasons why we are moving to a model-based approach in my last year’s blog post. Important to realize is that current CM practices in PLM were designed with mechanical products and lifecycles as a base. With the combination of hardware and software, integrated and with different lifecycles, CM has to be reconsidered as a new holistic concept. The Institute of Process Excellence provides CM2 training but is also active in developing concepts for the digital enterprise.

Martijn Dullaart, Lead Architect Configuration Management @ ASML & Chair @ IPE/CM2 Global Congress, has published several posts related to CM and a model-based approach – you can find them via his LinkedIn profile. As you can read from his articles, organizations are trying to find a new consistent approach.

Perhaps CM as a service to a Product Innovation Platform, as the CMstat blog post suggests? (quote from the post below)

In Part 2 of this CMsights series on the future of CM software we will examine the emerging strategy of “Platform PLM” where functional services like CM are delivered via an open, federated architecture comprised of rapidly-deployable industry-configured applications.

I am looking forward to Part 2 of CMsights – an approach that makes sense to me, as system boundaries will disappear in a digital enterprise. It will be more critical in the future to create consistent data flows in the right context and based on data with the right quality.

Conclusion

Simple tools and complexity need to be addressed in the right order. Aligning people and processes efficiently to support a profitable enterprise remains the primary challenge for every enterprise. Complex products, more dependent on software than hardware, require new ways of working to stay competitive. Digitization can help to implement these new ways of working. Experienced PLM/CM experts know the document-driven past. Now it is time for a new generation of PLM and CM experts to start from a digital concept and build consistent and workable frameworks. Then the simple tools can follow.

 

In this post, I will explain the story behind my presentation at PI PLMx London. You can read my review of the event here: “The weekend after ……” and you can find my slides on SlideShare: HERE.

For me, this presentation is the conclusion of a thought process and a collection of experiences built up over the past three to five years, related to the challenges digital transformation is creating for PLM and what makes it hard to go through compared to other enterprise business domains. So here we go:

Digital transformation or disruption?

Slides 2 (top image) until 5 deal with the common challenges of business transformation. In nature, the transformation from a Caterpillar (old linear business) to a Butterfly (modern, agile, flexible) has the cocoon stage, where the transformation happens. In business, unfortunately, companies cannot afford a cocoon phase; it needs to be a parallel change.

Human beings are not good at change (slides 3 & 4), and the risk is that a new technology or a new business model will disrupt your business if you are too confident – see examples from the past. The disruption theory introduced by Clayton Christensen in his book The Innovator’s Dilemma is an excellent example of how this can happen. Some of my thoughts are in The Innovator’s dilemma and generation change (2015).

Although I know some PLM vendors consider themselves disruptors, I give them no chance in the PLM domain. The main reason: the existing PLM systems are so closely tied to the data they manage that switching from one PLM system to a more modern PLM system does not pay off. The data models are so diverse that it is better to stay with the existing environment.

What is clear for modern digital businesses is that if you can start from scratch, or with almost no legacy, you can move forward faster than the rest. But only if supported by strong leadership, an (understandable) vision and relentless execution.

The impression of evolution

Marc Halpern’s slide presented at PDT 2015 is one of my favorite slides, as it maps business maturity to various characteristics of an organization, including the technologies used.

 

Slides 7 till 18 zoom in on the terms Coordinated and Connected and the implications they have for data, people and business. I have written about Coordinated and Connected recently: Coordinated or Connected (2018).

A coordinated approach – delivering the right information at the right moment in the proper context – is what current PLM implementations try to achieve, allowing people to use their own tools/systems as long as they deliver their information (documents/files) at the right moment as part of the lifecycle/delivery process. Very linear and not too complicated to implement, you would expect. However, it is difficult! Here we already see the challenge of just aligning a company to implement a horizontal flow of data. Usability of the PLM backbone and optimized silo thinking are the main inhibitors.

In a connected approach – providing actual information for anyone connected, in any context – the slide on the left shows the mental picture we need to have for a digital enterprise. Information coming from various platforms needs to be shareable and connected in real-time, leading, in particular for PLM, to a switch from document-based deliverables to models and parameters that are connected.

Slide 15 has examples of some models. A data-driven approach creates different responsibilities, as it is not about ownership anymore but about accountability.

The image above gives my PLM-twisted vision of the five core platforms for an enterprise. The number FIVE is interesting, as David Sherburne just published his Five Platforms that Enable Digital Transformation, and in 2016 Gartner identified Five domains for the digital platform – more IT-twisted? But remember, the purpose of digital transformation is: FIVE!

From Coordinated to Connected is Digital Transformation

Slides 19 till 27 further elaborate on the fact that for PLM there is no evolutionary approach possible, going from a Coordinated technology towards a Connected technology.

For three reasons: different types of data (documents vs. database elements), different people (working in a connected environment requires modern digital skills) and different processes (the standard methods for mechanical-oriented PLM practices do not match the processes needed to deliver systems (hardware & software) with an incremental delivery process).

Due to the incompatibility of the data, more and more companies discover that a single PLM-instance cannot support both modes – staying with your existing document-oriented PLM-system does not give you the capabilities needed for a model-driven approach. Migrating the data from a traditional PLM-environment towards a modern data-driven environment does not bring any value: the majority of the coordinated data is not complete or of the right quality for a data-driven environment. Note: in a data-driven environment you do not have people interpreting the data – the data must be correct for automation/algorithms.
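To make this note concrete, here is a minimal sketch (in Python, with hypothetical attribute names) of the kind of check an algorithm needs before consuming data – checks a human reader performs implicitly when interpreting a document:

def issues_for_automation(item: dict) -> list:
    """Return data-quality issues; only an empty list means the record
    can be consumed by algorithms without human interpretation."""
    issues = []
    for field in ("item_id", "revision", "status", "mass_kg"):
        if item.get(field) in (None, ""):
            issues.append(f"missing value for '{field}'")
    if isinstance(item.get("mass_kg"), str):
        issues.append("mass_kg is free text, not machine-readable")
    return issues

# A record migrated from a document-driven environment typically fails:
legacy = {"item_id": "PN-1001", "revision": "B", "status": "", "mass_kg": "approx. 2 kg"}
print(issues_for_automation(legacy))

Most coordinated legacy data fails such checks, which is why migration alone does not bring value.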

The overlay approach, mentioned several times in various PLM-blogs, is an intermediate solution. It provides traceability and visibility between different data sources (PLM, ALM, ERP, SCM, …). However, it does not make the information in these systems more accessible.

So the ultimate conclusion is: you need both approaches, and you need to learn to work in a hybrid environment!

What can various stakeholders do?

For the management of your company, it is crucial they understand the full impact of digital transformation. It is not about a sexy customer website, a service platform or a Virtual Reality/Augmented Reality case for the shop floor or services. When these capabilities are created disconnected from the source (PLM), they will deliver inconsistencies in the long term. The new digital baby becomes another silo in the organization. Real digital transformation comes from an end-to-end vision and implementation. The result of this end-to-end vision will be the understanding that there is a duality in data, in particular for the PLM domain.

Besides the technicalities, when going through a digital transformation, it is crucial for the management to share their vision in a way that it becomes a motivational story, a myth, for all employees. As Yuval Harari, writer of the book Sapiens, suggested, we (Homo Sapiens) need an abstract story, a myth, to align a larger group of people to achieve a common abstract goal. I discussed this topic in my posts: PLM as a myth? (2017) and PLM – measurable or a myth?

Finally, the beauty of new digital businesses is that they are connected and can be monitored in real-time. That implies you can check the results continuously and adjust – scale or fail!

Consultants and strategists in a company should also take responsibility to educate the management, and when advising on less transformational steps, like efficiency improvements: make sure you learn and understand model-based approaches and push for data governance initiatives. This will at least narrow the gap between coordinated and connected environments.

This was about strategy – now about execution:

For PLM vendors and implementers, understanding the incompatibility of data between current PLM practices – coordinated and connected – will lead to different business models. Where traditionally a new PLM vendor started with a rip-and-replace of the earlier environment – no added value – now it is about starting a new parallel environment. This implies no more big replacement deals, but a long-term, strategic and parallel journey. For PLM vendors, being able to offer these two modes in parallel is crucial to keep their customer base and grow. If they choose coordinated or connected only, for sure a competitor will work in parallel.

For PLM users: an organization should understand that its people are the most valuable resources, realizing these people cannot make a drastic change in their behavior. People will adapt within their capabilities, but do not expect a person who grew up in the traditional ways of working (linear/analogue) to become a successful worker in the new mode (agile/digital). Their value lies in transferring their skills and coaching new employees, but do not let them work in two modes. And when it comes to education: permanent education is crucial and should be scheduled – it is not about one or two trainings per year. If the perfect training existed, why would students go to school for several years? Why not give them the perfect PowerPoint twice a year?

Conclusions

I believe after three years of blogging about this theme I have made my point. Let’s observe and learn from what is happening in the field – I remain curious about and focused on proof points and new insights. This year I hope to share with you new ideas related to digital practices in all industries, of course all associated with the human side of what we once started to call PLM.

Note: Oleg Shilovitsky just published an interesting post this weekend: Why complexity is killing PLM and what are future trajectories and opportunities? Enough food for discussion. One point: the fact that consumers want simplicity does not mean PLM will become simple – working in the context of other information is the challenge – it is human behavior – team players are good at anticipating, big egos are not. To be continued…….
I was happy to take part in the PI PLMx London event last week. It was here, in the same hotel, that this conference first saw the light in 2011 – you can see my blog post from that event here: PLM and Innovation @ PLMINNOVATION 2011.

At that time it was the first vendor-independent PLM conference in a long time, and it brought a lot of new people together to discuss their experience with PLM. Looking at the audience at that time, many of the companies that were there came back over the years, confirming the value this conference has brought to their PLM journey.

Similar to the PDT conference(s) – this year’s edition was just announced last week, here – the number of participants is diminishing.

Main hypotheses:

  1. The PLM-definition has become too vague. Going to a PLM conference does not guarantee you will find the type of PLM discussions you expect?
  2. The average person is now much better informed about PLM thanks to the internet and social media (blogs/webinars/etc.). Therefore, the value retrieved from a PLM conference is not big enough anymore?
  3. Digital Transformation is absorbing all the budget and attention downstream in the organization, no longer bringing the need for and awareness of modern PLM to the attention of the management. E.g., a digital twin is sexier to discuss than PLM?

What do you think about the above three hypotheses – 1,2 and/or 3?

Back to the conference. The discussion related to PLM has changed over the past nine years. As I presented at PI from the beginning in 2011, here are the nine titles from my sessions:

2011       PLM – The missing link
2012       Making the case for PLM
2013       PLM loves Innovation
2014       PLM is changing
2015       The challenge of PLM upgrades
2016       The PLM identity crisis
2017       Digital Transformation affects PLM
2018       PLM transformation alongside Digitization
2019       The challenges of a connected Ecosystem for PLM

Where the focus started with justifying PLM, as well as a supporting infrastructure, to bring Innovation to the market, the first changes became visible in 2014. PLM was changing as more data-driven vendors appeared with new and modern (metadata) concepts and cloud, creating the discussion about what would be the next upgrade challenge.

The identity crisis reflected the introduction of software development / management combined with traditional (mechanical) PLM – how to deal with systems? Where are the best practices?

Then, from 2017 until now, Digital Transformation and its impact on PLM and the organization became the themes to discuss – and we are not ready yet!

Now some of the highlights from the conference. As there were parallel sessions, I had to divide my attention – you can see the full agenda here:

How to Build Critical Architecture Models for the New Digital Economy

The conference started with a refreshing presentation from David Sherburne (Carestream), explaining their journey towards a digital economy. According to David, the main reason behind digitization is to save time, as he quoted Harvey Mackay, an American businessman and journalist:

Time is free, but it is priceless. You cannot own it, but you can use it. You can’t keep it, but you can spend it. Once you have lost it, you never can get it back

I tend to agree with this simplification, as it makes the story easy to explain to everyone in your company. I would probably add to that story that saving time also means less money spent on intermediate resources in a company, therefore creating a two-sided competitive advantage.

David stated that today’s digital transformation is more about business change than technology and here I wholeheartedly agree. Once you can master the flow of data in your company, you can change and adapt your company’s business processes to be better connected to the customer and therefore deliver the value they expect (increases your competitive advantage).

Having new technology in place does not help you unless you change the way you work.

David introduced a new acronym ILM (Integrated Lifecycle Management) and I am sure some people will jump on this acronym.

David’s presentation contained an interesting view from the business-architectural point of view. An excellent start for the conference where various dimensions of digital transformation and PLM were explored.

Integrated PLM in the Chemical industry

Another interesting session was from Susanna Mäentausta (Kemira Oy), with the title: “Increased speed to market, decreased risk of non-compliance through integrated PLM in Chemical industry.” I selected her session because, from my past involvement with the process industry, I noticed that PLM adoption there is very low. Understanding why and how they implemented PLM was interesting for me. Her PLM vision slide says it all:

There were two points that I liked a lot from her presentation, as I can confirm they are crucial.

  • Although there was a justification for the implementation of PLM, there was no ROI calculation done upfront. I think this is crucial: you know as a company that you need to invest in PLM to stay competitive. Making an ROI-story is just consoling the people with artificial numbers – success and numbers depend on the implementation, and Susanna confirmed that step 1 delivered enough value to be confident.
  • There was end-to-end governance and a communication plan in place. Compared to PLM projects I know, this was done very extensively – full engagement of key users and ongoing feedback – communicate, communicate, communicate. How often do we forget this in PLM projects?

Extracting More Value of PLM in an Engineer-to-Order Business

Sami Grönstrand & Helena Gutierrez presented, as an experienced duo (they were active at PI PLMx Hamburg/Berlin before), their current status and mission for PLM @ Outotec. As the title suggests, it was about how to extract more value from PLM in an Engineer-to-Order business.

What I liked is how they simplified their PLM targets from a complex landscape into three story-lines.

If you jump into all the details where PLM is contributing to your business, it might get too complicated for the audience involved. Therefore, they aligned their work around three value messages:

  • Boosting sales, by focusing on modularization and encouraging the use of a product configurator, instead of developing a customer-specific solution every time.
  • Accelerating project deliverables, again reaping the benefits of modularization, creating libraries and training the workforce in using this new environment (otherwise no use of new capabilities). The reduction in engineering hours was quite significant.
  • Creating New Business Models, by connecting all data using a joint plant structure with related equipment. By linking these data elements, an end-to-end digital continuity was established to support advanced service and support business models.

My conclusion from this session was again that if you want to motivate people on a PLM-journey, it is not about the technical details; it is about the business benefits that drive these new ways of working.

Managing Product Variation in a Configure-To-Order Business

In the context of the previous session from Outotec, Björn Wilhelmsson’s session addressed somewhat the same topic: how to create as much variation as possible in your customer offering, while internally keeping the number of variants and parts manageable.

Björn, Alfa Laval’s OnePLM Programme Director, explained in detail the strategy they implemented to address these challenges. His presentation was very educational and could serve as a lesson for many of us related to product portfolio management and modularization.

Björn explained in detail the six measures to control variation, starting from a model-strategy/roadmap (thinking first), followed by building a modularized product architecture, and controlling and limiting the number of variants during the New Product Development process. Next, as Alfa Laval is in a Configure-To-Order business, Björn explained the implementation of order-based and automated addition of pre-approved variants (not every variant needs to exist in detail before selling it), followed by the controlled introduction of additional variants and continuous analysis of quoted and sold variants (the power of a digital portfolio), as his summary slide shows below:
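As an aside, the principle of order-based addition of pre-approved variants can be illustrated with a minimal sketch (in Python, with invented options and rules – not Alfa Laval’s actual implementation):

# Combinations engineering has pre-approved, without detailing each variant:
APPROVED = {
    ("steel", "100l", "high-temp"),
    ("steel", "200l", "standard"),
    ("titanium", "100l", "standard"),
}
variants = {}  # variants detailed in PLM so far

def order_variant(material, size, duty):
    combo = (material, size, duty)
    if combo not in APPROVED:
        raise ValueError(f"{combo} is not pre-approved - needs engineering review")
    if combo not in variants:
        # The detailed variant is created only now, at order time
        variants[combo] = f"VARIANT-{len(variants) + 1:04d}"
    return variants[combo]

print(order_variant("steel", "100l", "high-temp"))  # created on first order

This way the portfolio can offer many combinations, while only the variants actually sold ever exist in detail.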

Day 1 closed with an inspirational keynote: Lessons Learnt from the Mountaineering Experience – 8848 Meters Above Sea Level – a mission to climb the highest mountain on each of the continents in 107 days and 9 hours, setting a new world record, by Jonathan Gupta.

There are some analogies to discover between his mission and a PLM implementation. It is all about having the total picture in mind. Plan and plan, prepare step-by-step in detail and rely on teamwork – it is not a solo journey – and it is about reaching a top (deliverable phase) in the most efficient way.

The differences: PLM does not need world records; you need to go at the pace an organization can digest and understand. Although the initial PLM climate during implementation might be chilling too, I do not believe you have to suffer temperatures below minus 50 degrees Celsius.

During the morning, I was involved in several meetings and therefore unfortunately unable to see some of the interesting sessions at that time. Hopefully they will later be available on PI.TV for review, as slides alone do not tell the full story – although there are experts who can conclude and comment after seeing a single slide. You can read about it in my blog buddy Oleg Shilovitsky’s post: PLM Buzzword Detox. I think oversimplification is exactly what creates the current problem we have in this world – people without knowledge become louder and surer about their opinion than knowledgeable people who have spent time understanding the matter.

Have a look at the Dunning-Kruger effect here (if you take the time to understand).

 

PLM: Enabling the Future of a Smart and Connected Ecosystem

Peter Bilello from CIMdata shared his observations and guidance related to the ongoing digital business revolution that is taking place thanks to internet and IoT technologies. It will fundamentally transform how people work and interact with each other and with machines. Survival in business will depend on how companies create Smart and Connected Ecosystems. Peter showed a slide from the 2015 World Economic Forum (below), which is still relevant:

Probably, depending on your business, some of these waves might have touched your organization already. What is clear is that the market leaders here will benefit the most – the ones owning a smart and connected ecosystem will be the winners shortly.

Next, Peter explained why PLM, and in particular the Product Innovation Platform, is crucial for a smart and connected enterprise. Shiny capabilities like a digital twin, the link between virtual and real, or virtual & augmented reality can only be achieved affordably and competitively if you invest in making the source digitally connected. The scope of a product innovation platform is much broader than traditional PLM. Also, the way information is stored differs – moving from documents (files) towards data (elements in a database). I fully agree with Peter’s opinion here that PLM is conceptually the Killer App for a Smart & Connected Ecosystem, and this notion is spreading.

A recent article from Forbes in the category Leadership: Is Your Company Ready For Digital Product Life Cycle Management? shows there is awareness. Still very basic, and people are still confused about the difference between an electronic file (digital too?) and a digital definition of information.

The main point to remember here: Digital information can be accessed directly through a programming interface (API/Service) without the need to open a container (document) and search for this piece of information.
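As a minimal sketch of what such direct access means in practice (in Python, against a hypothetical endpoint – not a specific vendor’s API):

import requests  # third-party HTTP library

BASE_URL = "https://plm.example.com/api/v1"  # hypothetical platform service

def get_item_attribute(item_id, attribute):
    """Fetch one piece of product data directly via a service call,
    instead of downloading a document and searching through it."""
    response = requests.get(f"{BASE_URL}/items/{item_id}/attributes/{attribute}", timeout=10)
    response.raise_for_status()
    return response.json()["value"]

# e.g. read just the release status of one part:
# print(get_item_attribute("PN-1001", "release_status"))

One attribute, one call – no container to open, no human needed to interpret the content.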

Peter then zoomed in on some topics that companies need to investigate to reach a smart & connected ecosystem: security (still a question hardly addressed in IoT/Digital Twin demos) and standards and interoperability (you cannot connect all proprietary formats economically and sustainably). A lot of points to consider, and I want to close with Peter’s slide illustrating where most companies are in reality:

The Challenges of a Connected Ecosystem for PLM

I was happy to present after Peter Bilello and David Sherburne (on day 1), as they both gave a perspective on digital transformation complementary to mine. My presentation focused on the incompatibility of current coordinated business systems and the concept of a connected ecosystem.

You can already download my slides from SlideShare here: The Challenges of a Connected Ecosystem for PLM. I will explain my presentation in an upcoming blog post, as slides without a story might lead to the wrong interpretation, and we have already reached 2000 words. A few more words to come.

How to Run a PLM Project Using the Agile Manifesto

Andrew Lodge, Head of Engineering Systems at JCB, explained how applying the agile mindset to a PLM project can lead to the faster and more accurate results needed by the business. I am a full supporter of this approach, as having worked on long, waterfall-type PLM implementations, there was always the big crash and user dissatisfaction at the final delivery. Keeping the business involved at every step seems to be the solution. The issue I discovered here is that an agile implementation requires a lot of people, in particular from the business, to be heavily involved. Some companies do not understand this need and drop/reduce the business contribution to a minimum, killing the value of an agile approach.

 

Concluding

For me, coming back to London for the PI PLMx event was very motivational. Where the past two or three conferences in Germany might have led to little progress per year, this year, thanks to new attendees and inspiration, it became for me a vivid event, hopefully growing again soon. Networking and listening to your peers in business remains crucial to digest it all.

 

A week ago I attended the joint CIMdata Roadmap and PDT Europe conference in Stuttgart, as you can recall from last week’s post: The weekend after CIMdata Roadmap / PDT Europe 2018. As there was so much information to share, I had to split the report into two posts. This time the focus is on PDT Europe. In general, the PDT conferences have always focused on sharing experiences and developments related to standards – a topic you will not see at PLM vendor conferences. Therefore, it is your chance to learn and take part if you believe in standards.

This year’s theme: Collaboration in the Engineering and Manufacturing Supply Chain – the Extended Digital Thread and Smart Manufacturing. Industry 4.0 plays a significant role here.

 

Model-based X: What is it and what is the status?

I have seen Peter Bilello presenting this topic now several times, and every time there is a little more progress. The fact that there is still an acronym war illustrates that the various aspects of a model-based approach are not yet defined. Some critics will state that’s because we do not need model-based and it is only a vendor marketing trick again. Two comments here:

  • If you want to implement an end-to-end model-based approach including your customers and supply chain, you cannot avoid standards. More will become clear when you read the rest of this post. Vendors will not promote standards, as standards reduce their capabilities to deliver unique solutions. So standards must come from the market, not from the marketing.
  • In 2007 Carl Bass, at that time CEO of Autodesk, made his statement: “There are only three customers in the world that have a PLM problem: Dassault, PTC, and Siemens. There are no other companies that say I have a PLM problem.” Have a look here. PLM is understood by now, even by Autodesk. The statement illustrates that in the beginning the PLM target was not clear, and people thought PLM was a system instead of a strategic approach. Model-based ways of working have to go through the same learning path – hopefully, faster.

Peter’s presentation was a good walk-through, pointing out what exists, where we focus and that there is still work to be done – not by vendors but by companies. Therefore I wholeheartedly agree with Peter’s closing remarks: no time to sit back and watch if you want to benefit from model-based approaches.

Smart Manufacturing

Kenny Swope, known from his presentations related to Boeing, now spoke to us as the Chair of the ISO/TC 184/SC 4 workgroup related to Industrial Data. To say it in decoded mode: Kenny is heading Sub-committee 4, with a focus on Industrial Data. SC4 is part of a more prominent theme, Automation Systems and Integration, covered by TC 184 – all as part of the ISO framework. The scope:

Standardization of the content, meaning, structure, representation and quality management of the information required to define an engineered product and its characteristics at any required level of detail at any part of its lifecycle from conception through disposal, together with the interfaces required to deliver and collect the information necessary to support any business or technical process or service related to that engineered product during its lifecycle.

Perhaps boring to read if you think about all the demos you have seen at trade shows related to Smart Manufacturing. However, if you want these demos to become true in a vendor-independent environment, you will need to agree on a common framework of definitions to ensure future continuity beyond the demo. And here lies the business excitement: the real competitive advantage companies can have implementing Smart Manufacturing in a scalable, future-oriented way.

One of the often-heard statements is that standards are too slow or incomplete. Incomplete is not a problem; when there is a need, the standard will follow. Compare it with language: we will always invent new words for new concepts.

Being slow might have been the case in the past. Kenny showed the relatively fast convergence of country-specific Smart Manufacturing standards into a joint ISO/IEC framework – all within three years. ISO and IEC have already been teaming up to build Smart Manufacturing reference models.

This is already a considerable effort, as the local reference models need to be studied and mapped to a common architecture. The target is to have a first Technical Specification for a joint standard by the end of 2020 – quite fast!

Meinolf Gröpper from the German VDMA presented what they are doing to support Smart Manufacturing / Industrie 4.0. The VDMA is a well-known engineering federation with 3200 member companies, 85 % of which are Small and Medium Enterprises – the power of the German economy.

The VDMA provides networking capabilities and readiness assessments for members, being the enabler for companies to transform. As Meinolf stated, Industrie 4.0 is not about technology; it is about cross-border services and international cooperation – a strategy that every company has to develop and, if possible, implement at its own pace. Standards will accelerate the implementation of Industrie 4.0.

The Smart Manufacturing session was concluded by Gunilla Sivard, Professor at KTH in Stockholm, and Hampus Wranér, Consultant at Eurostep. They presented the work done on the DIgln project, targeting an infrastructure for Smart Manufacturing.

The presentation showed the implementation of the testbed using twittering bus communication and the ISO 10303-239 PLCS information standard as the persistent layer. The results were promising to further build capabilities on top of the infrastructure below:

The conclusion from the Smart Manufacturing session was that emerging and available standards can accelerate the deployment.

 

Enabling digital continuity in the Factory of the Future

Alcibiades Gonzalez-Noval from Airbus shared the challenges and the strategy for Airbus’s factory of the future, based on digital continuity from the virtual world towards the physical world, connecting with PLM, ERP, and MOM – concepts many companies are currently working on, at various maturity stages.

I agree with his lessons learned. We cannot think in silos anymore in a digital future – everything is connected. And please forget the PoC; to gain time, start piloting and fail or succeed fast. Companies have lost years by just doing PoCs and not going into action. The last point, network segregation, is for sure an issue relevant to plant operations. I experienced this also in the past when promoting PLM concepts for (nuclear) owners/operators of plants. Network security is for sure an issue to resolve.

 

Cross-Discipline Lifecycle Collaboration Forum
Setting up the digital thread across engineering and the value chain.

Peter Gerber, Chairman of the CDLC Forum and Data Exchange & Integration Leader at Schaeffler, and Pierre Bodin, Senior Manager at Mews Partners, presented their findings related to the challenge of managing complex products (mechanical, electrical, software – using systems engineering methodology) properly, at affordable cost, in a real-time mode, multidisciplinary and coordinated across the whole value chain. Something you might expect could be done when reviewing all PLM vendors’ marketing materials; something you might expect to be hard to do when remembering Martin Eigner’s statement that 95 % of the companies have not solved mechatronics collaboration yet. (See: The weekend after CIMdata PLM Roadmap and PDT Europe)

A demonstrator was defined, and various vendors participated in building it based on their out-of-the-box capabilities. The result showed that for all participants there were still gaps to resolve for full collaboration. A new version of the demonstrator is now planned for the middle of next year – curious to learn the results at that time. Multi-disciplinary collaboration is a (conceptual) pillar for future digital business – it needs to be possible.

 

A Digital Thread based on the PLCS standard.

Nigel Shaw, Eurostep’s managing director in the UK, took us through the evolution of PLCS (Product Life Cycle Support), an extension of the ISO 10303 STEP standard (STEP: Standard for Exchange of Product data). Nigel mentioned how, over all these years, millions (and a lot of brain power) have been invested in PLCS to bring it to where it is now.

PLCS has been extremely useful as an interface standard for contracting, providing product data in a neutral way. As an example, last year the Swedish defense organization (FMV) and France’s DGA made PLCS DEXs part of their contractual conditions. It would be too costly to have all product data for all defense systems in proprietary vendor formats, and this over the product lifecycle.

Those following the standards in the process industry will rely on ISO 15926 / CFIHOS, as this standard’s dictionary and data model are more geared to process data – and in particular the exchange of data from the various contractors with the owner/operator.

Coming back to PLCS and the Digital Twin – it is all about digital continuity of information. Otherwise, if we have to recreate information in every lifecycle stage of a product (design / manufacturing / operations), it will be too costly and not digitally connected. This illustrates the growing need for standards. I had nothing to add to Nigel’s conclusions:

It is interesting to note that product management has come a long way over the last 10-20 years; however, as we include more and more into PLM, there are new concepts to be solved all the time. The cases we discuss today in our PLM communities were most of the time visions 10 years ago. Nowadays we want to include Model-Based Systems Engineering, 3D modeling and simulation, electronics and software, and even aftermarket and product support in true PLM. This was not the case 20 years ago. The people involved in the development of PLCS were for sure visionaries, as product data connectivity along the whole lifecycle is needed and enabled by the standard.

 

Investing in Industry 4.0?
Hard Realities of the Grand Vision.

Marc Halpern from Gartner is one of the regular speakers at the PDT conference. Unfortunately he could not be with us that day; however, through a labor-intensive connection (a mobile phone close to the speaker and Nigel Shaw trying to stay in sync with the presented slides) we could hear Marc speak about what we wanted to achieve too – digital continuity.

Marc restated the massive potential of Industrie 4.0 when it comes to scalability, agility, flexibility, and efficiency.

Although technologies are evolving rapidly, it is the existing legacy that inhibits fast adoption – a topic that was also central in my presentation. It is not just a change in technology; there is much more connected to it.

Marc recommends a changing role for IT, where they should focus more on business priorities and business leadership strategies. This as opposed to the classical role of the IT organization: where IT used to support the business, now they will be part of leading the business too.

To orchestrate such an IT evolution, Marc recommends a “systems of systems” planning and execution across IT and business. One of my recent blog posts, Moving to a model-based enterprise: The business (information) model, can be seen in that context.

How to deal with the incompatible future?

I was happy to conclude the sessions with the topic that concerns me the most at this time. Companies in their current business are already struggling to get aligned and coordinated between disciplines and external stakeholders; the gap to become connected is vast, as it requires a master data management approach, an enterprise data model and model-based ways of working. Read my posts from the past half year, starting here, and you get the picture.

Note: This image is based on Marc Halpern’s (Gartner) Technology/Maturity diagram from PDT 2015

I concluded by explaining that companies need to learn to work in two modes: one mode will be the traditional way of working, which I call the coordinated approach, and a growing focus on operating in a connected mode. You can see my full presentation here on SlideShare: How to deal with the incompatible future.

Conclusion

The conference was closed with a panel discussion where we shared our concerns related to the challenges companies face to change their traditional ways of working while entering a digital era. The positive points are there – baby steps – PLM is becoming understood, and the significance of standards is becoming clearer. The need: a long-term vision.

This concludes my review of an excellent conference – I learned a lot again, and I hope to see you next year too. Thanks again to CIMdata and Eurostep for organizing this event.
Last week I attended the long-awaited joint conference from CIMdata and Eurostep in Stuttgart. As I mentioned in earlier blog posts, I like this conference because it is a relatively small conference with a focused audience related to a chosen theme.

Instead of parallel sessions, all attendees follow the same tracks, and after two days there is a common understanding for all. This time there were about 70 people discussing the themes: Digitalizing Reality – PLM’s role in enabling the digital revolution (CIMdata) and Collaboration in the Engineering and Manufacturing Supply Chain – the Extended Digital Thread and Smart Manufacturing (Eurostep).

As you can see, all about Digital. Here are my comments:

The State of the PLM Industry:
The Digital Revolution

Peter Bilello kicked off by providing an overview of the PLM industry. The PLM market showed an overall growth of 7.3 % toward 43.6 billion dollars. Zooming in on the details: cPDM grew by 2.9 %; the significant growth came from the PLM tools (7.7 %); the Digital Manufacturing sector grew by 6.2 %. These numbers show, in my opinion, that managing collaboration in particular remains the challenging part of PLM. It is easier to buy tools than to invest in cPDM.

Peter mentioned that at the board level you cannot sell PLM, as this acronym is too much framed as an engineering tool. Also, people at the board have been trained to interpret transactional data and build strategies on that. They might embrace Digital Transformation; however, the product-innovation-related domain is hard to define in numbers. What is the value of collaboration? How do you measure and value innovation coming from R&D? Recently we have seen more simplified approaches to getting more value from PLM. I agree with Peter: we need to avoid the PLM framing and find better consumable value statements.

Nothing to add to Peter’s closing remarks:

 

An Alternative View of the Systems Engineering “V”

For me, the most interesting presentation of Day 1 was Don Farr’s. Don and his Boeing team worked on depicting the Systems Engineering process for a model-based environment. The original “V” looks like a linear process and does not reflect the multi-dimensional iterations at various stages, the concept of a virtual twin, and the various business domains that need to be supported.

The result was the diamond symbol above. Don and his team have created a consistent story related to the depicted diamond, which goes too far for this blog post. Currently, the diamond concept is copyrighted by Boeing, but I expect we will see more of this in the future, as the classical systems engineering "V" was not designed for our model-based view of the virtual and physical products to design AND maintain.

 

Sponsor vignette sessions

The vignette sponsors of the conference – Aras, ESI Group, Granta Design, HCL, Oracle and TCS – all got a ten-minute slot to introduce themselves and the topics they believed were relevant for the audience. These slots served as a teaser to come to their booth during a break. Interesting for me was Granta Design, who are bringing a complementary data service related to materials along the product lifecycle, providing digital continuity for material information. See below.

 

The PLM – CLM Axis vital for Digitalization of Product Process

Mikko Jokela, Head of Engineering Applications CoE at ABB, completed the morning sessions and left me with a lot of questions. Mikko's mission is to provide the ABB companies with an information infrastructure that provides end-to-end digital services for the future, based on apps and platform thinking.

Apparently, the digital continuity will be provided by all kinds of BOM structures, as you can see below. In my post Coordinated or Connected, related to a model-based enterprise, I call this a coordinated approach, which is a current best practice, not an approach for the future. There we want a model-based enterprise instead of a BOM-centric approach to ensure a digital thread. See also Don Farr's diamond. When I asked Mikko which data standard(s) ABB will use to implement their enterprise data model, it became clear there was no concept in place yet. Perhaps an excellent opportunity to look at PLCS for the product-related schema.

A general comment: many companies are thinking about building their own platform, but not all will build their platform from scratch. For those starting from scratch: have a look at existing standards for your industry. And to manage the quality of data, you will need to implement master data management, where for the product part the PLM system can play a significant role. See Master Data Management and PLM.
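To make this more tangible, below is a minimal sketch – my own illustration in Python, with hypothetical identifiers, not any product's actual API – of what the core of such a master data management layer does: reconciling local identifiers from the PLM and ERP systems into one canonical product record, the kind of mapping an enterprise data model has to formalize.

```python
from dataclasses import dataclass, field

@dataclass
class MasterItem:
    """Canonical product record owned by the MDM layer (hypothetical schema)."""
    master_id: str                                  # enterprise-wide identifier
    description: str
    local_ids: dict = field(default_factory=dict)   # system name -> local identifier

class MasterItemRegistry:
    def __init__(self):
        self._items = {}        # master_id -> MasterItem
        self._by_local = {}     # (system, local_id) -> master_id

    def register(self, item: MasterItem):
        self._items[item.master_id] = item
        for system, local_id in item.local_ids.items():
            self._by_local[(system, local_id)] = item.master_id

    def resolve(self, system: str, local_id: str) -> MasterItem:
        """Map a system-specific identifier back to the canonical record."""
        return self._items[self._by_local[(system, local_id)]]

# Usage: the same physical part is known under different IDs in PLM and ERP.
registry = MasterItemRegistry()
registry.register(MasterItem("MI-001", "Valve housing",
                             {"PLM": "P-4711", "ERP": "500123"}))
print(registry.resolve("ERP", "500123").master_id)   # -> MI-001
```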

 

Systems of Systems Approach to Product Design

Professor Martin Eigner's keynote presentation was about the concepts of how new products and markets need a Systems of Systems approach combined with Model-Based Systems Engineering (MBSE) and Product Line Engineering (PLE), where the PLM system can be the backbone supporting the MBSE artifacts in context. All these concepts require new ways of working, as stated below:

And this is a challenge. A quick survey in the room (consistent with my observations from the field) showed that most companies (95%) have not even achieved integrated ways of working for mechatronic products. You can imagine the challenge of also incorporating software, simulation, and other business disciplines. Martin's presentations are always an excellent conceptual framework and, for those who want to dive deeper, a starting point for discussion and learning.

Additive Manufacturing (Enabled Supply) at Moog

Moog Inc, a manufacturer of precision motion controls for various industries, has made a strategic move towards Additive Manufacturing. Peter Kerl, Moog's Engineering Systems Manager, gave a good introduction to what is meant by Additive Manufacturing and how Moog is introducing it in their organization to create more value for their customer base and attract new customers in a less commodity-driven domain. As you can imagine, delivering products through Additive Manufacturing requires new skills (design / materials), new processes and a new organizational structure. And, of course, a new PLM infrastructure.

Jim van Oss, Moog’s PLM Architect and Strategist, explained how they have been involved in a technology solution for digital-enabled parts leveraging blockchain technology.  Have a look at their VeriPart trademark. It was interesting to learn from Peter and Jim that they are actively working in a space that according to the Gartner’s hype curve is in the early transform phase.  Peter and Jim’s presentation were very educational for the audience.

For me, it was also interesting to learn from Jim that at Moog they are really practicing the two modes for PLM in their company: two PLM implementations, one with the legacy data – the wrong data for the future – and one with the new data model for the future. Both implementations are built on the same PLM vendor's release. A great illustration showing that the past and the future data for PLM are not compatible.

Value Creation through Synergies between PLM & Digital Transformation

Daniel Dubreuil, Safran’s CDO for Products and Services gave an entertaining lecture related to Safran’s PLM journey and the introduction of new digital capabilities, moving from an inward PLM system towards a digital infrastructure supporting internal (model-based systems engineering / multiple BOMs) and external collaboration with their customers and suppliers introducing new business capabilities. Daniel gave a very precise walk-through with examples from the real world. The concluding slide: KEY SUCCESS FACTORS was a slide that we have seen so many times at PLM events.

Apparently, the key success factors are known. However, most of the time one or more of these points cannot be addressed for various reasons. Then the question is: how do you mitigate this risk, as there will be issues ahead?

 

Bringing all the digital trends together. What’s next?

The day ended with a virtual fireside session between Peter Bilello and Martin Eigner. The audience did not see a fireplace; however, my augmented Twitter feed provided one for me:

Some interesting observations from this dialogue:

Peter: "Having studied physics is a good base for understanding PLM as you have to model things you cannot see" – as I studied physics, I can agree.

Martin: "Germany is the center of knowledge for Mechanical, the US for Electronics and now China becoming the center for Electronics and Software" – an interesting observation, illustrating where the innovation will come from.

Both Peter and Martin spent serious time on the importance of multidisciplinary education. We are teaching people in silos, and faculties work in silos. We all believe these silos must be broken down. It is hard to learn and experiment with skills for the future. Where to start, and who should lead?

Conclusion:

The PLM roadmap had some exciting presentations and, combined with CIMdata's PLM update, was an excellent opportunity to learn and discuss reality, in particular new methodologies and technologies beyond the hype. I want to thank CIMdata for the superb organization and for allowing me to take part. Next week I will follow up with a review of the PDT Europe conference part (Day 2).

Ontology example: description of the business entities and their relationships

In my recent posts, I have talked a lot about the model-based enterprise, and already after my first post, Model-Based – an introduction, I got a lot of feedback, with most of the audience automatically associating the words Model-Based with a 3D CAD model.
Trying to clarify this through my post Why Model-Based – the 3D CAD Model stirred up the discussion even more, leading to Model-Based: The confusion.

A Digital Twin of the Organization

At that time, I briefly touched on business models and business processes that also need to be reshaped and built for a digital enterprise. Business modeling is necessary if you want to understand and streamline large enterprises, where nobody can oversee the overall company. This approach is similar to systems engineering, where we try to understand and simulate complex systems.

With this post, I want to close the Model-Based series and focus on the aspects of the business model. I was caught by this catchy article: How would you like a digital twin of your organization?, which provides a nice introduction to this theme. Also, I met with Steve Dunnico, creator and co-founder of Clearvision, a Swedish startup company focusing on modern ways of business modeling.

 

Introduction

Jos (VirtualDutchman): Steve, can you give us an introduction to your company and which parts of the model-based enterprise you are addressing with Clearvision?

Steve (Clearvision): Clearvision started as a concept over two decades ago – modeling complex situations across multiple domains needed a simple approach to create a copy of the complete ecosystem. Along the way, technology advancements have opened up big data to everyone, and now we have Clearvision as a modeling tool/SaaS that creates a digital business ecosystem, enabling better visibility to deliver transformation.

As we all know, change is constant, so we must transition from the old silo projects and programs to a business world of continuous monitoring and transformation.
Clearvision enables this by connecting the disparate parts of an organization into a model linking people, competence, technology services, data flow, organization, and processes.
Complex inter-dependencies can be visualized, showing impact and opportunity to deliver corporate transformation goals in measured minimum viable transformation – many small changes, with measurable benefit, delivered frequently.  This is what Clearvision enables!

Jos: What is your definition of business modeling?

Steve: Business modeling has historically been the domain of financial experts – taking the "business model" of the company (such as production, sales, support) and looking at cost, profit and margins for opportunity, and remodeling to suit. Now, with the availability of increased digital data about many dimensions of a business, it is possible to model more than the financials.

This is the business modeling that we (Clearvision) work with – connecting all the entities that define a business, so that a change is connected to process, people, data, technology and other dimensions such as cost, time and quality. So if we change a part, all of the connected parts are checked for impact and benefit.

Jos: What are the benefits of business modeling?

Steve: Connecting the disparate entities of a business opens up limitless opportunities to analyze "what is affected if I change this?". This can be applied to simple, static "as-is" gap analyses, up to the more advanced studies needed to forecast the future and move into predictive rather than reactive planning.

The benefits of using a digital model of the business ecosystem are applicable to the whole organization. The "C-suite" team gets to see heat maps not only for technology-project deliveries but can also use workforce-culture maps to assess the company's understanding and adoption of new ways of working and the achievement of strategic goals. At an operational level, teams can collaborate more effectively, knowing which parts of the ecosystem help or hinder their deliveries, and vice versa.
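To make the "what is affected if I change this?" question concrete, here is a minimal sketch – my own simplification in Python, not Clearvision's implementation – of a business ecosystem as a dependency graph with a simple impact traversal:

```python
from collections import deque

# Hypothetical ecosystem: edges point from an entity to the entities that depend on it.
ecosystem = {
    "CRM system":         ["Sales process", "Customer data flow"],
    "Customer data flow": ["Support process", "Analytics team"],
    "Sales process":      ["Revenue reporting"],
    "Analytics team":     ["Revenue reporting"],
}

def impact_of_change(entity: str) -> set:
    """Breadth-first traversal: everything directly or indirectly affected."""
    affected, queue = set(), deque([entity])
    while queue:
        for dependent in ecosystem.get(queue.popleft(), []):
            if dependent not in affected:
                affected.add(dependent)
                queue.append(dependent)
    return affected

print(impact_of_change("CRM system"))
# e.g. {'Sales process', 'Customer data flow', 'Support process',
#       'Analytics team', 'Revenue reporting'}  (set order may vary)
```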

Jos: Is business modeling applicable for any type or size of the company?

Steve: The complexity of business has driven us to silo our way of working, to simplify tasks to achieve our own goals, and it is larger organizations that can benefit from modeling their business ecosystems. On that basis, it is unlikely that a standalone small business would engage in its own digital ecosystem model. However, as a supplier to a larger organization, it can be beneficial for the larger organization to model its smaller suppliers to ensure a holistic view of its ecosystem.

The core digital business ecosystem model delivers integrated views of dependencies, clashes, hot-spots to support transformation

Jos: How is business modeling related to digital transformation?

Steve: Digital transformation is an often-heard topic in large corporations; by implication, we should take advantage of the digital data we generate and collect in our businesses and connect it, so we benefit from the whole rather than working in silos. Therefore, using a digital model of a business ecosystem will help identify the areas of connectivity and collaboration that can deliver the best benefit – but through Minimum Viable Transformation, not a multi-year program with a big-bang output (which sometimes misses its goals…).

Today's digital technology brings new capabilities to businesses and is driving competence changes in organizations and their partner companies. So another use of business modeling is to map the competences of internal/external resources to the capabilities needed for digital transformation. Mapping competences rather than roles brings a better fit of resources to support transformation. Understanding which competences we have and what the gaps are is a prerequisite to planning and delivering transformation.

Jos: Then perhaps close with Clearvision's mission and where you fit (uniquely)?

Steve: Having worked on early digital business ecosystem models in the late 90s, we cut our teeth on slow processing times, difficult-to-change data relationships and poor access to data, combined with a very siloed work mentality. Clearvision is now positioned to help organizations realize that the value of the whole of their business is greater than the sum of its parts (silos), by enabling a holistic view of their business ecosystem that can be used to deliver measured transformation on a continual basis.

Jos: Thanks, Steve, for your contribution, which completes this series of posts related to the model-based enterprise and its various facets. I am aware this post presents the opinion of one company describing the importance of model-based business in general. There are no commercial relations between the two of us, and I recommend you explore this topic further if it is relevant for your situation.

Conclusion

Companies and their products are becoming more and more complex – much of it happening now, and a lot more in the near future. In order to understand and manage this complexity, models are needed to virtually define and analyze the real world without the high costs of making prototypes or changes in the real world. This applies to organizations, to systems, to engineering and manufacturing coordination, and finally to in-field operating systems. They can all be described by – connected – models. This is the future of the model-based enterprise.

Coming up next time: CIMdata PDM Roadmap Europe and PDT Europe. You can still register and meet a large group of people who care about the detailed aspects of a digital enterprise.

 

The digital thread according to GE

In my earlier posts, I have explored the incompatibility between current PLM practices and future needs for digital PLM. Digital PLM is one of the terms I am using for future concepts. Actually, in a digital enterprise, system borders become vague; it is more about connected platforms and digital services. Current PLM practices can be considered as coordinated, where the future of PLM is aiming at connected information. See also Coordinated or Connected.

Moving from current PLM practices towards modern ways of working is a transformation for several reasons.

  • First, because the scope of current PLM implementations is most of the time focused on engineering. Digital PLM aims to offer product information services along the whole product lifecycle.
  • Second, because the information in current PLM implementations is mainly stored in documents – drawings still being the leading information carrier. In advanced PLM implementations, BOM structures, the EBOM and MBOM, are the information structures, again relying on related specification documents, either CAD or Office files.

So let's review the transformation challenges related to moving from current PLM to digital PLM.

Current PLM – document management

The first PLM implementations were most of the time advanced cPDM implementations, targeting the sharing of CAD models and drawings. Deployments started in the engineering department, with the aim of centralizing product design information. Integrations with mechanical CAD systems had the highest priority, including engineering change processes. Multidisciplinary collaboration was enabled by introducing the concept of the Engineering Bill of Materials (EBOM). Every discipline – mechanical, electrical and sometimes (embedded) software teams – linked their information to the EBOM. The product release process was driven by the EBOM: if the EBOM is released, the product is fully specified and can be manufactured.
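As a minimal sketch of this EBOM-driven release logic (a simplified, hypothetical data model, not any specific vendor's): every discipline links its deliverables to EBOM items, and the product can only be released when every linked document, at every level, is released.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    discipline: str        # "mechanical", "electrical", "software", ...
    released: bool = False

@dataclass
class EbomItem:
    part_number: str
    documents: list = field(default_factory=list)
    children: list = field(default_factory=list)

def can_release(item: EbomItem) -> bool:
    """EBOM release rule: all linked documents and all children must be released."""
    return (all(doc.released for doc in item.documents)
            and all(can_release(child) for child in item.children))

# Usage: a pump assembly whose electrical schematic is still in work.
motor = EbomItem("MOT-1", [Document("motor drawing", "mechanical", True)])
pump = EbomItem("PUMP-1",
                [Document("schematic", "electrical", False)],
                [motor])
print(can_release(pump))   # -> False: the unreleased schematic blocks the release
```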

Although people complain that implementing PLM is complex, this type of implementation is relatively simple. The only added mental effort you are demanding from the PLM user is to work in a structured way and accept a more controlled (rigid) way of working compared to a directory-structure approach. For many people, this controlled way of working is already considered a limitation of their freedom. However, companies are not profitable because their employees are all artists working in full freedom. They become successful if they can deliver products with consistent quality in an efficient way. In a competitive, global market there is no room anymore for inefficient ways of working, as labor costs add to the price.

The way people work in this cPDM environment is coordinated, meaning that, based on business processes, the various stakeholders agree to offer complete sets of information (read: documents) to contribute to the full product definition. Whether all contributions are consistent depends on the time and effort people spend verifying and validating that consistency. Often this is not done thoroughly, and errors are only discovered during manufacturing or later in the field. Costly, but accepted, as it has always been the case.

Next Step PLM – coordinated document management / item-centric

When the awareness exists that data needs to flow through an organization in a consistent manner, the next step of PLM implementations comes into the picture. Here I would state we are really talking about PLM, as the target is to share product data outside the engineering department.

The first logical extension for PLM is moving information from an EBOM view (engineering) towards a Manufacturing Bill of Materials (MBOM) view. The MBOM aims to represent the manufacturing definition of the product and becomes a placeholder to link directly with the ERP system and suppliers. Having an integrated EBOM / MBOM process with your ERP system is already a big step forward, as it creates an efficient way of working to connect engineering and manufacturing.

As all the information is now related to the EBOM and MBOM, this approach is often called the item-centric approach. The Item (or Part) is the information carrier linked to its specification documents.
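A sketch of the item-centric idea (again simplified and hypothetical): the same design exists as an engineering item and as one or more manufacturing items, each carrying its own linked specification documents, with the EBOM-to-MBOM mapping itself being managed information.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    view: str                                            # "EBOM" or "MBOM"
    specifications: list = field(default_factory=list)   # linked document names

# Engineering view: designed as one welded frame.
ebom_frame = Item("FRAME-100", "EBOM", ["frame CAD model", "weld spec"])

# Manufacturing view: the same design split into purchasable / producible items.
mbom_items = [
    Item("FRAME-100-RAW", "MBOM", ["steel profile order spec"]),
    Item("FRAME-100-WELD", "MBOM", ["welding work instruction"]),
]

# The EBOM-to-MBOM mapping is itself managed information, not an afterthought.
ebom_to_mbom = {ebom_frame.item_id: [m.item_id for m in mbom_items]}
print(ebom_to_mbom)   # -> {'FRAME-100': ['FRAME-100-RAW', 'FRAME-100-WELD']}
```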

 

Managing the right version of the information in relation to a specific version of the product is called configuration management. And the better you have your configuration management processes in place, the more efficiently and with higher confidence you can deliver and support your products. Configuration management is again a typical example of a coordinated approach to managing products and documents.

Implementing this type of PLM is already more complex, as it needs different disciplines to agree on a collective process across various (enterprise) systems. ERP integrations are technically not complicated; it is the agreement on a leading process that makes it difficult, as the holistic view is often missing.

Next, next step PLM – the Digital Thread

Continued reading might give you the impression that the next step in the PLM evolution is the digital thread. And this can be the case, depending on your definition of the digital thread. Oleg Shilovitsky recently published an article, Digital Thread – A new catchy phrase to replace PLM?, related to his observations from ConX18, illustrating that there are many viewpoints on this concept. And of course, some vendors promote their perfect fit based on their unique definition. In general, I would classify the idea of the Digital Thread into two approaches:

The Digital Thread – coordinated

In the Digital Thread – coordinated approach, we are not revolutionizing the way of working in an enterprise. In the coordinated approach, the PLM environment is connected with another overlay, combining data from various disciplines into an environment where the dependencies are traceable. This can be the Aras overlay approach (here explained by Oleg Shilovitsky), the PTC Navigate approach or others, using an extra layer to connect the various discipline data and create traceability in a more or less non-intrusive way. Similar, but less intrusive, concepts can be realized through Business Intelligence applications, although they are more read-only than a system approach.
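Conceptually, such an overlay does not move the discipline data; it only stores typed links between records that stay in their source systems. A minimal sketch of that idea (my own simplification, not any vendor's actual product):

```python
# Hypothetical overlay: records stay in their own systems; the overlay only
# stores (source_record, relation, target_record) links for traceability.
trace_links = [
    (("Requirements", "REQ-42"), "satisfied_by", ("PLM", "PART-100")),
    (("PLM", "PART-100"), "verified_by", ("ALM", "TEST-7")),
    (("PLM", "PART-100"), "produced_by", ("ERP", "WO-2001")),
]

def trace(record, depth=0):
    """Follow links downstream from a record, across the connected systems."""
    for source, relation, target in trace_links:
        if source == record:
            print("  " * depth + f"{relation} -> {target[0]}:{target[1]}")
            trace(target, depth + 1)

trace(("Requirements", "REQ-42"))
# satisfied_by -> PLM:PART-100
#   verified_by -> ALM:TEST-7
#   produced_by -> ERP:WO-2001
```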

The Digital Thread – connected

In the Digital Thread – connected approach, the idea is that information is stored in an extremely granular way and shared among disciplines. Instead of the coordinated way, where every discipline can have its own data sources, here the target is to be data-driven (neutral/standard formats). I described this approach in the various aspects of the model-based enterprise. The challenge of a connected enterprise is the standardized data definition needed to make information available to all stakeholders.

Working in a connected enterprise is extremely difficult, in particular for people educated in the old-fashioned ways of working. If you have learned to work with shared documents, like Google Docs or Office documents in sharing mode, you will understand the mental change you have to go through: continuously sharing information instead of waiting until you feel your part is complete.

In the software domain, companies are used to working this way and integrating data in a continuous stream. We have to learn to apply these practices to the complete product lifecycle, where the product consists of hardware and software.

Still, the connected way of working is the vision digital enterprises should aim for, as it dramatically reduces the overhead of information conversion and ambiguity. How we will implement it in the context of PLM / product innovation is a learning process, where we should not be blocked by our echo chamber, as Jan Bosch states in his latest post: Don't Get Stuck In Your Company's Echo Chamber.

Jan Bosch comes from the software world, promoting the Software-Centric Systems conference SC2 as a conference to open up your mind. I recommend you take part in the upcoming PLM-related events: CIMdata's PLM Road Map Europe combined with PDT Europe on October 24/25 in Stuttgart, or, if you live in the US, the upcoming PI PLMx CHICAGO 2018 on November 5/6.

Conclusion

Learning and understanding are crucial and take time. A digital transformation has many aspects to learn – keep in mind the difference between coordinated (relatively easy) and connected (extraordinarily challenging but promising). Unfortunately, there is no populist way to become digital.

Note:
If you want to continue learning, please read this post – The True Impact of Industry 4.0 Revealed – and its internal links to reference information from Martijn Dullaart – so relevant.

 

What I want to discuss this time is the challenging transformation related to product data that needs to take place.

The top image of this post illustrates the current PLM world on the left and, on the right, the potential future positioning of PLM in a digital enterprise. How the right side will behave is still vague – it can be a collection of platforms or a vast collection of small services, all contributing to the performance of the company. Some vendors might dream that all these capabilities are defined in one system of systems, like the human body: all functions are available and connected.

Coordinated or connected?

This is THE big question for a future digital enterprise. In the current PLM approach, there are governance structures that allow people to share data along the product lifecycle in a structured way.

These governance structures can be project breakdown structures, where a phase-gate approach guides the full delivery. Deliverables related to tasks and gates make sure information is stored and available for every stakeholder. For example, a well-known process in the automotive industry, Advanced Product Quality Planning (the APQP process), is a standardized approach to make sure parts or products are introduced with the right quality for the customer.

Deliverables at any stage in the process can be reviewed or consumed by another stakeholder. The result is most of the time a collection of approved documents (Office-type, Design & Test files) stored centrally. This is what I would call a coordinated data approach.
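A minimal sketch of this coordinated, phase-gate logic (with hypothetical gate and deliverable names): a gate can only be passed when all of its required deliverables have been approved and stored centrally.

```python
# Hypothetical phase-gate definition: each gate lists its required deliverables.
gates = {
    "Design gate": ["design FMEA", "drawing package"],
    "Process gate": ["process FMEA", "control plan"],
}

# Central store of approved deliverables (the "collection of approved documents").
approved = {"design FMEA", "drawing package", "process FMEA"}

def gate_status(gate: str):
    """A gate passes only when none of its required deliverables are missing."""
    missing = [d for d in gates[gate] if d not in approved]
    return ("passed", []) if not missing else ("blocked", missing)

for gate in gates:
    print(gate, gate_status(gate))
# Design gate ('passed', [])
# Process gate ('blocked', ['control plan'])
```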

In complex environments, besides the project governance, there will be product structures and Bills of Materials, where each object in such a structure is the placeholder for related information. In the case of a product structure, this can be the specifications per component; in the case of a Bill of Materials, the design specification (usually CAD models) and, in the case of an MBOM, the manufacturing specifications.

An example of structures used in Enovia

Although these structures contain information about the product composition themselves, the related information makes the content understandable/realizable.

Again it is a coordinated approach, and most PLM systems and implementations are focusing on providing these structures.

Sometimes with their own system only – you need to follow the vendor portfolio to get the full benefit – and sometimes the system is positioned as an overlay to existing systems in the company, which is therefore less invasive.

Presentation from Martin Eigner – explaining the overlay concept based on Aras

Providing the single version of the truth is often associated with this approach. The question is: Is the green bin on the left the single version of the truth?

The Coordinated – Single Version of the Truth – problem

The challenge of a coordinated approach is that there is no thorough consistency check of whether the delivered data represents the real truth. Through serious review procedures, we do our best to make sure every deliverable has the required content and quality. As the information inside these deliverables is not connected to the outside world, there will be discrepancies between reality and what has been stored. Still, we feel comfortable enough as an organization to pretend we know where the risks are. Until the costly impossible happens!

The connected enterprise

The ultimate dream of a digital enterprise is that everything relevant is connected in context. This means no more documents or files but a very granular information model for linking data and keeping it in context. We can apply algorithms and automation to connected data and use Artificial Intelligence to make sense of massive amounts of data.

Connected data allows us to share combined sets of information that are relevant to a particular role. Real-time dashboarding is one of the benefits of such an infrastructure. There are still a lot of challenges with this approach. How do we know which information is valid in the context of other information? What are the rules that describe a valid product or project baseline at a particular time?

Although all data is stored as unique information objects in a network of information, we cannot apply the old mechanisms of a coordinated approach all the time. Generated reports from a connected environment can still serve as baselines or records related to a specific state. For example, when the design is approved for manufacturing, we can generate approved product baseline structures or Bill of Materials structures.
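As a sketch of that idea (hypothetical and simplified): in a connected environment, each information object carries its own validity period, and a baseline becomes a query for everything valid at a given moment, rather than a frozen document.

```python
from datetime import date

# Hypothetical connected data store: every object records when it became
# effective and (optionally) when it was superseded.
objects = [
    {"id": "PART-1 rev A", "effective": date(2018, 1, 10), "superseded": date(2018, 5, 2)},
    {"id": "PART-1 rev B", "effective": date(2018, 5, 2),  "superseded": None},
    {"id": "TEST-7 result", "effective": date(2018, 4, 20), "superseded": None},
]

def baseline(at: date):
    """A baseline is a view: all objects valid at the given moment."""
    return [o["id"] for o in objects
            if o["effective"] <= at
            and (o["superseded"] is None or at < o["superseded"])]

print(baseline(date(2018, 4, 30)))   # -> ['PART-1 rev A', 'TEST-7 result']
print(baseline(date(2018, 6, 1)))    # -> ['PART-1 rev B', 'TEST-7 result']
```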

However, this linearity in the lifecycle for passing information through an enterprise will not exist anymore. There might be various design alternatives, and the delivery process is already part of the design phase. Through integrated virtual simulation and testing, we reach a state in which the product satisfies the market for that moment, and the delivery process is known at the same time.

Almost immediately, and based on first experiences from the field, new features can be added, virtually tested and validated for the next stage. We need to design new PLM infrastructures that can support this granularity and therefore this complexity.

The connected – Single Version of the Truth – problem

The concepts I described related to the connected enterprise made me realize that this is analogous to how the brain works. Our brain is a giant network of connected information, dynamically maintaining associations, having different abstraction levels and always pretending there is one truth.

If you want to understand a potential model of the brain, please read On Intelligence by Jeff Hawkins. With the possible arrival of the quantum computer, we might be able to create performant brain models.

In my earlier post, Are we blocking our future?, I referred to the book The Idiot Brain: What Your Head is Really Up To by Dean Burnett, in which Dean states that, due to the complexity of stored information, our brain continuously adapts "non-compliant" information to make sure the owner of the brain feels comfortable.

What we think is the truth might just be a creation of the brain, combining the positive parts into a compelling story and suppressing or deleting information that does not fit. Although it sounds absurd, I believe that if we are able to create a connected digital enterprise, we will face the same symptoms. Due to the complexity of connected information, we will be looking for the most suitable version, and as everything becomes so complex, ordinary human beings will no longer be able to distinguish the difference.

 

Conclusion:

As part of the preparation for the upcoming PDT Europe 2018, I was investigating the topics of the coordinated and the connected enterprise to discover potential transformation steps. We all need to explore the future with an open mind, and the challenge is: WHERE and HOW FAST can we transform from coordinated to connected? I am curious whether you have experiences or thoughts on this topic.


At this moment, we are in the middle of the year – usually a quiet time for me and a good time to reflect on what has happened so far and to look forward.

Three themes triggered me to write this half year:

  • The changing roles of (PLM) consultancy
  • The disruptive effect of digital transformation on legacy PLM
  • The Model-driven approaches

A short summary per theme here with links to the original posts for those who haven’t followed the sequence.

The changing roles of (PLM) consultancy

Triggered by Oleg Shilovitsky's post Why traditional PLM ranking is dead. PLM ranking 2.0, a discussion started related to the changing way of choosing PLM and the role of a consultant. Oleg and I agreed that using the word 'dead' in a post title is a way to catch extra attention, and as many people do not read more than the introduction, this is a way to frame ideas (not invented by us – look at your newspaper and social media posts). Please take your time and read this post till the end.

Oleg and I concluded that the traditional PLM status reports provided by consultancy firms are no longer relevant. They focus on the big vendors, in a status quo, and most of them are 80% the same in their core PLM capabilities. The challenge comes in how to select a PLM approach for your company.

Here Oleg and I differ in opinion. I look more at PLM from a business transformation point of view – how to improve your business with new ways of working. The role of a consultant is crucial here, as the consultant can help formalize the company's vision and the areas to focus on for PLM. The value of the PLM consultant is to bring experience from other companies instead of inventing new strategies per company. And yes, a consultant should get paid for this added value.

Oleg believes more in the bottom-up approach where new technology will enable users to work differently and empower themselves to improve their business (without calling it PLM). More or less concluding there is no need for a PLM consultant as the users will decide themselves about the value of the selected technology. In the context of Oleg’s position as CEO/Co-founder of OpenBOM, it is a logical statement, fighting for the same budget.

The discussion ended during the PLMx conference in Hamburg, where Oleg and I met with an audience, recorded by MarketKey. You can find the recording of the Panel Discussion: Digital Transformation and the Future of PLM Consulting here.
Unfortunately, like many discussions, there was no conclusion. My conclusion remains the same – companies need PLM coaching!

The related posts to this topic are:

 

The disruptive effect of digital transformation on legacy PLM

A topic that I have discussed over the past two years is that current PLM is not compatible with modern data-driven PLM. Note: data-driven PLM is still "under development". Where in most companies the definition of the products is stored in documents / files, I believe that, in order to manage the complexity of products, hardware and software in the future, there is a need to organize data related to models, not to files. See also: From item-centric to model-centric?

For a company it is extremely difficult to have two approaches in parallel as the first reaction is: “let’s convert the old data to the new environment”.

This statement has proven impossible in most of the engagements I am involved in, and here I introduced the bimodal approach as a way to keep the legacy going (mode 1) and scale up for the new environment (mode 2).

A bimodal approach is sometimes acceptable when the PLM software comes from two different vendors. Sometimes this is also called the overlay approach – the old system remains in place, and a new overlay is created to connect the legacy PLM system and potentially other systems like ALM or MBSE environments. For example, some of the success stories of Aras complementing Siemens PLM.

Like the bimodal approach, the overlay approach creates the illusion that in the near future the old legacy PLM will disappear. I partly share that illusion, if you consider the near future a period of 5–10+ years, depending on the company's active products. Faster is not realistic.

And related to bimodal, I now prefer to use the terminology used by McKinsey: our insights/toward an integrated technology operating model in the context of PLM.

The challenge is that PLM vendors are reluctant to support a bimodal approach for their own legacy PLM, as then suddenly the vendor becomes responsible for all connectivity between mode 1 and mode 2 data – every vendor wants to sell only the latest.

I will elaborate on this topic during the PDT Europe conference in Stuttgart – Oct 25th. No posts on this topic this year (yet), as I am discussing, learning and collecting examples from the field. What kept me relatively busy was the next topic:

The Model-driven approaches

Most of my blogging time I spent explaining the meaning behind a modern model-driven approach and its three main aspects: Model-Based Systems Engineering, Model-Based Definition and Digital Twins. As some of these aspects are still in the hype phase, it was interesting to see two different opinions popping up. On one side, people claiming the world is still flat (2D), considering model-based approaches just another hype caused by the vendors – apparently there is no need for digital continuity. If you look into the reactions from certain people, you might come to the conclusion that it is impossible to have a dialogue; throwing opinions is not a discussion.

One of the reasons might be that the people reacting strongly have never experienced model-based efforts in their life and just chime in, or they might have a business reason not to agree with model-based approaches, as they do not align with their business. It is like the people benefiting from the climate change theory – will they vote against it when the facts are known? Just my thoughts.

There is also another group, to which I am connected, that is quite active in learning and formalizing model-based approaches, in order to move forward towards a digital enterprise where information is connected and flowing between various models (behavior models, simulation models, software models, 3D models, operational models, etc.). This group of people discusses standards and how to use and enhance them. They discuss and analyze with arguments and share lessons learned. One of the best upcoming events in that context is the joint CIMdata PLM Road Map EMEA and PDT Europe 2018 – look at the agenda via the image link, and you should get involved too – if you really care.

 

And if you are looking in your agenda for a wider, less geeky type of conference, consider the PI PLMx CHICAGO 2018 conference on Nov 5 and 6. The agenda provides a wider range of sessions; however, I am sure you can find people interested in discussing model-based learnings there too – in this context, in particular Stream 2: Supporting the Digital Value Chain.

My related posts to model-based this year were:

Conclusion

I spent a lot of time demystifying some PLM-related themes. The challenge remains, as in the non-PLM world, that it is hard to get educated by blog posts, as you might get over-informed by (vendor-related) posts all surfing somewhere on the hype curve. Do not look at the catchy title – investigate and take time to understand HOW things will work for you or your company. There are enough people explaining WHAT they do, but HOW it fits in a current organization needs to be solved first. Hence the above three themes.
